Dec 03 14:13:12 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 03 14:13:12 crc restorecon[4697]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 14:13:12 crc restorecon[4697]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc 
restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:13:12 crc 
restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 
14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 14:13:12 crc 
restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 14:13:12 crc 
restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to
system_u:object_r:container_file_t:s0:c14,c22 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 
14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 14:13:12 crc 
restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc 
restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 
crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc 
restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc 
restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc 
restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc 
restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:13:12 crc 
restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:13:12 crc restorecon[4697]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 14:13:12 crc restorecon[4697]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 14:13:12 crc restorecon[4697]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 03 14:13:13 crc kubenswrapper[4751]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 14:13:13 crc kubenswrapper[4751]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 03 14:13:13 crc kubenswrapper[4751]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 14:13:13 crc kubenswrapper[4751]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 03 14:13:13 crc kubenswrapper[4751]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 03 14:13:13 crc kubenswrapper[4751]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.158382 4751 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161735 4751 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161754 4751 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161761 4751 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161766 4751 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161772 4751 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161778 4751 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161784 4751 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161789 4751 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161795 4751 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 
14:13:13.161800 4751 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161805 4751 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161829 4751 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161835 4751 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161842 4751 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161850 4751 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161856 4751 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161861 4751 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161866 4751 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161871 4751 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161876 4751 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161881 4751 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161886 4751 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161891 4751 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161897 4751 feature_gate.go:330] unrecognized feature 
gate: ClusterAPIInstall Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161920 4751 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161927 4751 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161932 4751 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161937 4751 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161941 4751 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161946 4751 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161951 4751 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161956 4751 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161960 4751 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161967 4751 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161975 4751 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161981 4751 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161987 4751 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161993 4751 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.161998 4751 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.162003 4751 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.162008 4751 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.162043 4751 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.162049 4751 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.162055 4751 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.162060 4751 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.162065 4751 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.162069 4751 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.162075 4751 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.162080 4751 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.162085 4751 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.162090 4751 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.162094 4751 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.162099 4751 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.162104 4751 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.162109 4751 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.162114 4751 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.162120 4751 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.162125 4751 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.162130 4751 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.162137 4751 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.162142 4751 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.162146 4751 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.162151 4751 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.162156 4751 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.162160 4751 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.162165 4751 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.162170 4751 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.162175 4751 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.162179 4751 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.162186 4751 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.162192 4751 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162426 4751 flags.go:64] FLAG: --address="0.0.0.0"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162440 4751 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162449 4751 flags.go:64] FLAG: --anonymous-auth="true"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162458 4751 flags.go:64] FLAG: --application-metrics-count-limit="100"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162466 4751 flags.go:64] FLAG: --authentication-token-webhook="false"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162472 4751 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162480 4751 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162487 4751 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162493 4751 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162499 4751 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162505 4751 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162511 4751 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162518 4751 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162524 4751 flags.go:64] FLAG: --cgroup-root=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162529 4751 flags.go:64] FLAG: --cgroups-per-qos="true"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162535 4751 flags.go:64] FLAG: --client-ca-file=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162541 4751 flags.go:64] FLAG: --cloud-config=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162546 4751 flags.go:64] FLAG: --cloud-provider=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162551 4751 flags.go:64] FLAG: --cluster-dns="[]"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162563 4751 flags.go:64] FLAG: --cluster-domain=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162569 4751 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162575 4751 flags.go:64] FLAG: --config-dir=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162581 4751 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162587 4751 flags.go:64] FLAG: --container-log-max-files="5"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162600 4751 flags.go:64] FLAG: --container-log-max-size="10Mi"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162607 4751 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162614 4751 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162620 4751 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162627 4751 flags.go:64] FLAG: --contention-profiling="false"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162633 4751 flags.go:64] FLAG: --cpu-cfs-quota="true"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162639 4751 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162644 4751 flags.go:64] FLAG: --cpu-manager-policy="none"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162650 4751 flags.go:64] FLAG: --cpu-manager-policy-options=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162657 4751 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162663 4751 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162669 4751 flags.go:64] FLAG: --enable-debugging-handlers="true"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162674 4751 flags.go:64] FLAG: --enable-load-reader="false"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162680 4751 flags.go:64] FLAG: --enable-server="true"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162685 4751 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162692 4751 flags.go:64] FLAG: --event-burst="100"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162698 4751 flags.go:64] FLAG: --event-qps="50"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162704 4751 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162709 4751 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162715 4751 flags.go:64] FLAG: --eviction-hard=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162722 4751 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162727 4751 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162733 4751 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162738 4751 flags.go:64] FLAG: --eviction-soft=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162745 4751 flags.go:64] FLAG: --eviction-soft-grace-period=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162750 4751 flags.go:64] FLAG: --exit-on-lock-contention="false"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162756 4751 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162761 4751 flags.go:64] FLAG: --experimental-mounter-path=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162767 4751 flags.go:64] FLAG: --fail-cgroupv1="false"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162773 4751 flags.go:64] FLAG: --fail-swap-on="true"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162779 4751 flags.go:64] FLAG: --feature-gates=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162786 4751 flags.go:64] FLAG: --file-check-frequency="20s"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162791 4751 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162798 4751 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162803 4751 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162809 4751 flags.go:64] FLAG: --healthz-port="10248"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162815 4751 flags.go:64] FLAG: --help="false"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162820 4751 flags.go:64] FLAG: --hostname-override=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162826 4751 flags.go:64] FLAG: --housekeeping-interval="10s"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162832 4751 flags.go:64] FLAG: --http-check-frequency="20s"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162837 4751 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162842 4751 flags.go:64] FLAG: --image-credential-provider-config=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162848 4751 flags.go:64] FLAG: --image-gc-high-threshold="85"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162854 4751 flags.go:64] FLAG: --image-gc-low-threshold="80"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162859 4751 flags.go:64] FLAG: --image-service-endpoint=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162864 4751 flags.go:64] FLAG: --kernel-memcg-notification="false"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162870 4751 flags.go:64] FLAG: --kube-api-burst="100"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162875 4751 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162881 4751 flags.go:64] FLAG: --kube-api-qps="50"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162886 4751 flags.go:64] FLAG: --kube-reserved=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162892 4751 flags.go:64] FLAG: --kube-reserved-cgroup=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162897 4751 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162903 4751 flags.go:64] FLAG: --kubelet-cgroups=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162908 4751 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162915 4751 flags.go:64] FLAG: --lock-file=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162920 4751 flags.go:64] FLAG: --log-cadvisor-usage="false"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162926 4751 flags.go:64] FLAG: --log-flush-frequency="5s"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162932 4751 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162941 4751 flags.go:64] FLAG: --log-json-split-stream="false"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162947 4751 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162954 4751 flags.go:64] FLAG: --log-text-split-stream="false"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162960 4751 flags.go:64] FLAG: --logging-format="text"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162966 4751 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162972 4751 flags.go:64] FLAG: --make-iptables-util-chains="true"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162978 4751 flags.go:64] FLAG: --manifest-url=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162983 4751 flags.go:64] FLAG: --manifest-url-header=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162991 4751 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.162997 4751 flags.go:64] FLAG: --max-open-files="1000000"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163004 4751 flags.go:64] FLAG: --max-pods="110"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163010 4751 flags.go:64] FLAG: --maximum-dead-containers="-1"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163016 4751 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163021 4751 flags.go:64] FLAG: --memory-manager-policy="None"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163026 4751 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163032 4751 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163037 4751 flags.go:64] FLAG: --node-ip="192.168.126.11"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163043 4751 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163057 4751 flags.go:64] FLAG: --node-status-max-images="50"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163063 4751 flags.go:64] FLAG: --node-status-update-frequency="10s"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163069 4751 flags.go:64] FLAG: --oom-score-adj="-999"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163075 4751 flags.go:64] FLAG: --pod-cidr=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163081 4751 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163088 4751 flags.go:64] FLAG: --pod-manifest-path=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163093 4751 flags.go:64] FLAG: --pod-max-pids="-1"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163099 4751 flags.go:64] FLAG: --pods-per-core="0"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163105 4751 flags.go:64] FLAG: --port="10250"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163111 4751 flags.go:64] FLAG: --protect-kernel-defaults="false"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163116 4751 flags.go:64] FLAG: --provider-id=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163121 4751 flags.go:64] FLAG: --qos-reserved=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163127 4751 flags.go:64] FLAG: --read-only-port="10255"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163132 4751 flags.go:64] FLAG: --register-node="true"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163142 4751 flags.go:64] FLAG: --register-schedulable="true"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163147 4751 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163162 4751 flags.go:64] FLAG: --registry-burst="10"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163168 4751 flags.go:64] FLAG: --registry-qps="5"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163174 4751 flags.go:64] FLAG: --reserved-cpus=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163180 4751 flags.go:64] FLAG: --reserved-memory=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163187 4751 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163193 4751 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163199 4751 flags.go:64] FLAG: --rotate-certificates="false"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163205 4751 flags.go:64] FLAG: --rotate-server-certificates="false"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163210 4751 flags.go:64] FLAG: --runonce="false"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163216 4751 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163221 4751 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163227 4751 flags.go:64] FLAG: --seccomp-default="false"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163233 4751 flags.go:64] FLAG: --serialize-image-pulls="true"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163238 4751 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163244 4751 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163250 4751 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163256 4751 flags.go:64] FLAG: --storage-driver-password="root"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163261 4751 flags.go:64] FLAG: --storage-driver-secure="false"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163267 4751 flags.go:64] FLAG: --storage-driver-table="stats"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163272 4751 flags.go:64] FLAG: --storage-driver-user="root"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163278 4751 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163284 4751 flags.go:64] FLAG: --sync-frequency="1m0s"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163289 4751 flags.go:64] FLAG: --system-cgroups=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163294 4751 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163303 4751 flags.go:64] FLAG: --system-reserved-cgroup=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163309 4751 flags.go:64] FLAG: --tls-cert-file=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163314 4751 flags.go:64] FLAG: --tls-cipher-suites="[]"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163322 4751 flags.go:64] FLAG: --tls-min-version=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163350 4751 flags.go:64] FLAG: --tls-private-key-file=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163356 4751 flags.go:64] FLAG: --topology-manager-policy="none"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163364 4751 flags.go:64] FLAG: --topology-manager-policy-options=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163370 4751 flags.go:64] FLAG: --topology-manager-scope="container"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163376 4751 flags.go:64] FLAG: --v="2"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163384 4751 flags.go:64] FLAG: --version="false"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163392 4751 flags.go:64] FLAG: --vmodule=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163399 4751 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163405 4751 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163532 4751 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163538 4751 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163544 4751 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163550 4751 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163555 4751 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163560 4751 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163565 4751 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163570 4751 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163575 4751 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163579 4751 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163584 4751 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163589 4751 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163594 4751 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163599 4751 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163604 4751 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163608 4751 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163613 4751 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163618 4751 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163623 4751 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163628 4751 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163633 4751 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163638 4751 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163643 4751 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163648 4751 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163653 4751 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163659 4751 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163664 4751 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163669 4751 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163674 4751 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163679 4751 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163683 4751 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163688 4751 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163693 4751 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163699 4751 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163705 4751 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163711 4751 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163717 4751 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163722 4751 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163727 4751 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163737 4751 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163744 4751 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163750 4751 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163756 4751 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163762 4751 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163767 4751 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163773 4751 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163781 4751 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163788 4751 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163796 4751 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163802 4751 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163809 4751 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163815 4751 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163820 4751 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163826 4751 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163831 4751 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163837 4751 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163843 4751 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163854 4751 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163861 4751 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163868 4751 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163874 4751 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163880 4751 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163886 4751 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163892 4751 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163898 4751 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163904 4751 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163910 4751 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163917 4751 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163924 4751 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163930 4751 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.163937 4751 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.163954 4751 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.173466 4751 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.173520 4751 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173599 4751 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173608 4751 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173612 4751 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173616 4751 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173621 4751 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173625 4751 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173628 4751 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173632 4751 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173636 4751 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173640 4751 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173644 4751 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173649 4751 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173655 4751 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173663 4751 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173667 4751 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173672 4751 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173675 4751 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173680 4751 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173683 4751 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173687 4751 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173691 4751 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173694 4751 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173698 4751 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173701 4751 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173705 4751 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173709 4751 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173713 4751 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173716 
4751 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173720 4751 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173723 4751 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173728 4751 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173732 4751 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173736 4751 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173740 4751 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173744 4751 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173747 4751 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173751 4751 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173755 4751 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173758 4751 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173762 4751 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173765 4751 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173769 4751 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 
14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173773 4751 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173776 4751 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173779 4751 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173785 4751 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173789 4751 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173792 4751 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173796 4751 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173800 4751 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173803 4751 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173807 4751 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173811 4751 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173815 4751 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173819 4751 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173822 4751 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173826 4751 feature_gate.go:330] 
unrecognized feature gate: ClusterAPIInstall Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173829 4751 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173833 4751 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173836 4751 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173840 4751 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173843 4751 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173846 4751 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173850 4751 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173855 4751 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173859 4751 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173864 4751 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173870 4751 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173874 4751 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173878 4751 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.173882 4751 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.173889 4751 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174109 4751 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174119 4751 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174123 4751 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174128 4751 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174134 4751 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174140 4751 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174144 4751 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174148 4751 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174151 4751 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174155 4751 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174158 4751 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174163 4751 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174168 4751 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174171 4751 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174175 4751 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174178 4751 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174182 4751 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174185 4751 feature_gate.go:330] unrecognized feature gate: Example Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174189 4751 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174192 4751 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174196 4751 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174199 4751 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174203 4751 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174206 4751 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174210 4751 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174214 4751 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 
14:13:13.174217 4751 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174221 4751 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174225 4751 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174228 4751 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174232 4751 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174235 4751 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174240 4751 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174244 4751 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174248 4751 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174252 4751 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174257 4751 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174263 4751 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174267 4751 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174270 4751 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174274 4751 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174278 4751 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174282 4751 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174285 4751 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174290 4751 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174294 4751 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174297 4751 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174301 4751 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174305 4751 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174309 4751 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174312 4751 
feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174316 4751 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174320 4751 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174323 4751 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174348 4751 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174352 4751 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174357 4751 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174362 4751 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174366 4751 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174369 4751 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174373 4751 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174377 4751 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174381 4751 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174384 4751 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174388 4751 feature_gate.go:330] unrecognized feature gate: 
ConsolePluginContentSecurityPolicy Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174392 4751 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174396 4751 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174399 4751 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174403 4751 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174407 4751 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.174411 4751 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.174418 4751 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.174790 4751 server.go:940] "Client rotation is on, will bootstrap in background" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.177697 4751 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.177802 4751 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.178558 4751 server.go:997] "Starting client certificate rotation" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.178596 4751 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.178838 4751 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-02 13:53:21.564887781 +0000 UTC Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.178945 4751 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 719h40m8.385947215s for next certificate rotation Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.183946 4751 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.186834 4751 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.195949 4751 log.go:25] "Validated CRI v1 runtime API" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.216113 4751 log.go:25] "Validated CRI v1 image API" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.217563 4751 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.220667 4751 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-03-14-08-19-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.220709 4751 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 
blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}] Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.239597 4751 manager.go:217] Machine: {Timestamp:2025-12-03 14:13:13.237731844 +0000 UTC m=+0.226087101 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654132736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:b448da91-3150-4278-a353-292ec92ffaef BootID:bd661df5-edc0-463f-8631-fab82404a306 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827068416 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730829824 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:db:ad:75 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:db:ad:75 Speed:-1 
Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:8b:2c:b7 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:c5:e1:53 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:39:0c:c1 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:2b:69:9a Speed:-1 Mtu:1496} {Name:eth10 MacAddress:32:e6:12:8e:fc:aa Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:d6:7f:07:16:80:22 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654132736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 
Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.239886 4751 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.240103 4751 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.240619 4751 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.240957 4751 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.241124 4751 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.241817 4751 topology_manager.go:138] "Creating topology manager with none policy"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.241831 4751 container_manager_linux.go:303] "Creating device plugin manager"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.242845 4751 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.242979 4751 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.243585 4751 state_mem.go:36] "Initialized new in-memory state store"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.244069 4751 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.245257 4751 kubelet.go:418] "Attempting to sync node with API server"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.245282 4751 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.245312 4751 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.245351 4751 kubelet.go:324] "Adding apiserver pod source"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.245366 4751 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.247820 4751 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.247866 4751 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused
Dec 03 14:13:13 crc kubenswrapper[4751]: E1203 14:13:13.247949 4751 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError"
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.248054 4751 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused
Dec 03 14:13:13 crc kubenswrapper[4751]: E1203 14:13:13.248108 4751 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.248246 4751 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.249006 4751 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.249523 4751 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.249547 4751 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.249556 4751 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.249565 4751 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.249578 4751 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.249587 4751 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.249599 4751 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.249612 4751 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.249624 4751 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.249633 4751 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.249645 4751 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.249655 4751 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.250101 4751 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.250529 4751 server.go:1280] "Started kubelet"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.250791 4751 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.250865 4751 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.250779 4751 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.251428 4751 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 03 14:13:13 crc systemd[1]: Started Kubernetes Kubelet.
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.252893 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.252930 4751 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 03 14:13:13 crc kubenswrapper[4751]: E1203 14:13:13.252658 4751 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.110:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187dba0cd389d675 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 14:13:13.250502261 +0000 UTC m=+0.238857478,LastTimestamp:2025-12-03 14:13:13.250502261 +0000 UTC m=+0.238857478,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.253035 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 12:40:20.075806055 +0000 UTC
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.253079 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 382h27m6.822731128s for next certificate rotation
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.253164 4751 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.253177 4751 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.253256 4751 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 03 14:13:13 crc kubenswrapper[4751]: E1203 14:13:13.253264 4751 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.254037 4751 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.254117 4751 factory.go:55] Registering systemd factory
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.254163 4751 factory.go:221] Registration of the systemd container factory successfully
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.254265 4751 server.go:460] "Adding debug handlers to kubelet server"
Dec 03 14:13:13 crc kubenswrapper[4751]: E1203 14:13:13.254298 4751 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="200ms"
Dec 03 14:13:13 crc kubenswrapper[4751]: E1203 14:13:13.254148 4751 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.254547 4751 factory.go:153] Registering CRI-O factory
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.254560 4751 factory.go:221] Registration of the crio container factory successfully
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.254640 4751 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.254662 4751 factory.go:103] Registering Raw factory
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.254677 4751 manager.go:1196] Started watching for new ooms in manager
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.256122 4751 manager.go:319] Starting recovery of all containers
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.276209 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.276765 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.276794 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.276816 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.276909 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.276936 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.276955 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.276976 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.277005 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.277026 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.277044 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.277062 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.277081 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.277103 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.277122 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.277141 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.277159 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.277179 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.277199 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.277217 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.277241 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.277265 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.277284 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.277305 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.277349 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.277371 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.277393 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.277472 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.277582 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.277632 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.277653 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.277672 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.279317 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.279379 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.279408 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.279430 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.279451 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.279470 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.279502 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.279523 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.279555 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.279583 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.279605 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.279624 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.279642 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.279663 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.279681 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.279728 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.279748 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.279770 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.279792 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.279814 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.279840 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.279896 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.279921 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.279944 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.279965 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.279986 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.280006 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.280046 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.280069 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.280090 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.280110 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.280130 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.280151 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.280176 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.280196 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.280217 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.280236 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.280253 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.280269 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.280291 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.280312 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.280360 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.280382 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.280402 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.280420 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.280438 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.280458 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.280477 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.280505 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f"
volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.281115 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.281148 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.281162 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.281173 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.281184 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.281195 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" 
seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.281206 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.281219 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.281253 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.281265 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.281276 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.281286 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.281297 
4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.281308 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282040 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282062 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282071 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282082 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282123 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282134 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282144 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282154 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282165 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282181 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282192 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282209 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282221 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282232 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282243 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282255 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282265 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282276 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282288 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282301 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282312 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282383 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282395 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282406 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282415 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282426 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282438 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282448 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282460 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282470 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282481 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282491 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282501 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282512 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282525 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282536 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282547 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282560 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282570 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282581 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282592 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282604 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282616 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282629 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282639 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282648 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282659 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282669 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282681 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282691 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282702 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282713 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282725 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 
03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282738 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282749 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282761 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282776 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282792 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282803 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282814 4751 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282825 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282835 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282845 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282855 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282867 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282878 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282890 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282902 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282917 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282928 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.282940 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283433 4751 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283462 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283476 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283487 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283499 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283510 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283521 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283531 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283541 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283551 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283562 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283572 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283582 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283592 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283601 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283610 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283620 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283629 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283641 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283651 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283660 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283669 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283680 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283690 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283700 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283710 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283722 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283734 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283743 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283752 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283763 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283774 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283783 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283793 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283803 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283812 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283822 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283832 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283841 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283851 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283860 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283869 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283877 4751 reconstruct.go:97] "Volume reconstruction finished"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.283884 4751 reconciler.go:26] "Reconciler: start to sync state"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.287681 4751 manager.go:324] Recovery completed
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.305153 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.308940 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.309004 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.309023 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.309781 4751 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.310748 4751 cpu_manager.go:225] "Starting CPU manager" policy="none"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.310789 4751 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.310825 4751 state_mem.go:36] "Initialized new in-memory state store"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.312615 4751 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.312670 4751 status_manager.go:217] "Starting to sync pod status with apiserver"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.312704 4751 kubelet.go:2335] "Starting kubelet main sync loop"
Dec 03 14:13:13 crc kubenswrapper[4751]: E1203 14:13:13.312757 4751 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.314272 4751 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused
Dec 03 14:13:13 crc kubenswrapper[4751]: E1203 14:13:13.314404 4751 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.321575 4751 policy_none.go:49] "None policy: Start"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.322381 4751 memory_manager.go:170] "Starting memorymanager" policy="None"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.322470 4751 state_mem.go:35] "Initializing new in-memory state store"
Dec 03 14:13:13 crc kubenswrapper[4751]: E1203 14:13:13.353723 4751 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.376602 4751 manager.go:334] "Starting Device Plugin manager"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.376659 4751 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.376676 4751 server.go:79] "Starting device plugin registration server"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.377162 4751 eviction_manager.go:189] "Eviction manager: starting control loop"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.377186 4751 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.377643 4751 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.377823 4751 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.377833 4751 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Dec 03 14:13:13 crc kubenswrapper[4751]: E1203 14:13:13.387244 4751 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.413154 4751 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"]
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.413403 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.415059 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.415093 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.415104 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.415231 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.415672 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.415848 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.415878 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.415900 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.415931 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.416026 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.416232 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.416297 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.417011 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.417052 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.417069 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.417203 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.417308 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.417368 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.417637 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.417827 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.417908 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.418027 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.418048 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.418057 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.418063 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.418069 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.418074 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.418235 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.418384 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.418428 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.418651 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.418707 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.418720 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.419031 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.419078 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.419088 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.419126 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.419146 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.419156 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.419363 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.419401 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.420198 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.420229 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.420241 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:13:13 crc kubenswrapper[4751]: E1203 14:13:13.454944 4751 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="400ms"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.477483 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.478941 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.478993 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.479004 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.479036 4751 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 03 14:13:13 crc kubenswrapper[4751]: E1203 14:13:13.479619 4751 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.110:6443: connect: connection refused" node="crc"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.486367 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.486412 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.486521 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.486549 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.486574 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.486595 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.486669 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.486793 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.487000 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.487101 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.487165 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.487205 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.487241 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.487274 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.487301 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.588726 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.588793 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.588817 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.588839 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.588863 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.588883 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.588888 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.588897 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.588905 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.588970 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.588976 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") "
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.588977 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.588966 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.589034 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.589061 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.588964 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.589103 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.589069 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.589104 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.589082 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.589219 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.589258 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 
14:13:13.589283 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.589304 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.589352 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.589367 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.589384 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.589389 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.589412 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.589480 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.680766 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.682091 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.682136 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.682148 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.682173 4751 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 14:13:13 crc kubenswrapper[4751]: E1203 14:13:13.682733 4751 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.110:6443: connect: connection refused" node="crc" Dec 03 
14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.753220 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.770004 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-664e6037295a14813a4de26149ce1d80642ab54aabf776cee701e8ca5a8dabf8 WatchSource:0}: Error finding container 664e6037295a14813a4de26149ce1d80642ab54aabf776cee701e8ca5a8dabf8: Status 404 returned error can't find the container with id 664e6037295a14813a4de26149ce1d80642ab54aabf776cee701e8ca5a8dabf8 Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.773192 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.787858 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.789918 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-b063a50e772e7216651af4423743a1854c751db5906f3f369d1936f14267ef70 WatchSource:0}: Error finding container b063a50e772e7216651af4423743a1854c751db5906f3f369d1936f14267ef70: Status 404 returned error can't find the container with id b063a50e772e7216651af4423743a1854c751db5906f3f369d1936f14267ef70 Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.796667 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-500654ec1984c6fa8b75126c885af8437cc20763a5028663c436bb946a8ef17c WatchSource:0}: Error finding container 500654ec1984c6fa8b75126c885af8437cc20763a5028663c436bb946a8ef17c: Status 404 returned error can't find the container with id 500654ec1984c6fa8b75126c885af8437cc20763a5028663c436bb946a8ef17c Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.813377 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.823561 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-f315e243db2e184e138fdbb9b8bae83e973cfab646f5893b65bb0adc1a17d51b WatchSource:0}: Error finding container f315e243db2e184e138fdbb9b8bae83e973cfab646f5893b65bb0adc1a17d51b: Status 404 returned error can't find the container with id f315e243db2e184e138fdbb9b8bae83e973cfab646f5893b65bb0adc1a17d51b Dec 03 14:13:13 crc kubenswrapper[4751]: I1203 14:13:13.827562 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:13:13 crc kubenswrapper[4751]: W1203 14:13:13.843304 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-c75e23bbc1e19aa564ff9b628a0cadb32c16f94660ff76a57052fa6b1904b43c WatchSource:0}: Error finding container c75e23bbc1e19aa564ff9b628a0cadb32c16f94660ff76a57052fa6b1904b43c: Status 404 returned error can't find the container with id c75e23bbc1e19aa564ff9b628a0cadb32c16f94660ff76a57052fa6b1904b43c Dec 03 14:13:13 crc kubenswrapper[4751]: E1203 14:13:13.855512 4751 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="800ms" Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.083597 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.084950 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.084981 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.084989 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.085012 4751 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 14:13:14 crc kubenswrapper[4751]: E1203 14:13:14.085398 4751 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.110:6443: connect: 
connection refused" node="crc" Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.252158 4751 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.318320 4751 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f" exitCode=0 Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.318398 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f"} Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.318469 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c75e23bbc1e19aa564ff9b628a0cadb32c16f94660ff76a57052fa6b1904b43c"} Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.318548 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.319251 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.319272 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.319280 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.319625 4751 generic.go:334] "Generic (PLEG): 
container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3" exitCode=0 Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.319674 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3"} Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.319688 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f315e243db2e184e138fdbb9b8bae83e973cfab646f5893b65bb0adc1a17d51b"} Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.319767 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.320369 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.320413 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.320448 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.320460 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.320840 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.320866 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.320875 4751 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.321048 4751 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="5f2337f6883ef24084bfd3e6fac8e9bf3ff282d60aceb0ac325aca587cc4953d" exitCode=0 Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.321129 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"5f2337f6883ef24084bfd3e6fac8e9bf3ff282d60aceb0ac325aca587cc4953d"} Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.321161 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"500654ec1984c6fa8b75126c885af8437cc20763a5028663c436bb946a8ef17c"} Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.321242 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.321929 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.321958 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.321969 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.323256 4751 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="ddd289137933c564b4650302ce3a8be46d6d08aa5fedbd5c4d6b54e6b2cef712" exitCode=0 Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 
14:13:14.323321 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"ddd289137933c564b4650302ce3a8be46d6d08aa5fedbd5c4d6b54e6b2cef712"} Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.323367 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b063a50e772e7216651af4423743a1854c751db5906f3f369d1936f14267ef70"} Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.323440 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.324290 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.324312 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.324321 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.325051 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a"} Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.325075 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"664e6037295a14813a4de26149ce1d80642ab54aabf776cee701e8ca5a8dabf8"} Dec 03 14:13:14 crc kubenswrapper[4751]: W1203 
14:13:14.348580 4751 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 03 14:13:14 crc kubenswrapper[4751]: E1203 14:13:14.348677 4751 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError" Dec 03 14:13:14 crc kubenswrapper[4751]: W1203 14:13:14.388234 4751 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 03 14:13:14 crc kubenswrapper[4751]: E1203 14:13:14.388340 4751 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError" Dec 03 14:13:14 crc kubenswrapper[4751]: W1203 14:13:14.434568 4751 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 03 14:13:14 crc kubenswrapper[4751]: E1203 14:13:14.434653 4751 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError" Dec 03 14:13:14 crc kubenswrapper[4751]: W1203 14:13:14.522635 4751 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 03 14:13:14 crc kubenswrapper[4751]: E1203 14:13:14.522739 4751 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError" Dec 03 14:13:14 crc kubenswrapper[4751]: E1203 14:13:14.656689 4751 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="1.6s" Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.886074 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.887810 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.887853 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.887867 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:14 crc kubenswrapper[4751]: I1203 14:13:14.887900 4751 
kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 14:13:14 crc kubenswrapper[4751]: E1203 14:13:14.888488 4751 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.110:6443: connect: connection refused" node="crc" Dec 03 14:13:15 crc kubenswrapper[4751]: I1203 14:13:15.329429 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3f517a25f354bfdf8d4ca6a11b1fc689e25630c8f82d80f869ee30be87335091"} Dec 03 14:13:15 crc kubenswrapper[4751]: I1203 14:13:15.329485 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"86c32c58671beb01e05f0fc2e0bb0dd3852c731d547e7e00f68d996e4b1c82b4"} Dec 03 14:13:15 crc kubenswrapper[4751]: I1203 14:13:15.331748 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479"} Dec 03 14:13:15 crc kubenswrapper[4751]: I1203 14:13:15.331798 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc"} Dec 03 14:13:15 crc kubenswrapper[4751]: I1203 14:13:15.334101 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38"} Dec 03 14:13:15 crc 
kubenswrapper[4751]: I1203 14:13:15.334129 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30"} Dec 03 14:13:15 crc kubenswrapper[4751]: I1203 14:13:15.336209 4751 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7" exitCode=0 Dec 03 14:13:15 crc kubenswrapper[4751]: I1203 14:13:15.336299 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7"} Dec 03 14:13:15 crc kubenswrapper[4751]: I1203 14:13:15.336485 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:13:15 crc kubenswrapper[4751]: I1203 14:13:15.337965 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:15 crc kubenswrapper[4751]: I1203 14:13:15.338002 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:15 crc kubenswrapper[4751]: I1203 14:13:15.338013 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:15 crc kubenswrapper[4751]: I1203 14:13:15.339651 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"dd748cc363ad0792250139554344aae2d5ec6765a783614b02301d23a050afaf"} Dec 03 14:13:15 crc kubenswrapper[4751]: I1203 14:13:15.339737 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Dec 03 14:13:15 crc kubenswrapper[4751]: I1203 14:13:15.340525 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:15 crc kubenswrapper[4751]: I1203 14:13:15.340548 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:15 crc kubenswrapper[4751]: I1203 14:13:15.340557 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:16 crc kubenswrapper[4751]: I1203 14:13:16.349578 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0"} Dec 03 14:13:16 crc kubenswrapper[4751]: I1203 14:13:16.349810 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:13:16 crc kubenswrapper[4751]: I1203 14:13:16.351294 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:16 crc kubenswrapper[4751]: I1203 14:13:16.351316 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:16 crc kubenswrapper[4751]: I1203 14:13:16.351338 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:16 crc kubenswrapper[4751]: I1203 14:13:16.354658 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3"} Dec 03 14:13:16 crc kubenswrapper[4751]: I1203 14:13:16.354692 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2"} Dec 03 14:13:16 crc kubenswrapper[4751]: I1203 14:13:16.354700 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:13:16 crc kubenswrapper[4751]: I1203 14:13:16.354707 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459"} Dec 03 14:13:16 crc kubenswrapper[4751]: I1203 14:13:16.355294 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:16 crc kubenswrapper[4751]: I1203 14:13:16.355319 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:16 crc kubenswrapper[4751]: I1203 14:13:16.355349 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:16 crc kubenswrapper[4751]: I1203 14:13:16.357042 4751 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7" exitCode=0 Dec 03 14:13:16 crc kubenswrapper[4751]: I1203 14:13:16.357089 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7"} Dec 03 14:13:16 crc kubenswrapper[4751]: I1203 14:13:16.357174 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:13:16 crc kubenswrapper[4751]: I1203 14:13:16.357751 4751 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:16 crc kubenswrapper[4751]: I1203 14:13:16.357770 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:16 crc kubenswrapper[4751]: I1203 14:13:16.357778 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:16 crc kubenswrapper[4751]: I1203 14:13:16.359553 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1c995df1ca20af2a26b10cc901a61d16a1530c1fd574fb463cc26eb8907ce0d4"} Dec 03 14:13:16 crc kubenswrapper[4751]: I1203 14:13:16.359695 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:13:16 crc kubenswrapper[4751]: I1203 14:13:16.360806 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:16 crc kubenswrapper[4751]: I1203 14:13:16.360852 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:16 crc kubenswrapper[4751]: I1203 14:13:16.360869 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:16 crc kubenswrapper[4751]: I1203 14:13:16.488902 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:13:16 crc kubenswrapper[4751]: I1203 14:13:16.489930 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:16 crc kubenswrapper[4751]: I1203 14:13:16.489964 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:16 crc kubenswrapper[4751]: I1203 
14:13:16.489972 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:16 crc kubenswrapper[4751]: I1203 14:13:16.489995 4751 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 14:13:16 crc kubenswrapper[4751]: I1203 14:13:16.956879 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:13:17 crc kubenswrapper[4751]: I1203 14:13:17.370063 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4"} Dec 03 14:13:17 crc kubenswrapper[4751]: I1203 14:13:17.370133 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883"} Dec 03 14:13:17 crc kubenswrapper[4751]: I1203 14:13:17.370139 4751 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 14:13:17 crc kubenswrapper[4751]: I1203 14:13:17.370211 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:13:17 crc kubenswrapper[4751]: I1203 14:13:17.370232 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:13:17 crc kubenswrapper[4751]: I1203 14:13:17.370148 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8"} Dec 03 14:13:17 crc kubenswrapper[4751]: I1203 14:13:17.370350 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4"} Dec 03 14:13:17 crc kubenswrapper[4751]: I1203 14:13:17.370366 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3"} Dec 03 14:13:17 crc kubenswrapper[4751]: I1203 14:13:17.370354 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:13:17 crc kubenswrapper[4751]: I1203 14:13:17.371563 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:13:17 crc kubenswrapper[4751]: I1203 14:13:17.371846 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:17 crc kubenswrapper[4751]: I1203 14:13:17.371869 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:17 crc kubenswrapper[4751]: I1203 14:13:17.371881 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:17 crc kubenswrapper[4751]: I1203 14:13:17.371931 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:17 crc kubenswrapper[4751]: I1203 14:13:17.371981 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:17 crc kubenswrapper[4751]: I1203 14:13:17.371997 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:17 crc kubenswrapper[4751]: I1203 14:13:17.372013 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:17 
crc kubenswrapper[4751]: I1203 14:13:17.372029 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:17 crc kubenswrapper[4751]: I1203 14:13:17.372013 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:17 crc kubenswrapper[4751]: I1203 14:13:17.373291 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:17 crc kubenswrapper[4751]: I1203 14:13:17.373321 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:17 crc kubenswrapper[4751]: I1203 14:13:17.373349 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:18 crc kubenswrapper[4751]: I1203 14:13:18.123629 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:13:18 crc kubenswrapper[4751]: I1203 14:13:18.373211 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:13:18 crc kubenswrapper[4751]: I1203 14:13:18.373219 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:13:18 crc kubenswrapper[4751]: I1203 14:13:18.373298 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:13:18 crc kubenswrapper[4751]: I1203 14:13:18.378990 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:18 crc kubenswrapper[4751]: I1203 14:13:18.379036 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:18 crc kubenswrapper[4751]: I1203 14:13:18.379053 4751 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:18 crc kubenswrapper[4751]: I1203 14:13:18.379180 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:18 crc kubenswrapper[4751]: I1203 14:13:18.379196 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:18 crc kubenswrapper[4751]: I1203 14:13:18.379211 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:18 crc kubenswrapper[4751]: I1203 14:13:18.379369 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:18 crc kubenswrapper[4751]: I1203 14:13:18.379389 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:18 crc kubenswrapper[4751]: I1203 14:13:18.379397 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:18 crc kubenswrapper[4751]: I1203 14:13:18.465811 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:13:18 crc kubenswrapper[4751]: I1203 14:13:18.598178 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:13:18 crc kubenswrapper[4751]: I1203 14:13:18.961062 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 03 14:13:19 crc kubenswrapper[4751]: I1203 14:13:19.317933 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:13:19 crc kubenswrapper[4751]: I1203 14:13:19.375517 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 
03 14:13:19 crc kubenswrapper[4751]: I1203 14:13:19.375644 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:13:19 crc kubenswrapper[4751]: I1203 14:13:19.375722 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:13:19 crc kubenswrapper[4751]: I1203 14:13:19.376942 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:19 crc kubenswrapper[4751]: I1203 14:13:19.376942 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:19 crc kubenswrapper[4751]: I1203 14:13:19.377024 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:19 crc kubenswrapper[4751]: I1203 14:13:19.376942 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:19 crc kubenswrapper[4751]: I1203 14:13:19.377200 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:19 crc kubenswrapper[4751]: I1203 14:13:19.377230 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:19 crc kubenswrapper[4751]: I1203 14:13:19.377054 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:19 crc kubenswrapper[4751]: I1203 14:13:19.376977 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:19 crc kubenswrapper[4751]: I1203 14:13:19.377376 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:19 crc kubenswrapper[4751]: I1203 14:13:19.646040 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:13:20 crc kubenswrapper[4751]: I1203 14:13:20.378052 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:13:20 crc kubenswrapper[4751]: I1203 14:13:20.378059 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:13:20 crc kubenswrapper[4751]: I1203 14:13:20.378993 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:20 crc kubenswrapper[4751]: I1203 14:13:20.379019 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:20 crc kubenswrapper[4751]: I1203 14:13:20.379028 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:20 crc kubenswrapper[4751]: I1203 14:13:20.379005 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:20 crc kubenswrapper[4751]: I1203 14:13:20.379100 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:20 crc kubenswrapper[4751]: I1203 14:13:20.379119 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:21 crc kubenswrapper[4751]: I1203 14:13:21.124721 4751 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 14:13:21 crc kubenswrapper[4751]: I1203 14:13:21.125104 4751 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 14:13:23 crc kubenswrapper[4751]: E1203 14:13:23.387395 4751 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 03 14:13:23 crc kubenswrapper[4751]: I1203 14:13:23.632876 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 14:13:23 crc kubenswrapper[4751]: I1203 14:13:23.633079 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:13:23 crc kubenswrapper[4751]: I1203 14:13:23.634195 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:23 crc kubenswrapper[4751]: I1203 14:13:23.634222 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:23 crc kubenswrapper[4751]: I1203 14:13:23.634233 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:25 crc kubenswrapper[4751]: I1203 14:13:25.164409 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 03 14:13:25 crc kubenswrapper[4751]: I1203 14:13:25.164578 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:13:25 crc kubenswrapper[4751]: I1203 14:13:25.165764 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:25 crc kubenswrapper[4751]: I1203 14:13:25.165825 4751 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:25 crc kubenswrapper[4751]: I1203 14:13:25.165862 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:25 crc kubenswrapper[4751]: I1203 14:13:25.252701 4751 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 03 14:13:25 crc kubenswrapper[4751]: I1203 14:13:25.942364 4751 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 03 14:13:25 crc kubenswrapper[4751]: I1203 14:13:25.942435 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 03 14:13:25 crc kubenswrapper[4751]: I1203 14:13:25.947847 4751 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 03 14:13:25 crc kubenswrapper[4751]: I1203 14:13:25.947898 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" 
probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 03 14:13:26 crc kubenswrapper[4751]: I1203 14:13:26.121822 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:13:26 crc kubenswrapper[4751]: I1203 14:13:26.121986 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:13:26 crc kubenswrapper[4751]: I1203 14:13:26.123121 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:26 crc kubenswrapper[4751]: I1203 14:13:26.123160 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:26 crc kubenswrapper[4751]: I1203 14:13:26.123169 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:26 crc kubenswrapper[4751]: I1203 14:13:26.125692 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:13:26 crc kubenswrapper[4751]: I1203 14:13:26.390908 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:13:26 crc kubenswrapper[4751]: I1203 14:13:26.391899 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:26 crc kubenswrapper[4751]: I1203 14:13:26.391988 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:26 crc kubenswrapper[4751]: I1203 14:13:26.392006 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:26 crc kubenswrapper[4751]: I1203 14:13:26.399589 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:13:26 crc kubenswrapper[4751]: I1203 14:13:26.957842 4751 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 03 14:13:26 crc kubenswrapper[4751]: I1203 14:13:26.957964 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 03 14:13:27 crc kubenswrapper[4751]: I1203 14:13:27.392947 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:13:27 crc kubenswrapper[4751]: I1203 14:13:27.393993 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:27 crc kubenswrapper[4751]: I1203 14:13:27.394022 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:27 crc kubenswrapper[4751]: I1203 14:13:27.394035 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:28 crc kubenswrapper[4751]: I1203 14:13:28.605969 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:13:28 crc kubenswrapper[4751]: I1203 14:13:28.606154 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:13:28 crc kubenswrapper[4751]: I1203 14:13:28.606613 4751 patch_prober.go:28] interesting pod/kube-apiserver-crc 
container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 03 14:13:28 crc kubenswrapper[4751]: I1203 14:13:28.606671 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 03 14:13:28 crc kubenswrapper[4751]: I1203 14:13:28.607132 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:28 crc kubenswrapper[4751]: I1203 14:13:28.607186 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:28 crc kubenswrapper[4751]: I1203 14:13:28.607207 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:28 crc kubenswrapper[4751]: I1203 14:13:28.610865 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:13:29 crc kubenswrapper[4751]: I1203 14:13:29.397510 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:13:29 crc kubenswrapper[4751]: I1203 14:13:29.398040 4751 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 03 14:13:29 crc kubenswrapper[4751]: I1203 14:13:29.398108 4751 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 03 14:13:29 crc kubenswrapper[4751]: I1203 14:13:29.398497 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:29 crc kubenswrapper[4751]: I1203 14:13:29.398566 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:29 crc kubenswrapper[4751]: I1203 14:13:29.398587 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:30 crc kubenswrapper[4751]: I1203 14:13:30.204823 4751 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 03 14:13:30 crc kubenswrapper[4751]: I1203 14:13:30.204910 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 03 14:13:30 crc kubenswrapper[4751]: E1203 14:13:30.947635 4751 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s" Dec 03 14:13:30 crc kubenswrapper[4751]: I1203 14:13:30.949831 4751 trace.go:236] Trace[1705266464]: "Reflector ListAndWatch" 
name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 14:13:17.035) (total time: 13913ms): Dec 03 14:13:30 crc kubenswrapper[4751]: Trace[1705266464]: ---"Objects listed" error: 13913ms (14:13:30.949) Dec 03 14:13:30 crc kubenswrapper[4751]: Trace[1705266464]: [13.913837652s] [13.913837652s] END Dec 03 14:13:30 crc kubenswrapper[4751]: I1203 14:13:30.949859 4751 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 03 14:13:30 crc kubenswrapper[4751]: I1203 14:13:30.951444 4751 trace.go:236] Trace[1681881247]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 14:13:16.418) (total time: 14532ms): Dec 03 14:13:30 crc kubenswrapper[4751]: Trace[1681881247]: ---"Objects listed" error: 14532ms (14:13:30.951) Dec 03 14:13:30 crc kubenswrapper[4751]: Trace[1681881247]: [14.532682846s] [14.532682846s] END Dec 03 14:13:30 crc kubenswrapper[4751]: I1203 14:13:30.951475 4751 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 03 14:13:30 crc kubenswrapper[4751]: I1203 14:13:30.951500 4751 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 03 14:13:30 crc kubenswrapper[4751]: I1203 14:13:30.951643 4751 trace.go:236] Trace[248866688]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 14:13:17.282) (total time: 13668ms): Dec 03 14:13:30 crc kubenswrapper[4751]: Trace[248866688]: ---"Objects listed" error: 13668ms (14:13:30.951) Dec 03 14:13:30 crc kubenswrapper[4751]: Trace[248866688]: [13.668686527s] [13.668686527s] END Dec 03 14:13:30 crc kubenswrapper[4751]: I1203 14:13:30.951661 4751 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 03 14:13:30 crc kubenswrapper[4751]: E1203 14:13:30.952852 4751 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: 
autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 03 14:13:30 crc kubenswrapper[4751]: I1203 14:13:30.952918 4751 trace.go:236] Trace[851309446]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 14:13:16.915) (total time: 14037ms): Dec 03 14:13:30 crc kubenswrapper[4751]: Trace[851309446]: ---"Objects listed" error: 14037ms (14:13:30.952) Dec 03 14:13:30 crc kubenswrapper[4751]: Trace[851309446]: [14.037728515s] [14.037728515s] END Dec 03 14:13:30 crc kubenswrapper[4751]: I1203 14:13:30.952947 4751 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.124938 4751 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.125033 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.254616 4751 apiserver.go:52] "Watching apiserver" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.257456 4751 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.257712 4751 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.258030 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.258112 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.258135 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:13:31 crc kubenswrapper[4751]: E1203 14:13:31.258216 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.258238 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 14:13:31 crc kubenswrapper[4751]: E1203 14:13:31.258280 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.258636 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.258638 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:13:31 crc kubenswrapper[4751]: E1203 14:13:31.258943 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.260276 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.260634 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.260757 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.260955 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.261086 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.261263 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.261474 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.261785 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.262555 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.286090 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.316643 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.349838 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.354352 4751 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.361266 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.370442 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.378894 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.388037 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.453975 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454036 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454060 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454082 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454108 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454140 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454164 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454185 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454208 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454229 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454248 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454266 
4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454282 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454299 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454316 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454365 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454384 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454408 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454433 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454457 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454484 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454510 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454535 4751 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454558 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454580 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454603 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454622 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454641 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" 
(UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454663 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454688 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454708 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454731 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454763 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454787 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454849 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454872 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454895 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454918 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454941 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 14:13:31 crc 
kubenswrapper[4751]: I1203 14:13:31.454964 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454990 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.455014 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.455038 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.455063 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.455094 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.455119 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.455142 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.455167 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454367 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454367 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454635 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.454769 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.455014 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.455013 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.455046 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.455064 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.455106 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.455160 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.455375 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.455424 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.455498 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.455555 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.455609 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.455623 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.455659 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.455673 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.455750 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.455773 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.455834 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.455857 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.455872 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.455881 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.455189 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.455918 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.455966 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.455995 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.455997 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.456059 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.456004 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.456016 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.456025 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.456016 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.456153 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.456172 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.456178 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.456239 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.456269 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.456295 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.456343 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.456376 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" 
(UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.456400 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.456425 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.456451 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.456477 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.457117 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 
14:13:31.457148 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.457175 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.456192 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.456218 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.456315 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.456632 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.456657 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.456686 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.456937 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.456975 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.456992 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.456997 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.457008 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.457208 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.457594 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.458399 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.458442 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.458483 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.458587 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.458613 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.458636 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.458659 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.458683 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.458707 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.458732 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.458757 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.458780 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.458827 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.458854 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.458891 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.459075 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.459103 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.459133 4751 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.459159 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.459183 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.459206 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.459229 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.459255 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.459282 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.459304 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.459345 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.459369 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.459393 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 14:13:31 crc 
kubenswrapper[4751]: I1203 14:13:31.460409 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.460441 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.460457 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.458558 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.458782 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.459568 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.459575 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.459681 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.459699 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.460519 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.460653 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.460974 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.461056 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.461140 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.461286 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.461342 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.461447 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.461479 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.460473 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.461507 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.461523 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.461549 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.461576 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.461600 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.461603 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.461631 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.461658 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.461705 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.461729 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.461752 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.461774 4751 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.461773 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.461795 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.461819 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.461838 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.461857 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.461879 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.461900 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.461921 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.461941 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.461963 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 14:13:31 
crc kubenswrapper[4751]: I1203 14:13:31.461986 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462008 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462028 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462044 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462062 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462083 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: 
\"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462102 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462122 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462140 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462158 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462177 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 14:13:31 
crc kubenswrapper[4751]: I1203 14:13:31.462201 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462222 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462241 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462263 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462281 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462300 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462345 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462487 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462518 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462542 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462565 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 
14:13:31.462584 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462605 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462631 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462654 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462676 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462700 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: 
\"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462720 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462742 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462765 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462812 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462872 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 14:13:31 crc 
kubenswrapper[4751]: I1203 14:13:31.462894 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462917 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462942 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462963 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462983 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.463014 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: 
\"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.463036 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.463062 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.463083 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.463102 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.463124 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.463144 4751 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.463167 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.463186 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.463256 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.463282 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.463304 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod 
\"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.463340 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.463366 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.463389 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.463412 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.463433 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.463455 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.463475 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.463494 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.463515 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.463536 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.461794 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). 
InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462041 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462101 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462122 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462193 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462228 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462230 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.464715 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.464790 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462460 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462502 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462510 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462754 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462778 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462884 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462930 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462483 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462099 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.462965 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.463100 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.463209 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.463345 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: E1203 14:13:31.463561 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:13:31.963542996 +0000 UTC m=+18.951898213 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.464960 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.464998 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.465020 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.465047 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.465070 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.465098 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.465120 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.465142 4751 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.465154 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.465165 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.465190 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.465676 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.465701 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.465723 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.465745 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.465766 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.465786 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.465807 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.465830 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.465854 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.465879 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.465928 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.465954 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:13:31 crc 
kubenswrapper[4751]: I1203 14:13:31.465983 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466005 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466028 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466055 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466088 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466112 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466136 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466162 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466188 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466215 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466239 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466260 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466400 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466419 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466431 4751 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466445 4751 reconciler_common.go:293] "Volume detached for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466458 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466470 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466483 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466495 4751 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466508 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466520 4751 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466535 4751 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" 
DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466547 4751 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466560 4751 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466572 4751 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466583 4751 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466593 4751 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466605 4751 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466618 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466630 4751 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466641 4751 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466653 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466666 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466680 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466693 4751 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466706 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466720 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466732 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466742 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466753 4751 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466765 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466803 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466818 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466830 4751 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath 
\"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466841 4751 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466855 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466868 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466881 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466892 4751 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466905 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466917 4751 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466930 4751 
reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466942 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466953 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466964 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466978 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466989 4751 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.467001 4751 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.467014 4751 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.467026 4751 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.467037 4751 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.467052 4751 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.467065 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.467076 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.467087 4751 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.467099 4751 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.467112 4751 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.467124 4751 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.467138 4751 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.467150 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.467165 4751 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.467180 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.467192 4751 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.467205 4751 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.467218 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.467230 4751 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.465161 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.465285 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.463659 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.463800 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.463999 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.464510 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.465569 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.465605 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.465622 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.465633 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.465743 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.465848 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.465702 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466051 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466208 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466315 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466360 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: E1203 14:13:31.466437 4751 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.468588 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: E1203 14:13:31.468643 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 14:13:31.9686205 +0000 UTC m=+18.956975717 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.468706 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.468857 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.468853 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.469108 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.469127 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.469180 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.469225 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.469310 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.469460 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466932 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466976 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.463575 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.466988 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.467032 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.467094 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.468888 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.467364 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.467422 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.467454 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.467999 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.468019 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.468129 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.468071 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.468160 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.469541 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.469646 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.469991 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.470135 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.470432 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.470657 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.470814 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.470840 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.470931 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.470894 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.470984 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.471119 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 14:13:31 crc kubenswrapper[4751]: E1203 14:13:31.471170 4751 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 14:13:31 crc kubenswrapper[4751]: E1203 14:13:31.471258 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 14:13:31.971227398 +0000 UTC m=+18.959582615 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.471270 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.471615 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.467442 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.471715 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.472883 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.472884 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.474570 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.474214 4751 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.474648 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.483305 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.484690 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.484946 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.484830 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.485117 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.485209 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.485381 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.485957 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: E1203 14:13:31.486238 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 14:13:31 crc kubenswrapper[4751]: E1203 14:13:31.486261 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 14:13:31 crc kubenswrapper[4751]: E1203 14:13:31.486272 4751 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:13:31 crc kubenswrapper[4751]: E1203 14:13:31.486350 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 14:13:31.986315404 +0000 UTC m=+18.974670621 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.486590 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.486735 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.486725 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.487074 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.487432 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.487643 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.487776 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.488047 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: E1203 14:13:31.488081 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 14:13:31 crc kubenswrapper[4751]: E1203 14:13:31.488102 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 14:13:31 crc kubenswrapper[4751]: E1203 14:13:31.488121 4751 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.488371 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.488386 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: E1203 14:13:31.488636 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 14:13:31.988610664 +0000 UTC m=+18.976965881 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.492793 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.492985 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.493018 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.493069 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.493086 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.493134 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.493449 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.493470 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.493510 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.493535 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.493590 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.493603 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.493618 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.493675 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.493842 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.493888 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.494066 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.494148 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.494252 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.495630 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.496341 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.496393 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.496453 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.496506 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.496757 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.496938 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.496967 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.497453 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.497578 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.497894 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.497942 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.499568 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.510359 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.518844 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.519522 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568442 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568495 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568520 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod 
\"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568548 4751 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568559 4751 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568569 4751 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568577 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568587 4751 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568595 4751 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568604 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 
14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568612 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568620 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568628 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568637 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568644 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568653 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568661 4751 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568670 4751 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568679 4751 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568687 4751 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568694 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568703 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568711 4751 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568719 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568727 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568735 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568743 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568751 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568760 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568667 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568768 4751 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568802 4751 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568813 4751 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568825 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568835 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568845 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568854 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568862 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568871 4751 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568881 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568891 4751 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568899 4751 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568908 4751 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568916 4751 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568925 4751 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568963 4751 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568972 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568980 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568989 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.568997 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569006 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569016 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569024 4751 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: 
I1203 14:13:31.569033 4751 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569041 4751 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569050 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569058 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569068 4751 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569080 4751 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569089 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569098 4751 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569106 4751 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569115 4751 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569124 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569133 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569143 4751 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569151 4751 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569162 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" 
(UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569171 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569180 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569189 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569198 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569207 4751 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569215 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569225 4751 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569234 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569242 4751 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569250 4751 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569258 4751 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569266 4751 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569275 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569284 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" 
DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569292 4751 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569300 4751 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569308 4751 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569317 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569339 4751 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569349 4751 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569357 4751 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: 
I1203 14:13:31.569366 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569374 4751 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569382 4751 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569390 4751 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569399 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569409 4751 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569418 4751 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569427 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569436 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569445 4751 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569453 4751 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569461 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569469 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569477 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569485 4751 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" 
DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569493 4751 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569507 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569515 4751 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569524 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569532 4751 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569540 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569548 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: 
I1203 14:13:31.569557 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569567 4751 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569575 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569584 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569592 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569600 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569608 4751 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569616 4751 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569624 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569633 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569642 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569650 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569659 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569668 4751 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569676 4751 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on 
node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569684 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569693 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569701 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569710 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569718 4751 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569726 4751 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.569735 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 
14:13:31.571404 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.585070 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.599433 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 14:13:31 crc kubenswrapper[4751]: W1203 14:13:31.604253 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-e8b8c42c2c76ce9e6a2c7d74e3774298ae779df55e1b21736226aacca8f2a208 WatchSource:0}: Error finding container e8b8c42c2c76ce9e6a2c7d74e3774298ae779df55e1b21736226aacca8f2a208: Status 404 returned error can't find the container with id e8b8c42c2c76ce9e6a2c7d74e3774298ae779df55e1b21736226aacca8f2a208 Dec 03 14:13:31 crc kubenswrapper[4751]: W1203 14:13:31.612230 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-64239e74ced9a751e0eea5be2604c60ae665446561f64d74758b16413d2313b7 WatchSource:0}: Error finding container 64239e74ced9a751e0eea5be2604c60ae665446561f64d74758b16413d2313b7: Status 404 returned error can't find the container with id 64239e74ced9a751e0eea5be2604c60ae665446561f64d74758b16413d2313b7 Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.973978 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.974093 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:13:31 crc kubenswrapper[4751]: E1203 14:13:31.974114 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:13:32.974094969 +0000 UTC m=+19.962450186 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:13:31 crc kubenswrapper[4751]: I1203 14:13:31.974170 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:13:31 crc kubenswrapper[4751]: E1203 14:13:31.974189 4751 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 14:13:31 crc kubenswrapper[4751]: E1203 14:13:31.974257 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 14:13:32.974223232 +0000 UTC m=+19.962578449 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 14:13:31 crc kubenswrapper[4751]: E1203 14:13:31.974302 4751 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 14:13:31 crc kubenswrapper[4751]: E1203 14:13:31.974375 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 14:13:32.974364266 +0000 UTC m=+19.962719473 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 14:13:32 crc kubenswrapper[4751]: I1203 14:13:32.075635 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:13:32 crc kubenswrapper[4751]: I1203 14:13:32.075786 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:13:32 crc kubenswrapper[4751]: E1203 14:13:32.075836 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 14:13:32 crc kubenswrapper[4751]: E1203 14:13:32.075875 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 14:13:32 crc kubenswrapper[4751]: E1203 14:13:32.075892 4751 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:13:32 crc kubenswrapper[4751]: E1203 14:13:32.075974 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 14:13:33.075950533 +0000 UTC m=+20.064305930 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:13:32 crc kubenswrapper[4751]: E1203 14:13:32.076008 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 14:13:32 crc kubenswrapper[4751]: E1203 14:13:32.076062 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 14:13:32 crc kubenswrapper[4751]: E1203 14:13:32.076081 4751 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:13:32 crc kubenswrapper[4751]: E1203 14:13:32.076160 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-12-03 14:13:33.076140798 +0000 UTC m=+20.064496015 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:13:32 crc kubenswrapper[4751]: I1203 14:13:32.406954 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916"} Dec 03 14:13:32 crc kubenswrapper[4751]: I1203 14:13:32.406997 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"cf0c170dc42f68fb13f50edaf353dd0a6dadc37ad3bad4a331877569199f96d4"} Dec 03 14:13:32 crc kubenswrapper[4751]: I1203 14:13:32.409937 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 03 14:13:32 crc kubenswrapper[4751]: I1203 14:13:32.412281 4751 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3" exitCode=255 Dec 03 14:13:32 crc kubenswrapper[4751]: I1203 14:13:32.412317 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3"} Dec 03 14:13:32 crc kubenswrapper[4751]: I1203 14:13:32.414997 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9"} Dec 03 14:13:32 crc kubenswrapper[4751]: I1203 14:13:32.415089 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b"} Dec 03 14:13:32 crc kubenswrapper[4751]: I1203 14:13:32.415132 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"64239e74ced9a751e0eea5be2604c60ae665446561f64d74758b16413d2313b7"} Dec 03 14:13:32 crc kubenswrapper[4751]: I1203 14:13:32.417114 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"e8b8c42c2c76ce9e6a2c7d74e3774298ae779df55e1b21736226aacca8f2a208"} Dec 03 14:13:32 crc kubenswrapper[4751]: I1203 14:13:32.422579 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 14:13:32 crc kubenswrapper[4751]: I1203 14:13:32.423104 4751 scope.go:117] "RemoveContainer" containerID="c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3" Dec 03 14:13:32 crc kubenswrapper[4751]: I1203 14:13:32.427466 4751 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:32Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:32 crc kubenswrapper[4751]: I1203 14:13:32.458321 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:32Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:32 crc kubenswrapper[4751]: I1203 14:13:32.503076 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:32Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:32 crc kubenswrapper[4751]: I1203 14:13:32.533193 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:32Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:32 crc kubenswrapper[4751]: I1203 14:13:32.553397 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:32Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:32 crc kubenswrapper[4751]: I1203 14:13:32.570313 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:32Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:32 crc kubenswrapper[4751]: I1203 14:13:32.584114 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:32Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:32 crc kubenswrapper[4751]: I1203 14:13:32.600463 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:32Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:32 crc kubenswrapper[4751]: I1203 14:13:32.618796 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:32Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:32 crc kubenswrapper[4751]: I1203 14:13:32.637809 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:32Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:32 crc kubenswrapper[4751]: I1203 14:13:32.654743 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:32Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:32 crc kubenswrapper[4751]: I1203 14:13:32.668847 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:32Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:32 crc kubenswrapper[4751]: I1203 14:13:32.687018 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:32Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:32 crc kubenswrapper[4751]: I1203 14:13:32.985111 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:13:32 crc kubenswrapper[4751]: I1203 14:13:32.985213 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:13:32 crc kubenswrapper[4751]: I1203 14:13:32.985255 4751 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:13:32 crc kubenswrapper[4751]: E1203 14:13:32.985309 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:13:34.985291523 +0000 UTC m=+21.973646740 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:13:32 crc kubenswrapper[4751]: E1203 14:13:32.985376 4751 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 14:13:32 crc kubenswrapper[4751]: E1203 14:13:32.985426 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 14:13:34.985415447 +0000 UTC m=+21.973770674 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 14:13:32 crc kubenswrapper[4751]: E1203 14:13:32.985470 4751 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 14:13:32 crc kubenswrapper[4751]: E1203 14:13:32.985566 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 14:13:34.98554834 +0000 UTC m=+21.973903597 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.085918 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.085972 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:13:33 crc kubenswrapper[4751]: E1203 14:13:33.086076 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 14:13:33 crc kubenswrapper[4751]: E1203 14:13:33.086091 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 14:13:33 crc kubenswrapper[4751]: E1203 14:13:33.086094 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 14:13:33 crc kubenswrapper[4751]: E1203 14:13:33.086161 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 14:13:33 crc kubenswrapper[4751]: E1203 14:13:33.086188 4751 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:13:33 crc kubenswrapper[4751]: E1203 14:13:33.086101 4751 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:13:33 crc kubenswrapper[4751]: E1203 
14:13:33.086247 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 14:13:35.086228942 +0000 UTC m=+22.074584229 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:13:33 crc kubenswrapper[4751]: E1203 14:13:33.086281 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 14:13:35.086268023 +0000 UTC m=+22.074623240 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.313226 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.313284 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:13:33 crc kubenswrapper[4751]: E1203 14:13:33.313357 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:13:33 crc kubenswrapper[4751]: E1203 14:13:33.313432 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.313478 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:13:33 crc kubenswrapper[4751]: E1203 14:13:33.313537 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.316484 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.317159 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.317792 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.318415 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.318968 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.319467 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.320060 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.320577 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.321155 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.323109 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.323580 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.324611 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.325242 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.326220 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.326718 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.327217 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.335184 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.335575 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.335908 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:33Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.336757 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.337373 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.337784 4751 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.338749 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.339253 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.340519 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.341082 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.342371 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.343052 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.343568 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.344585 4751 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.345182 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.346283 4751 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.346418 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.348194 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.349265 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.349809 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.351478 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 03 14:13:33 
crc kubenswrapper[4751]: I1203 14:13:33.352193 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.353193 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.353972 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.355121 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.355810 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.356902 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.357676 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:33Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.358005 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.358764 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.359675 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.360283 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.361257 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.362087 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.362606 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.363554 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.364058 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.365047 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.365765 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.366258 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.369388 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:33Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.380590 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:33Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.394173 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:33Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.405999 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:33Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.418653 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:33Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.420964 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.422770 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0"} Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.423064 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.437393 4751 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:33Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.449488 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:33Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.462436 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:33Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.484283 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:33Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.496940 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:33Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.507770 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:33Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:33 crc kubenswrapper[4751]: I1203 14:13:33.519162 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:33Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.152960 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.155838 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.155893 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.155906 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.155969 4751 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.162174 4751 kubelet_node_status.go:115] "Node was 
previously registered" node="crc" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.162450 4751 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.163421 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.163443 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.163451 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.163464 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.163473 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:34Z","lastTransitionTime":"2025-12-03T14:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:34 crc kubenswrapper[4751]: E1203 14:13:34.178040 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:34Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.180676 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.180703 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.180712 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.180725 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.180734 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:34Z","lastTransitionTime":"2025-12-03T14:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:34 crc kubenswrapper[4751]: E1203 14:13:34.191621 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:34Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.194222 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.194265 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.194274 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.194285 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.194293 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:34Z","lastTransitionTime":"2025-12-03T14:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:34 crc kubenswrapper[4751]: E1203 14:13:34.204131 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:34Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.207436 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.207473 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.207490 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.207513 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.207531 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:34Z","lastTransitionTime":"2025-12-03T14:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:34 crc kubenswrapper[4751]: E1203 14:13:34.218515 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:34Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.221636 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.221679 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.221691 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.221708 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.221722 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:34Z","lastTransitionTime":"2025-12-03T14:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:34 crc kubenswrapper[4751]: E1203 14:13:34.232082 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:34Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:34 crc kubenswrapper[4751]: E1203 14:13:34.232240 4751 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.233752 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.233783 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.233794 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.233829 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.233842 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:34Z","lastTransitionTime":"2025-12-03T14:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.336384 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.336447 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.336463 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.336489 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.336507 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:34Z","lastTransitionTime":"2025-12-03T14:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.430248 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d"} Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.438176 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.438222 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.438234 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.438251 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.438263 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:34Z","lastTransitionTime":"2025-12-03T14:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.446904 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:34Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.460630 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:34Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.475260 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:34Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.487067 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:34Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.502452 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:34Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.516467 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:34Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.528286 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:34Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.541299 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.541360 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.541369 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.541387 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.541397 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:34Z","lastTransitionTime":"2025-12-03T14:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.643616 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.643649 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.643657 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.643670 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.643678 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:34Z","lastTransitionTime":"2025-12-03T14:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.746157 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.746513 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.746653 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.746815 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.746928 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:34Z","lastTransitionTime":"2025-12-03T14:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.849776 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.850061 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.850121 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.850202 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.850288 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:34Z","lastTransitionTime":"2025-12-03T14:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.953137 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.953173 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.953191 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.953208 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:34 crc kubenswrapper[4751]: I1203 14:13:34.953252 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:34Z","lastTransitionTime":"2025-12-03T14:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.003820 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.003899 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.003928 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:13:35 crc kubenswrapper[4751]: E1203 14:13:35.004030 4751 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 14:13:35 crc kubenswrapper[4751]: E1203 14:13:35.004051 4751 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 14:13:35 crc kubenswrapper[4751]: E1203 14:13:35.004096 4751 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:13:39.004062597 +0000 UTC m=+25.992417854 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:13:35 crc kubenswrapper[4751]: E1203 14:13:35.004162 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 14:13:39.00414395 +0000 UTC m=+25.992499207 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 14:13:35 crc kubenswrapper[4751]: E1203 14:13:35.004203 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 14:13:39.004187981 +0000 UTC m=+25.992543228 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.056006 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.056044 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.056053 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.056069 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.056091 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:35Z","lastTransitionTime":"2025-12-03T14:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.105241 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.105383 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:13:35 crc kubenswrapper[4751]: E1203 14:13:35.105560 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 14:13:35 crc kubenswrapper[4751]: E1203 14:13:35.105691 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 14:13:35 crc kubenswrapper[4751]: E1203 14:13:35.105714 4751 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:13:35 crc kubenswrapper[4751]: E1203 14:13:35.105571 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 14:13:35 crc 
kubenswrapper[4751]: E1203 14:13:35.105793 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 14:13:39.105768787 +0000 UTC m=+26.094124034 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:13:35 crc kubenswrapper[4751]: E1203 14:13:35.105811 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 14:13:35 crc kubenswrapper[4751]: E1203 14:13:35.105837 4751 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:13:35 crc kubenswrapper[4751]: E1203 14:13:35.105909 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 14:13:39.10588455 +0000 UTC m=+26.094239807 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.158072 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.158108 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.158118 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.158131 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.158141 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:35Z","lastTransitionTime":"2025-12-03T14:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.186761 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.202828 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.204972 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.209371 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.222813 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.235579 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.247354 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.259845 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.260870 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.260912 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.260923 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.260939 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.260950 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:35Z","lastTransitionTime":"2025-12-03T14:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.271318 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.285192 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.299863 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.313418 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.313478 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:13:35 crc kubenswrapper[4751]: E1203 14:13:35.313541 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.313617 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:13:35 crc kubenswrapper[4751]: E1203 14:13:35.313784 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:13:35 crc kubenswrapper[4751]: E1203 14:13:35.313869 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.317545 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.331937 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.346374 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.359742 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.363579 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.363635 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.363648 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.363680 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.363694 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:35Z","lastTransitionTime":"2025-12-03T14:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.383041 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.396141 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.408913 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.466840 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.466899 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.466915 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.466936 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.466951 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:35Z","lastTransitionTime":"2025-12-03T14:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.568933 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.568976 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.568986 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.568999 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.569009 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:35Z","lastTransitionTime":"2025-12-03T14:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.671110 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.671149 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.671157 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.671172 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.671180 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:35Z","lastTransitionTime":"2025-12-03T14:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.773782 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.773828 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.773839 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.773923 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.773943 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:35Z","lastTransitionTime":"2025-12-03T14:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.876004 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.876049 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.876060 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.876078 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.876091 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:35Z","lastTransitionTime":"2025-12-03T14:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.881888 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-g8hvv"] Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.882238 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-g8hvv" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.888855 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.888897 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.889016 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.893867 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-hmjzc"] Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.894149 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hmjzc" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.894901 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.896045 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.897483 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.898104 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.912937 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5059484e-1748-4e33-89c0-3d5e57b97489-serviceca\") pod \"node-ca-g8hvv\" (UID: 
\"5059484e-1748-4e33-89c0-3d5e57b97489\") " pod="openshift-image-registry/node-ca-g8hvv" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.912997 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5059484e-1748-4e33-89c0-3d5e57b97489-host\") pod \"node-ca-g8hvv\" (UID: \"5059484e-1748-4e33-89c0-3d5e57b97489\") " pod="openshift-image-registry/node-ca-g8hvv" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.913089 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ljv7\" (UniqueName: \"kubernetes.io/projected/5059484e-1748-4e33-89c0-3d5e57b97489-kube-api-access-7ljv7\") pod \"node-ca-g8hvv\" (UID: \"5059484e-1748-4e33-89c0-3d5e57b97489\") " pod="openshift-image-registry/node-ca-g8hvv" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.932273 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.944412 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.955576 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-98mjq"] Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.955929 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-98mjq" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.958206 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.958266 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.958459 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.958964 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.959954 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.968543 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.977976 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.978016 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.978025 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.978038 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:35 crc kubenswrapper[4751]: I1203 14:13:35.978048 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:35Z","lastTransitionTime":"2025-12-03T14:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.000774 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.014440 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-os-release\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.014482 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-host-run-netns\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.014514 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-etc-kubernetes\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.014571 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-cnibin\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.014627 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-host-var-lib-cni-bin\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.014644 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-hostroot\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.014690 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w565t\" (UniqueName: 
\"kubernetes.io/projected/6a216adb-632d-4134-8c61-61fe6b8c5f71-kube-api-access-w565t\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.014722 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5059484e-1748-4e33-89c0-3d5e57b97489-serviceca\") pod \"node-ca-g8hvv\" (UID: \"5059484e-1748-4e33-89c0-3d5e57b97489\") " pod="openshift-image-registry/node-ca-g8hvv" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.014740 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-multus-conf-dir\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.014775 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hcd6\" (UniqueName: \"kubernetes.io/projected/ba1da9df-c8bd-41d0-bc00-09688e27984e-kube-api-access-6hcd6\") pod \"node-resolver-hmjzc\" (UID: \"ba1da9df-c8bd-41d0-bc00-09688e27984e\") " pod="openshift-dns/node-resolver-hmjzc" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.014794 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-system-cni-dir\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.014812 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-host-run-multus-certs\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.014831 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5059484e-1748-4e33-89c0-3d5e57b97489-host\") pod \"node-ca-g8hvv\" (UID: \"5059484e-1748-4e33-89c0-3d5e57b97489\") " pod="openshift-image-registry/node-ca-g8hvv" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.014847 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ljv7\" (UniqueName: \"kubernetes.io/projected/5059484e-1748-4e33-89c0-3d5e57b97489-kube-api-access-7ljv7\") pod \"node-ca-g8hvv\" (UID: \"5059484e-1748-4e33-89c0-3d5e57b97489\") " pod="openshift-image-registry/node-ca-g8hvv" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.014865 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ba1da9df-c8bd-41d0-bc00-09688e27984e-hosts-file\") pod \"node-resolver-hmjzc\" (UID: \"ba1da9df-c8bd-41d0-bc00-09688e27984e\") " pod="openshift-dns/node-resolver-hmjzc" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.014883 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-multus-socket-dir-parent\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.014901 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-host-run-k8s-cni-cncf-io\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.014916 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-host-var-lib-kubelet\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.014943 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-multus-cni-dir\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.014960 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6a216adb-632d-4134-8c61-61fe6b8c5f71-cni-binary-copy\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.014975 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-host-var-lib-cni-multus\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.014991 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/6a216adb-632d-4134-8c61-61fe6b8c5f71-multus-daemon-config\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.015104 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5059484e-1748-4e33-89c0-3d5e57b97489-host\") pod \"node-ca-g8hvv\" (UID: \"5059484e-1748-4e33-89c0-3d5e57b97489\") " pod="openshift-image-registry/node-ca-g8hvv" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.015688 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5059484e-1748-4e33-89c0-3d5e57b97489-serviceca\") pod \"node-ca-g8hvv\" (UID: \"5059484e-1748-4e33-89c0-3d5e57b97489\") " pod="openshift-image-registry/node-ca-g8hvv" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.027158 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.045369 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.048553 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ljv7\" (UniqueName: \"kubernetes.io/projected/5059484e-1748-4e33-89c0-3d5e57b97489-kube-api-access-7ljv7\") pod \"node-ca-g8hvv\" (UID: \"5059484e-1748-4e33-89c0-3d5e57b97489\") " pod="openshift-image-registry/node-ca-g8hvv" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.081522 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.081563 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.081575 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:36 crc 
kubenswrapper[4751]: I1203 14:13:36.081591 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.081604 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:36Z","lastTransitionTime":"2025-12-03T14:13:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.084210 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.103559 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.115241 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.115356 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-os-release\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.115383 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-host-run-netns\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.115400 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-etc-kubernetes\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.115438 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-cnibin\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.115457 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-host-var-lib-cni-bin\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.115477 4751 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-hostroot\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.115496 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w565t\" (UniqueName: \"kubernetes.io/projected/6a216adb-632d-4134-8c61-61fe6b8c5f71-kube-api-access-w565t\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.115521 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-multus-conf-dir\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.115516 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-etc-kubernetes\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.115542 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-host-run-multus-certs\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.115552 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-host-var-lib-cni-bin\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.115522 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-host-run-netns\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.115577 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-host-run-multus-certs\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.115549 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-os-release\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.115614 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-hostroot\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.115605 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-cnibin\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 
14:13:36.115581 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-multus-conf-dir\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.115723 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hcd6\" (UniqueName: \"kubernetes.io/projected/ba1da9df-c8bd-41d0-bc00-09688e27984e-kube-api-access-6hcd6\") pod \"node-resolver-hmjzc\" (UID: \"ba1da9df-c8bd-41d0-bc00-09688e27984e\") " pod="openshift-dns/node-resolver-hmjzc" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.115798 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-system-cni-dir\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.115825 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-host-run-k8s-cni-cncf-io\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.115855 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ba1da9df-c8bd-41d0-bc00-09688e27984e-hosts-file\") pod \"node-resolver-hmjzc\" (UID: \"ba1da9df-c8bd-41d0-bc00-09688e27984e\") " pod="openshift-dns/node-resolver-hmjzc" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.115883 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" 
(UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-multus-socket-dir-parent\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.115897 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-host-run-k8s-cni-cncf-io\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.115907 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-host-var-lib-kubelet\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.115943 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-host-var-lib-kubelet\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.115955 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6a216adb-632d-4134-8c61-61fe6b8c5f71-cni-binary-copy\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.115984 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-multus-cni-dir\") pod \"multus-98mjq\" (UID: 
\"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.116002 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-host-var-lib-cni-multus\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.116010 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ba1da9df-c8bd-41d0-bc00-09688e27984e-hosts-file\") pod \"node-resolver-hmjzc\" (UID: \"ba1da9df-c8bd-41d0-bc00-09688e27984e\") " pod="openshift-dns/node-resolver-hmjzc" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.116018 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-system-cni-dir\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.116072 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-multus-socket-dir-parent\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.116025 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6a216adb-632d-4134-8c61-61fe6b8c5f71-multus-daemon-config\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.116132 4751 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-host-var-lib-cni-multus\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.116189 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a216adb-632d-4134-8c61-61fe6b8c5f71-multus-cni-dir\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.116780 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6a216adb-632d-4134-8c61-61fe6b8c5f71-multus-daemon-config\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.116808 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6a216adb-632d-4134-8c61-61fe6b8c5f71-cni-binary-copy\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.130127 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.141963 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hcd6\" (UniqueName: \"kubernetes.io/projected/ba1da9df-c8bd-41d0-bc00-09688e27984e-kube-api-access-6hcd6\") pod \"node-resolver-hmjzc\" (UID: \"ba1da9df-c8bd-41d0-bc00-09688e27984e\") " pod="openshift-dns/node-resolver-hmjzc" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.143368 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w565t\" (UniqueName: \"kubernetes.io/projected/6a216adb-632d-4134-8c61-61fe6b8c5f71-kube-api-access-w565t\") pod \"multus-98mjq\" (UID: \"6a216adb-632d-4134-8c61-61fe6b8c5f71\") " pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.148354 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.161500 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.184773 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.184815 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.184826 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.184844 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.184857 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:36Z","lastTransitionTime":"2025-12-03T14:13:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.192826 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.194765 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-g8hvv" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.205002 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hmjzc" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.206232 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers 
with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: W1203 14:13:36.207715 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5059484e_1748_4e33_89c0_3d5e57b97489.slice/crio-56f66890663313dff0d5e1db3aca7c78db14683e5afc848a180126378c939766 WatchSource:0}: Error finding container 56f66890663313dff0d5e1db3aca7c78db14683e5afc848a180126378c939766: Status 404 returned error can't find the container with id 56f66890663313dff0d5e1db3aca7c78db14683e5afc848a180126378c939766 Dec 03 
14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.221806 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.232114 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.243941 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.260350 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.267267 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-98mjq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.278169 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.287125 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.287185 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.287200 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.287217 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.287229 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:36Z","lastTransitionTime":"2025-12-03T14:13:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.290021 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.367439 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-djf67"] Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.367882 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-djf67" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.370145 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.371741 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.371802 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.371871 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.371746 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.374037 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-t8q27"] Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.374675 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-t8q27" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.376425 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.376624 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.387104 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.390684 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.390711 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.390720 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.390737 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.390748 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:36Z","lastTransitionTime":"2025-12-03T14:13:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.398455 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.409366 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.418575 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8-os-release\") pod \"multus-additional-cni-plugins-t8q27\" (UID: \"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\") " pod="openshift-multus/multus-additional-cni-plugins-t8q27" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.418626 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-t8q27\" (UID: \"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\") " pod="openshift-multus/multus-additional-cni-plugins-t8q27" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.418649 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/385620eb-744d-423e-b02b-1274f3075689-proxy-tls\") pod \"machine-config-daemon-djf67\" (UID: \"385620eb-744d-423e-b02b-1274f3075689\") " pod="openshift-machine-config-operator/machine-config-daemon-djf67" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.418672 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/385620eb-744d-423e-b02b-1274f3075689-mcd-auth-proxy-config\") pod \"machine-config-daemon-djf67\" (UID: \"385620eb-744d-423e-b02b-1274f3075689\") " pod="openshift-machine-config-operator/machine-config-daemon-djf67" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.418696 4751 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-t8q27\" (UID: \"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\") " pod="openshift-multus/multus-additional-cni-plugins-t8q27" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.418782 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/385620eb-744d-423e-b02b-1274f3075689-rootfs\") pod \"machine-config-daemon-djf67\" (UID: \"385620eb-744d-423e-b02b-1274f3075689\") " pod="openshift-machine-config-operator/machine-config-daemon-djf67" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.418839 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm85t\" (UniqueName: \"kubernetes.io/projected/385620eb-744d-423e-b02b-1274f3075689-kube-api-access-vm85t\") pod \"machine-config-daemon-djf67\" (UID: \"385620eb-744d-423e-b02b-1274f3075689\") " pod="openshift-machine-config-operator/machine-config-daemon-djf67" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.418918 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8-cni-binary-copy\") pod \"multus-additional-cni-plugins-t8q27\" (UID: \"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\") " pod="openshift-multus/multus-additional-cni-plugins-t8q27" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.418967 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8-system-cni-dir\") pod \"multus-additional-cni-plugins-t8q27\" (UID: 
\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\") " pod="openshift-multus/multus-additional-cni-plugins-t8q27" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.419022 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gzr8\" (UniqueName: \"kubernetes.io/projected/a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8-kube-api-access-9gzr8\") pod \"multus-additional-cni-plugins-t8q27\" (UID: \"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\") " pod="openshift-multus/multus-additional-cni-plugins-t8q27" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.419051 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8-cnibin\") pod \"multus-additional-cni-plugins-t8q27\" (UID: \"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\") " pod="openshift-multus/multus-additional-cni-plugins-t8q27" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.423013 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.435728 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.436490 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-g8hvv" event={"ID":"5059484e-1748-4e33-89c0-3d5e57b97489","Type":"ContainerStarted","Data":"56f66890663313dff0d5e1db3aca7c78db14683e5afc848a180126378c939766"} Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.455217 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.466490 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.477865 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.490859 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.492609 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.492644 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.492653 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.492665 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.492682 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:36Z","lastTransitionTime":"2025-12-03T14:13:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.506144 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.518558 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.519845 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-t8q27\" (UID: \"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\") " pod="openshift-multus/multus-additional-cni-plugins-t8q27" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.519888 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/385620eb-744d-423e-b02b-1274f3075689-proxy-tls\") pod \"machine-config-daemon-djf67\" (UID: \"385620eb-744d-423e-b02b-1274f3075689\") " pod="openshift-machine-config-operator/machine-config-daemon-djf67" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.519912 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/385620eb-744d-423e-b02b-1274f3075689-mcd-auth-proxy-config\") pod \"machine-config-daemon-djf67\" (UID: \"385620eb-744d-423e-b02b-1274f3075689\") " pod="openshift-machine-config-operator/machine-config-daemon-djf67" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.519936 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-t8q27\" (UID: \"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\") " pod="openshift-multus/multus-additional-cni-plugins-t8q27" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.519972 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8-cni-binary-copy\") pod \"multus-additional-cni-plugins-t8q27\" (UID: \"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\") " pod="openshift-multus/multus-additional-cni-plugins-t8q27" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.519992 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/385620eb-744d-423e-b02b-1274f3075689-rootfs\") pod \"machine-config-daemon-djf67\" (UID: \"385620eb-744d-423e-b02b-1274f3075689\") " pod="openshift-machine-config-operator/machine-config-daemon-djf67" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.520012 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm85t\" (UniqueName: \"kubernetes.io/projected/385620eb-744d-423e-b02b-1274f3075689-kube-api-access-vm85t\") pod \"machine-config-daemon-djf67\" (UID: \"385620eb-744d-423e-b02b-1274f3075689\") " pod="openshift-machine-config-operator/machine-config-daemon-djf67" Dec 03 
14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.520069 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8-system-cni-dir\") pod \"multus-additional-cni-plugins-t8q27\" (UID: \"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\") " pod="openshift-multus/multus-additional-cni-plugins-t8q27" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.520092 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gzr8\" (UniqueName: \"kubernetes.io/projected/a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8-kube-api-access-9gzr8\") pod \"multus-additional-cni-plugins-t8q27\" (UID: \"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\") " pod="openshift-multus/multus-additional-cni-plugins-t8q27" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.520475 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8-system-cni-dir\") pod \"multus-additional-cni-plugins-t8q27\" (UID: \"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\") " pod="openshift-multus/multus-additional-cni-plugins-t8q27" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.520523 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8-cnibin\") pod \"multus-additional-cni-plugins-t8q27\" (UID: \"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\") " pod="openshift-multus/multus-additional-cni-plugins-t8q27" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.520604 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8-os-release\") pod \"multus-additional-cni-plugins-t8q27\" (UID: \"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\") " 
pod="openshift-multus/multus-additional-cni-plugins-t8q27" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.520719 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8-os-release\") pod \"multus-additional-cni-plugins-t8q27\" (UID: \"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\") " pod="openshift-multus/multus-additional-cni-plugins-t8q27" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.520749 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8-cnibin\") pod \"multus-additional-cni-plugins-t8q27\" (UID: \"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\") " pod="openshift-multus/multus-additional-cni-plugins-t8q27" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.521092 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8-cni-binary-copy\") pod \"multus-additional-cni-plugins-t8q27\" (UID: \"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\") " pod="openshift-multus/multus-additional-cni-plugins-t8q27" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.521251 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-t8q27\" (UID: \"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\") " pod="openshift-multus/multus-additional-cni-plugins-t8q27" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.521296 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/385620eb-744d-423e-b02b-1274f3075689-mcd-auth-proxy-config\") pod \"machine-config-daemon-djf67\" (UID: 
\"385620eb-744d-423e-b02b-1274f3075689\") " pod="openshift-machine-config-operator/machine-config-daemon-djf67" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.523341 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/385620eb-744d-423e-b02b-1274f3075689-proxy-tls\") pod \"machine-config-daemon-djf67\" (UID: \"385620eb-744d-423e-b02b-1274f3075689\") " pod="openshift-machine-config-operator/machine-config-daemon-djf67" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.523466 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/385620eb-744d-423e-b02b-1274f3075689-rootfs\") pod \"machine-config-daemon-djf67\" (UID: \"385620eb-744d-423e-b02b-1274f3075689\") " pod="openshift-machine-config-operator/machine-config-daemon-djf67" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.533607 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.540471 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gzr8\" (UniqueName: \"kubernetes.io/projected/a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8-kube-api-access-9gzr8\") pod \"multus-additional-cni-plugins-t8q27\" (UID: \"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\") " pod="openshift-multus/multus-additional-cni-plugins-t8q27" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.542437 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vm85t\" (UniqueName: \"kubernetes.io/projected/385620eb-744d-423e-b02b-1274f3075689-kube-api-access-vm85t\") pod \"machine-config-daemon-djf67\" (UID: \"385620eb-744d-423e-b02b-1274f3075689\") " pod="openshift-machine-config-operator/machine-config-daemon-djf67" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.555891 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.578809 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.594729 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.594759 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.594766 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.594778 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.594786 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:36Z","lastTransitionTime":"2025-12-03T14:13:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.600031 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.612060 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.624059 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.635539 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.644449 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.662308 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc4
90701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.671534 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.680624 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.681729 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-djf67" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.697043 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.697095 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.697205 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.697219 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.697236 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.697249 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:36Z","lastTransitionTime":"2025-12-03T14:13:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.708261 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.720075 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.742497 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kwchh"] Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.743237 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.744758 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.745148 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.745224 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.745232 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.745349 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.746215 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.746618 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.756352 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.770665 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.781784 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.788531 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-t8q27\" (UID: \"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\") " pod="openshift-multus/multus-additional-cni-plugins-t8q27" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.800676 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.800717 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.800731 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.800752 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.800768 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:36Z","lastTransitionTime":"2025-12-03T14:13:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.813829 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.823205 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-node-log\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.823255 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a5526cae-f2a4-4094-a08a-fbf69cb11593-ovnkube-script-lib\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.823282 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a5526cae-f2a4-4094-a08a-fbf69cb11593-env-overrides\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.823307 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-run-ovn-kubernetes\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.823364 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-etc-openvswitch\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.823391 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.823413 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-run-ovn\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.823431 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-cni-bin\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.823451 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-run-systemd\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.823470 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a5526cae-f2a4-4094-a08a-fbf69cb11593-ovnkube-config\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.823493 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-run-netns\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.823514 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbbql\" (UniqueName: \"kubernetes.io/projected/a5526cae-f2a4-4094-a08a-fbf69cb11593-kube-api-access-gbbql\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.823537 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-kubelet\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.823556 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-log-socket\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: W1203 14:13:36.823567 4751 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a216adb_632d_4134_8c61_61fe6b8c5f71.slice/crio-f830c6256521b1429fa51f07de3de27cc46c71d3f310fdd64aa2548a5190ea74 WatchSource:0}: Error finding container f830c6256521b1429fa51f07de3de27cc46c71d3f310fdd64aa2548a5190ea74: Status 404 returned error can't find the container with id f830c6256521b1429fa51f07de3de27cc46c71d3f310fdd64aa2548a5190ea74 Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.823592 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a5526cae-f2a4-4094-a08a-fbf69cb11593-ovn-node-metrics-cert\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.823741 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-systemd-units\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.823832 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-run-openvswitch\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.823915 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-slash\") pod \"ovnkube-node-kwchh\" (UID: 
\"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.823933 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-var-lib-openvswitch\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.823957 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-cni-netd\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.847802 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\
\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"
name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",
\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.863600 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.887781 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.902693 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.902738 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.902752 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.902768 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.902779 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:36Z","lastTransitionTime":"2025-12-03T14:13:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.904807 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.915498 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.924981 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a5526cae-f2a4-4094-a08a-fbf69cb11593-env-overrides\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.925041 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-run-ovn-kubernetes\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.925065 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-etc-openvswitch\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.925090 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.925114 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-run-ovn\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.925135 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-cni-bin\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.925154 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-run-systemd\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.925177 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a5526cae-f2a4-4094-a08a-fbf69cb11593-ovnkube-config\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.925202 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-run-netns\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.925222 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbbql\" (UniqueName: \"kubernetes.io/projected/a5526cae-f2a4-4094-a08a-fbf69cb11593-kube-api-access-gbbql\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.925245 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-kubelet\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.925265 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-log-socket\") 
pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.925285 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a5526cae-f2a4-4094-a08a-fbf69cb11593-ovn-node-metrics-cert\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.925309 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-systemd-units\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.925361 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-run-openvswitch\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.925405 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-slash\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.925428 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-var-lib-openvswitch\") pod \"ovnkube-node-kwchh\" (UID: 
\"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.925446 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-cni-netd\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.925466 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-node-log\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.925516 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a5526cae-f2a4-4094-a08a-fbf69cb11593-ovnkube-script-lib\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.926152 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-run-openvswitch\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.926220 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-systemd-units\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 
14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.926259 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-run-ovn\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.926212 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-var-lib-openvswitch\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.926276 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-cni-netd\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.926309 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-kubelet\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.926358 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-node-log\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.926366 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-run-systemd\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.926359 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-slash\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.926390 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-cni-bin\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.926401 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-log-socket\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.926419 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.926422 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-etc-openvswitch\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.926410 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-run-netns\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.926455 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-run-ovn-kubernetes\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.926657 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a5526cae-f2a4-4094-a08a-fbf69cb11593-env-overrides\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.926867 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a5526cae-f2a4-4094-a08a-fbf69cb11593-ovnkube-config\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.930558 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a5526cae-f2a4-4094-a08a-fbf69cb11593-ovnkube-script-lib\") pod 
\"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.936128 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a5526cae-f2a4-4094-a08a-fbf69cb11593-ovn-node-metrics-cert\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.942770 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.947901 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbbql\" (UniqueName: \"kubernetes.io/projected/a5526cae-f2a4-4094-a08a-fbf69cb11593-kube-api-access-gbbql\") pod \"ovnkube-node-kwchh\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.974538 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\
"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/
etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:36 crc kubenswrapper[4751]: I1203 14:13:36.987803 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-t8q27" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.005597 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.006685 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.006701 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.006717 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.006729 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.006737 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:37Z","lastTransitionTime":"2025-12-03T14:13:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.021253 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.037941 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.055521 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.110262 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.110350 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.110363 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.110379 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.110393 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:37Z","lastTransitionTime":"2025-12-03T14:13:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.213931 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.214265 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.214278 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.214293 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.214304 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:37Z","lastTransitionTime":"2025-12-03T14:13:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.313911 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.313968 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.313911 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:13:37 crc kubenswrapper[4751]: E1203 14:13:37.314045 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:13:37 crc kubenswrapper[4751]: E1203 14:13:37.314093 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:13:37 crc kubenswrapper[4751]: E1203 14:13:37.314195 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.316478 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.316512 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.316520 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.316533 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.316543 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:37Z","lastTransitionTime":"2025-12-03T14:13:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.419138 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.419178 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.419188 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.419202 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.419212 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:37Z","lastTransitionTime":"2025-12-03T14:13:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.440911 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hmjzc" event={"ID":"ba1da9df-c8bd-41d0-bc00-09688e27984e","Type":"ContainerStarted","Data":"3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b"} Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.440955 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hmjzc" event={"ID":"ba1da9df-c8bd-41d0-bc00-09688e27984e","Type":"ContainerStarted","Data":"448915f5979653308fe77718e057479dc289a34eb644388db6fbdceb935e2b1d"} Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.442829 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" event={"ID":"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8","Type":"ContainerStarted","Data":"6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064"} Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.442880 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" event={"ID":"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8","Type":"ContainerStarted","Data":"e97176fa427527afb9d506042947f03c891342b6a44071ca05da0717b11c3a43"} Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.444566 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-g8hvv" event={"ID":"5059484e-1748-4e33-89c0-3d5e57b97489","Type":"ContainerStarted","Data":"97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96"} Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.446741 4751 generic.go:334] "Generic (PLEG): container finished" podID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerID="a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4" exitCode=0 Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.446878 4751 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" event={"ID":"a5526cae-f2a4-4094-a08a-fbf69cb11593","Type":"ContainerDied","Data":"a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4"} Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.446916 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" event={"ID":"a5526cae-f2a4-4094-a08a-fbf69cb11593","Type":"ContainerStarted","Data":"22db7d10f75c044e4170cf3156b0cec07a600ead0b28caeb9390c7167bbbcfbc"} Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.448971 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerStarted","Data":"3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47"} Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.449006 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerStarted","Data":"0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d"} Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.449019 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerStarted","Data":"09d9b8dc6f4afb311e04bf029fbcbbbf918dfc43df539048ff245504568a6307"} Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.451772 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-98mjq" event={"ID":"6a216adb-632d-4134-8c61-61fe6b8c5f71","Type":"ContainerStarted","Data":"32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815"} Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.451808 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-98mjq" 
event={"ID":"6a216adb-632d-4134-8c61-61fe6b8c5f71","Type":"ContainerStarted","Data":"f830c6256521b1429fa51f07de3de27cc46c71d3f310fdd64aa2548a5190ea74"} Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.460383 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.471594 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.492387 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.507092 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.521134 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.521166 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.521176 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.521188 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.521228 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:37Z","lastTransitionTime":"2025-12-03T14:13:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.522393 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.539131 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.550492 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.562872 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.576103 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.600840 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f584
08f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc4
90701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.613464 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.623484 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.623515 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.623523 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.623537 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.623546 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:37Z","lastTransitionTime":"2025-12-03T14:13:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.629203 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.642576 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.654610 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.676382 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.690381 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.703049 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.715649 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.725134 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.726306 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.726365 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.726378 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.726395 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.726408 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:37Z","lastTransitionTime":"2025-12-03T14:13:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.735593 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.747513 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.781140 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.810881 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.829076 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.829118 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.829127 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.829140 4751 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.829151 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:37Z","lastTransitionTime":"2025-12-03T14:13:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.851199 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.893111 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.931354 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 
14:13:37.931487 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.931561 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.931628 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.931670 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:37Z","lastTransitionTime":"2025-12-03T14:13:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.935872 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\"
:\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:37 crc kubenswrapper[4751]: I1203 14:13:37.972659 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.013167 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:38Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.034086 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.034304 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.034315 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.034346 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.034358 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:38Z","lastTransitionTime":"2025-12-03T14:13:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.128270 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.133030 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.137198 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.137393 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.137471 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.137550 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.137624 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:38Z","lastTransitionTime":"2025-12-03T14:13:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.137682 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.139444 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:38Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.152258 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:38Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.167632 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:38Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.194403 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:38Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.235309 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:38Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.239486 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.239538 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.239547 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.239565 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.239575 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:38Z","lastTransitionTime":"2025-12-03T14:13:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.272476 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:38Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.312240 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-03T14:13:38Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.341683 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.341735 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.341747 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.341764 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.341775 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:38Z","lastTransitionTime":"2025-12-03T14:13:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.358923 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:38Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.399779 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:38Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.431622 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:38Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.444772 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.444850 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.444863 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.444887 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.444902 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:38Z","lastTransitionTime":"2025-12-03T14:13:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.457790 4751 generic.go:334] "Generic (PLEG): container finished" podID="a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8" containerID="6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064" exitCode=0 Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.457866 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" event={"ID":"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8","Type":"ContainerDied","Data":"6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064"} Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.461141 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" event={"ID":"a5526cae-f2a4-4094-a08a-fbf69cb11593","Type":"ContainerStarted","Data":"4b693d6eeadc9bf4af3f115fd8646138f7e25675fb84ae92b97a456e3b5a14cf"} Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.461237 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" event={"ID":"a5526cae-f2a4-4094-a08a-fbf69cb11593","Type":"ContainerStarted","Data":"8336533a450796683f0bdfccc947c18c05e100d13e7e5a076e4dde52ddbb524a"} Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.461306 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" event={"ID":"a5526cae-f2a4-4094-a08a-fbf69cb11593","Type":"ContainerStarted","Data":"77fd3bfbc463e6afbd45d112bf46ad8a39f25cd39ebbc0e17dfcd160ca29d7c4"} Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.461400 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" event={"ID":"a5526cae-f2a4-4094-a08a-fbf69cb11593","Type":"ContainerStarted","Data":"26dadf0f116a114d6755804ab91736019d0d3e047d14515f513eda2148d706d7"} Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.472254 4751 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:38Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:38 crc kubenswrapper[4751]: E1203 14:13:38.491450 4751 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.534581 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f1ac76cc
7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\
\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:38Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.553068 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.553134 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.553145 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.553166 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.553186 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:38Z","lastTransitionTime":"2025-12-03T14:13:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.576447 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:38Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.615886 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runni
ng\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ent
rypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:38Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.654971 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.655011 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.655023 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:38 crc 
kubenswrapper[4751]: I1203 14:13:38.655039 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.655051 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:38Z","lastTransitionTime":"2025-12-03T14:13:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.658558 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:38Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.698894 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:38Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.740563 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:38Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.758155 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.758198 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.758210 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.758228 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.758242 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:38Z","lastTransitionTime":"2025-12-03T14:13:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.773435 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:38Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.815464 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:38Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.856082 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:38Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.861044 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.861089 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.861099 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.861113 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.861124 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:38Z","lastTransitionTime":"2025-12-03T14:13:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.890549 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:38Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.936613 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:38Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.963775 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.963845 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.963853 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.963868 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.963877 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:38Z","lastTransitionTime":"2025-12-03T14:13:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:38 crc kubenswrapper[4751]: I1203 14:13:38.973140 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:38Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.012417 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:39Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.048909 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:13:39 crc kubenswrapper[4751]: E1203 14:13:39.049103 4751 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:13:47.049074221 +0000 UTC m=+34.037429438 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.049262 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.049345 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:13:39 crc kubenswrapper[4751]: E1203 14:13:39.049454 4751 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 14:13:39 crc kubenswrapper[4751]: E1203 14:13:39.049505 4751 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 14:13:39 crc kubenswrapper[4751]: E1203 14:13:39.049544 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 14:13:47.049536493 +0000 UTC m=+34.037891710 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 14:13:39 crc kubenswrapper[4751]: E1203 14:13:39.049559 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 14:13:47.049552614 +0000 UTC m=+34.037907831 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.058047 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerI
D\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0
,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:39Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.066413 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.066481 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.066504 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.066533 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.066557 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:39Z","lastTransitionTime":"2025-12-03T14:13:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.091449 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:39Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.130317 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:39Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.150117 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.150223 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:13:39 crc kubenswrapper[4751]: E1203 14:13:39.150286 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 14:13:39 crc kubenswrapper[4751]: E1203 14:13:39.150318 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 14:13:39 crc kubenswrapper[4751]: E1203 14:13:39.150345 4751 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:13:39 crc kubenswrapper[4751]: E1203 14:13:39.150409 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 14:13:47.150393001 +0000 UTC m=+34.138748218 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:13:39 crc kubenswrapper[4751]: E1203 14:13:39.150409 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 14:13:39 crc kubenswrapper[4751]: E1203 14:13:39.150433 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 14:13:39 crc kubenswrapper[4751]: E1203 14:13:39.150465 4751 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:13:39 crc kubenswrapper[4751]: E1203 14:13:39.150527 4751 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 14:13:47.150508984 +0000 UTC m=+34.138864201 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.168683 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.168721 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.168736 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.168753 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.168766 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:39Z","lastTransitionTime":"2025-12-03T14:13:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.173853 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:39Z 
is after 2025-08-24T17:21:41Z" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.212386 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68bddf32d417f3fc3
96f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a5
78bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:39Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.271533 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.271625 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.271643 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.271668 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.271690 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:39Z","lastTransitionTime":"2025-12-03T14:13:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.314097 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.314175 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.314109 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:13:39 crc kubenswrapper[4751]: E1203 14:13:39.314237 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:13:39 crc kubenswrapper[4751]: E1203 14:13:39.314304 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:13:39 crc kubenswrapper[4751]: E1203 14:13:39.314436 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.374169 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.374202 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.374213 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.374227 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.374238 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:39Z","lastTransitionTime":"2025-12-03T14:13:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.466302 4751 generic.go:334] "Generic (PLEG): container finished" podID="a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8" containerID="3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f" exitCode=0 Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.466347 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" event={"ID":"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8","Type":"ContainerDied","Data":"3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f"} Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.475075 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" event={"ID":"a5526cae-f2a4-4094-a08a-fbf69cb11593","Type":"ContainerStarted","Data":"42ff7aaf5e6a948fcb50e4c896807b977d2c651eef92011b86b5df32f19fa3d5"} Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.475132 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" event={"ID":"a5526cae-f2a4-4094-a08a-fbf69cb11593","Type":"ContainerStarted","Data":"104d3bfa9a37dce20a737bf9d416395e432196051e96def7ae49d341d2912e3c"} Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.478220 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.478261 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.478273 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.478290 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 
14:13:39.478302 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:39Z","lastTransitionTime":"2025-12-03T14:13:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.487065 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:39Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.510107 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-03T14:13:39Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.528720 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:39Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.542074 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:39Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.560836 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:39Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.574628 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:39Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.581973 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.582024 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.582034 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.582054 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.582068 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:39Z","lastTransitionTime":"2025-12-03T14:13:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.588560 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:39Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.603165 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:39Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.615029 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:39Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.628031 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2025-12-03T14:13:39Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.653480 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"}
,{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2025-12-03T14:13:39Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.685028 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.685083 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.685094 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.685116 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.685133 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:39Z","lastTransitionTime":"2025-12-03T14:13:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.692428 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68bddf32d41
7f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:39Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.739050 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:39Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.772138 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:39Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.788420 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.788479 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.788491 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.788511 4751 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.788526 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:39Z","lastTransitionTime":"2025-12-03T14:13:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.812942 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:39Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.891421 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.891492 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.891507 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.891528 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.891544 4751 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:39Z","lastTransitionTime":"2025-12-03T14:13:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.993787 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.993876 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.993900 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.993931 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:39 crc kubenswrapper[4751]: I1203 14:13:39.993950 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:39Z","lastTransitionTime":"2025-12-03T14:13:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.097521 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.097571 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.097586 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.097604 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.097615 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:40Z","lastTransitionTime":"2025-12-03T14:13:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.200110 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.200160 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.200169 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.200187 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.200197 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:40Z","lastTransitionTime":"2025-12-03T14:13:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.301887 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.301936 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.301946 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.301962 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.301972 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:40Z","lastTransitionTime":"2025-12-03T14:13:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.404605 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.404667 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.404688 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.404713 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.404732 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:40Z","lastTransitionTime":"2025-12-03T14:13:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.478744 4751 generic.go:334] "Generic (PLEG): container finished" podID="a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8" containerID="7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873" exitCode=0 Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.478798 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" event={"ID":"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8","Type":"ContainerDied","Data":"7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873"} Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.493970 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba
93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\"
:\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:40Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.507881 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.507922 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.507932 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.507946 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.507956 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:40Z","lastTransitionTime":"2025-12-03T14:13:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.514706 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68bddf32d41
7f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:40Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.534050 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:40Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.549434 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:40Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.563114 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:40Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.574913 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:40Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.588020 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:40Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.600070 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:40Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.610883 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.610923 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.610933 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.610948 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.610958 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:40Z","lastTransitionTime":"2025-12-03T14:13:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.613191 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:40Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.631928 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:40Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.645165 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:40Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.656995 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:40Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.668996 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:40Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.679660 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:40Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.688978 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2025-12-03T14:13:40Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.713288 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.713363 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.713377 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.713396 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.713410 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:40Z","lastTransitionTime":"2025-12-03T14:13:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.816133 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.816183 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.816191 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.816206 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.816216 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:40Z","lastTransitionTime":"2025-12-03T14:13:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.918513 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.918576 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.918596 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.918615 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:40 crc kubenswrapper[4751]: I1203 14:13:40.918627 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:40Z","lastTransitionTime":"2025-12-03T14:13:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.020774 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.020812 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.020821 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.020834 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.020844 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:41Z","lastTransitionTime":"2025-12-03T14:13:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.123558 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.123597 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.123610 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.123626 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.123637 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:41Z","lastTransitionTime":"2025-12-03T14:13:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.226168 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.226219 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.226233 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.226250 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.226265 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:41Z","lastTransitionTime":"2025-12-03T14:13:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.313626 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.313743 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:13:41 crc kubenswrapper[4751]: E1203 14:13:41.313780 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.313887 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:13:41 crc kubenswrapper[4751]: E1203 14:13:41.314032 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:13:41 crc kubenswrapper[4751]: E1203 14:13:41.314145 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.327822 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.327866 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.327876 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.327891 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.327908 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:41Z","lastTransitionTime":"2025-12-03T14:13:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.430309 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.430365 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.430373 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.430387 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.430396 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:41Z","lastTransitionTime":"2025-12-03T14:13:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.489136 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" event={"ID":"a5526cae-f2a4-4094-a08a-fbf69cb11593","Type":"ContainerStarted","Data":"ac293ad67560998b43406e7acd93896d81398996e43a2b7e5df6a4d5794eeb29"} Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.491712 4751 generic.go:334] "Generic (PLEG): container finished" podID="a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8" containerID="916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d" exitCode=0 Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.491740 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" event={"ID":"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8","Type":"ContainerDied","Data":"916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d"} Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.509111 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:41Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.524514 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:41Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.532122 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.532152 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.532160 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.532172 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.532181 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:41Z","lastTransitionTime":"2025-12-03T14:13:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.537152 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:41Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.556634 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:41Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.570793 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0
a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:41Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.592849 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:41Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.609017 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:41Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.621935 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:41Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.635035 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.635073 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.635091 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 
14:13:41.635104 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.635114 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:41Z","lastTransitionTime":"2025-12-03T14:13:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.635194 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:41Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.645495 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:13:41Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.658210 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:41Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.671493 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb981
5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:41Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.684783 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:41Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.709963 4751 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd
1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:41Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.723706 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:41Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.738101 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.738141 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.738152 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.738169 
4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.738181 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:41Z","lastTransitionTime":"2025-12-03T14:13:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.839959 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.840198 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.840257 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.840340 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.840419 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:41Z","lastTransitionTime":"2025-12-03T14:13:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.943225 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.943260 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.943268 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.943283 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:41 crc kubenswrapper[4751]: I1203 14:13:41.943293 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:41Z","lastTransitionTime":"2025-12-03T14:13:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.045780 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.045821 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.045834 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.045848 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.045858 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:42Z","lastTransitionTime":"2025-12-03T14:13:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.148461 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.148508 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.148523 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.148544 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.148556 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:42Z","lastTransitionTime":"2025-12-03T14:13:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.250162 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.250203 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.250214 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.250230 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.250244 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:42Z","lastTransitionTime":"2025-12-03T14:13:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.352766 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.352834 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.352856 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.352886 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.352909 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:42Z","lastTransitionTime":"2025-12-03T14:13:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.455366 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.455407 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.455423 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.455442 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.455454 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:42Z","lastTransitionTime":"2025-12-03T14:13:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.501127 4751 generic.go:334] "Generic (PLEG): container finished" podID="a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8" containerID="392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2" exitCode=0 Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.501178 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" event={"ID":"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8","Type":"ContainerDied","Data":"392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2"} Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.514897 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:42Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.538990 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:42Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.550299 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:42Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.557631 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.557682 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.557693 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.557714 4751 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.557727 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:42Z","lastTransitionTime":"2025-12-03T14:13:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.563301 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:42Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.581480 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:42Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.594012 4751 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:42Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.619708 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:42Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.653825 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:42Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.667461 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.667502 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.667511 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.667527 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.667538 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:42Z","lastTransitionTime":"2025-12-03T14:13:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.769680 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.769720 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.769731 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.769745 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.769755 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:42Z","lastTransitionTime":"2025-12-03T14:13:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.871948 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.871974 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.871982 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.871994 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.872003 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:42Z","lastTransitionTime":"2025-12-03T14:13:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.955096 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:42Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.970784 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:42Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.974794 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.974826 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.974837 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.974851 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.974860 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:42Z","lastTransitionTime":"2025-12-03T14:13:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.986660 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:42Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:42 crc kubenswrapper[4751]: I1203 14:13:42.997702 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:42Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.010357 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.024569 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.044124 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.077819 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.077855 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.077866 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.077883 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.077897 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:43Z","lastTransitionTime":"2025-12-03T14:13:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.180040 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.180084 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.180093 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.180107 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.180116 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:43Z","lastTransitionTime":"2025-12-03T14:13:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.282133 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.282168 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.282179 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.282193 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.282205 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:43Z","lastTransitionTime":"2025-12-03T14:13:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.313486 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:13:43 crc kubenswrapper[4751]: E1203 14:13:43.313648 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.313723 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:13:43 crc kubenswrapper[4751]: E1203 14:13:43.313809 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.314224 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:13:43 crc kubenswrapper[4751]: E1203 14:13:43.314318 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.324851 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.
d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.335958 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.347696 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.358986 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-polic
y-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.376486 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.384046 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.384073 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.384082 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.384095 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.384104 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:43Z","lastTransitionTime":"2025-12-03T14:13:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.392369 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.408290 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.420561 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.436999 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.453404 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.465597 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.486068 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.486687 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.486715 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.486725 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.486739 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.486749 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:43Z","lastTransitionTime":"2025-12-03T14:13:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.501246 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.507171 4751 generic.go:334] "Generic (PLEG): container finished" podID="a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8" containerID="c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63" exitCode=0 Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.507245 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" event={"ID":"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8","Type":"ContainerDied","Data":"c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63"} Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.513615 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" event={"ID":"a5526cae-f2a4-4094-a08a-fbf69cb11593","Type":"ContainerStarted","Data":"97b607a04ff07f1c875428940d49a03ce6b11276ccc6f9bb53dcb670e11b7ddb"} Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.513906 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.515560 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.534418 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.536856 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.548005 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.560676 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.573565 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.587766 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.589864 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.589948 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.589964 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.590012 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.590024 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:43Z","lastTransitionTime":"2025-12-03T14:13:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.598554 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.617050 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8336533a450796683f0bdfccc947c18c05e100d13e7e5a076e4dde52ddbb524a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b693d6eeadc9bf4af3f115fd8646138f7e25675fb84ae92b97a456e3b5a14cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ff7aaf5e6a948fcb50e4c896807b977d2c651eef92011b86b5df32f19fa3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://104d3bfa9a37dce20a737bf9d416395e432196051e96def7ae49d341d2912e3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77fd3bfbc463e6afbd45d112bf46ad8a39f25cd39ebbc0e17dfcd160ca29d7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26dadf0f116a114d6755804ab91736019d0d3e047d14515f513eda2148d706d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b607a04ff07f1c875428940d49a03ce6b11276ccc6f9bb53dcb670e11b7ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac293ad67560998b43406e7acd93896d81398996e43a2b7e5df6a4d5794eeb29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.629783 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.649769 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.661018 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.671263 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.684820 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.692978 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.693030 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.693042 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.693060 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.693071 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:43Z","lastTransitionTime":"2025-12-03T14:13:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.697935 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.713052 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.723401 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.734893 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.745639 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.758307 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.770311 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.781701 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.793417 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.794973 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.794999 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.795007 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.795019 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.795028 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:43Z","lastTransitionTime":"2025-12-03T14:13:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.808302 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.831553 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8336533a450796683f0bdfccc947c18c05e100d13e7e5a076e4dde52ddbb524a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b693d6eeadc9bf4af3f115fd8646138f7e25675fb84ae92b97a456e3b5a14cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ff7aaf5e6a948fcb50e4c896807b977d2c651eef92011b86b5df32f19fa3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://104d3bfa9a37dce20a737bf9d416395e432196051e96def7ae49d341d2912e3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77fd3bfbc463e6afbd45d112bf46ad8a39f25cd39ebbc0e17dfcd160ca29d7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26dadf0f116a114d6755804ab91736019d0d3e047d14515f513eda2148d706d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b607a04ff07f1c875428940d49a03ce6b11276ccc6f9bb53dcb670e11b7ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac293ad67560998b43406e7acd93896d81398996e43a2b7e5df6a4d5794eeb29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.849714 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.866748 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.880962 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.891424 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.898523 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.898553 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.898562 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.898575 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.898586 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:43Z","lastTransitionTime":"2025-12-03T14:13:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.903977 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.918800 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.937717 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491c
d10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:43 crc kubenswrapper[4751]: I1203 14:13:43.957929 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"fini
shedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"na
me\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.002358 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.002415 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.002431 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.002470 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.002486 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:44Z","lastTransitionTime":"2025-12-03T14:13:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.104903 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.104949 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.104967 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.104989 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.105005 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:44Z","lastTransitionTime":"2025-12-03T14:13:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.207478 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.207505 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.207514 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.207525 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.207534 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:44Z","lastTransitionTime":"2025-12-03T14:13:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.309683 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.309745 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.309760 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.309777 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.309789 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:44Z","lastTransitionTime":"2025-12-03T14:13:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.396864 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.396904 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.396916 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.396931 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.396941 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:44Z","lastTransitionTime":"2025-12-03T14:13:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:44 crc kubenswrapper[4751]: E1203 14:13:44.410301 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.413794 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.413831 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.413845 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.413862 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.413872 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:44Z","lastTransitionTime":"2025-12-03T14:13:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:44 crc kubenswrapper[4751]: E1203 14:13:44.426511 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.429941 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.429994 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.430004 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.430022 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.430033 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:44Z","lastTransitionTime":"2025-12-03T14:13:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:44 crc kubenswrapper[4751]: E1203 14:13:44.443799 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.448954 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.448998 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.449007 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.449021 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.449030 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:44Z","lastTransitionTime":"2025-12-03T14:13:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:44 crc kubenswrapper[4751]: E1203 14:13:44.464771 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.468924 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.468959 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.468969 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.468987 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.468997 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:44Z","lastTransitionTime":"2025-12-03T14:13:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:44 crc kubenswrapper[4751]: E1203 14:13:44.479622 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: E1203 14:13:44.479732 4751 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.481595 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.481660 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.481679 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.481744 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.481762 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:44Z","lastTransitionTime":"2025-12-03T14:13:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.524068 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" event={"ID":"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8","Type":"ContainerStarted","Data":"d0746b9b3267a0ad85440e9e392d48037c8522981bb736acffabca9661e0a28a"} Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.524166 4751 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.525373 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.537455 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.549067 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.552705 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0746b9b3267a0ad85440e9e392d48037c8522981bb736acffabca9661e0a28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916dc
5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.565138 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.581744 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.584639 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.584686 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.584696 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.584713 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.584726 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:44Z","lastTransitionTime":"2025-12-03T14:13:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.602102 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.616021 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.632412 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.651377 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.667588 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.688800 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.688843 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.688851 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.688866 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.688875 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:44Z","lastTransitionTime":"2025-12-03T14:13:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.693562 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8336533a450796683f0bdfccc947c18c05e100d13e7e5a076e4dde52ddbb524a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b693d6eeadc9bf4af3f115fd8646138f7e25675fb84ae92b97a456e3b5a14cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ff7aaf5e6a948fcb50e4c896807b977d2c651eef92011b86b5df32f19fa3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://104d3bfa9a37dce20a737bf9d416395e432196051e96def7ae49d341d2912e3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77fd3bfbc463e6afbd45d112bf46ad8a39f25cd39ebbc0e17dfcd160ca29d7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26dadf0f116a114d6755804ab91736019d0d3e047d14515f513eda2148d706d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b607a04ff07f1c875428940d49a03ce6b11276ccc6f9bb53dcb670e11b7ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac293ad67560998b43406e7acd93896d81398996e43a2b7e5df6a4d5794eeb29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.711496 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.743226 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.763019 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.772369 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.784102 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.790976 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.791034 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.791047 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.791068 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.791080 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:44Z","lastTransitionTime":"2025-12-03T14:13:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.796079 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68bddf32d41
7f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.813752 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.824058 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.834678 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.850421 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.867314 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.886924 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0746b9b3267a0ad85440e9e392d48037c8522981bb736acffabca9661e0a28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916dc
5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.893791 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.893823 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.893831 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.893844 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.893853 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:44Z","lastTransitionTime":"2025-12-03T14:13:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.903766 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.916680 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.931882 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.945999 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.960103 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.972519 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.986001 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2025-12-03T14:13:44Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.996297 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.996352 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.996362 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.996393 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:44 crc kubenswrapper[4751]: I1203 14:13:44.996403 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:44Z","lastTransitionTime":"2025-12-03T14:13:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.006591 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8336533a450796683f0bdfccc947c18c05e100d13e7e5a076e4dde52ddbb524a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b693d6eeadc9bf4af3f115fd8646138f7e25675fb84ae92b97a456e3b5a14cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ff7aaf5e6a948fcb50e4c896807b977d2c651eef92011b86b5df32f19fa3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://104d3bfa9a37dce20a737bf9d416395e432196051e96def7ae49d341d2912e3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77fd3bfbc463e6afbd45d112bf46ad8a39f25cd39ebbc0e17dfcd160ca29d7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26dadf0f116a114d6755804ab91736019d0d3e047d14515f513eda2148d706d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b607a04ff07f1c875428940d49a03ce6b11276ccc6f9bb53dcb670e11b7ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac293ad67560998b43406e7acd93896d81398996e43a2b7e5df6a4d5794eeb29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:45Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.099417 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.099474 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.099486 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.099504 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.099515 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:45Z","lastTransitionTime":"2025-12-03T14:13:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.202783 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.202832 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.202846 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.202866 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.202879 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:45Z","lastTransitionTime":"2025-12-03T14:13:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.305707 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.305757 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.305773 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.305795 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.305809 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:45Z","lastTransitionTime":"2025-12-03T14:13:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.313314 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.313368 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.313318 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:13:45 crc kubenswrapper[4751]: E1203 14:13:45.313521 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:13:45 crc kubenswrapper[4751]: E1203 14:13:45.313659 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:13:45 crc kubenswrapper[4751]: E1203 14:13:45.313743 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.408836 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.408938 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.408960 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.409437 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.409735 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:45Z","lastTransitionTime":"2025-12-03T14:13:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.513775 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.513847 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.513857 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.513878 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.513894 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:45Z","lastTransitionTime":"2025-12-03T14:13:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.527881 4751 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.617157 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.617214 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.617228 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.617251 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.617265 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:45Z","lastTransitionTime":"2025-12-03T14:13:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.719884 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.719956 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.719969 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.719985 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.719996 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:45Z","lastTransitionTime":"2025-12-03T14:13:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.822818 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.823094 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.823110 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.823126 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.823136 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:45Z","lastTransitionTime":"2025-12-03T14:13:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.925377 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.925469 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.925479 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.925495 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:45 crc kubenswrapper[4751]: I1203 14:13:45.925506 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:45Z","lastTransitionTime":"2025-12-03T14:13:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.028380 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.028429 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.028444 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.028461 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.028474 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:46Z","lastTransitionTime":"2025-12-03T14:13:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.132057 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.132115 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.132128 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.132151 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.132165 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:46Z","lastTransitionTime":"2025-12-03T14:13:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.235177 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.235221 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.235231 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.235247 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.235260 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:46Z","lastTransitionTime":"2025-12-03T14:13:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.337836 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.337881 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.337895 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.337914 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.337927 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:46Z","lastTransitionTime":"2025-12-03T14:13:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.441065 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.441115 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.441125 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.441141 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.441151 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:46Z","lastTransitionTime":"2025-12-03T14:13:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.534566 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwchh_a5526cae-f2a4-4094-a08a-fbf69cb11593/ovnkube-controller/0.log" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.537213 4751 generic.go:334] "Generic (PLEG): container finished" podID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerID="97b607a04ff07f1c875428940d49a03ce6b11276ccc6f9bb53dcb670e11b7ddb" exitCode=1 Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.537250 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" event={"ID":"a5526cae-f2a4-4094-a08a-fbf69cb11593","Type":"ContainerDied","Data":"97b607a04ff07f1c875428940d49a03ce6b11276ccc6f9bb53dcb670e11b7ddb"} Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.538752 4751 scope.go:117] "RemoveContainer" containerID="97b607a04ff07f1c875428940d49a03ce6b11276ccc6f9bb53dcb670e11b7ddb" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.542636 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.542696 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.542713 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.542734 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.542750 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:46Z","lastTransitionTime":"2025-12-03T14:13:46Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.556192 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\
\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256
:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:46Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.581598 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:46Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.595914 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:46Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.608347 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:46Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.624266 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:46Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.637032 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:46Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.645621 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.645645 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.645653 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.645665 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.645674 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:46Z","lastTransitionTime":"2025-12-03T14:13:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.662411 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0746b9b3267a0ad85440e9e392d48037c8522981bb736acffabca9661e0a28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:39Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:46Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.675893 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:46Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.689409 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:46Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.705564 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:46Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.729520 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:46Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.748673 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.748722 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.748740 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 
14:13:46.748764 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.748778 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:46Z","lastTransitionTime":"2025-12-03T14:13:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.749602 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:46Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.762432 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:13:46Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.774130 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:46Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.792386 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8336533a450796683f0bdfccc947c18c05e100d13e7e5a076e4dde52ddbb524a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b693d6eeadc9bf4af3f115fd8646138f7e25675fb84ae92b97a456e3b5a14cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ff7aaf5e6a948fcb50e4c896807b977d2c651eef92011b86b5df32f19fa3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://104d3bfa9a37dce20a737bf9d416395e432196051e96def7ae49d341d2912e3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77fd3bfbc463e6afbd45d112bf46ad8a39f25cd39ebbc0e17dfcd160ca29d7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26dadf0f116a114d6755804ab91736019d0d3e047d14515f513eda2148d706d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b607a04ff07f1c875428940d49a03ce6b11276ccc6f9bb53dcb670e11b7ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97b607a04ff07f1c875428940d49a03ce6b11276ccc6f9bb53dcb670e11b7ddb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:13:45Z\\\",\\\"message\\\":\\\"4:13:45.698653 6080 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 14:13:45.698658 6080 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 14:13:45.698578 6080 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 14:13:45.698695 6080 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 14:13:45.698727 6080 factory.go:656] Stopping watch factory\\\\nI1203 14:13:45.698755 6080 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 14:13:45.698633 6080 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 14:13:45.698670 6080 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 14:13:45.698828 6080 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 14:13:45.698844 6080 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 14:13:45.699545 6080 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overr
ides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac293ad67560998b43406e7acd93896d81398996e43a2b7e5df6a4d5794eeb29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:46Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.851045 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.851092 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.851104 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.851121 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.851134 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:46Z","lastTransitionTime":"2025-12-03T14:13:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.953767 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.953809 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.953822 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.953840 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.953851 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:46Z","lastTransitionTime":"2025-12-03T14:13:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.960231 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.973399 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:46Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:46 crc kubenswrapper[4751]: I1203 14:13:46.986602 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resour
ces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-opera
tor@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:46Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.015465 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.024991 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.045834 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.056355 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.056390 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.056401 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.056415 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.056425 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:47Z","lastTransitionTime":"2025-12-03T14:13:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.059661 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.079909 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0746b9b3267a0ad85440e9e392d48037c8522981bb736acffabca9661e0a28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916dc
5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.092805 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.103513 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.119703 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8336533a450796683f0bdfccc947c18c05e100d13e7e5a076e4dde52ddbb524a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b693d6eeadc9bf4af3f115fd8646138f7e25675fb84ae92b97a456e3b5a14cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ff7aaf5e6a948fcb50e4c896807b977d2c651eef92011b86b5df32f19fa3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://104d3bfa9a37dce20a737bf9d416395e432196051e96def7ae49d341d2912e3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77fd3bfbc463e6afbd45d112bf46ad8a39f25cd39ebbc0e17dfcd160ca29d7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26dadf0f116a114d6755804ab91736019d0d3e047d14515f513eda2148d706d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b607a04ff07f1c875428940d49a03ce6b11276ccc6f9bb53dcb670e11b7ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97b607a04ff07f1c875428940d49a03ce6b11276ccc6f9bb53dcb670e11b7ddb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:13:45Z\\\",\\\"message\\\":\\\"4:13:45.698653 6080 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 14:13:45.698658 6080 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 14:13:45.698578 6080 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 14:13:45.698695 6080 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 14:13:45.698727 6080 factory.go:656] Stopping watch factory\\\\nI1203 14:13:45.698755 6080 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 14:13:45.698633 6080 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 14:13:45.698670 6080 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 14:13:45.698828 6080 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 14:13:45.698844 6080 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 14:13:45.699545 6080 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overr
ides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac293ad67560998b43406e7acd93896d81398996e43a2b7e5df6a4d5794eeb29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.133061 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96
c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.135239 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.135371 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.135411 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" 
(UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:13:47 crc kubenswrapper[4751]: E1203 14:13:47.135480 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:14:03.135454193 +0000 UTC m=+50.123809410 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:13:47 crc kubenswrapper[4751]: E1203 14:13:47.135507 4751 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 14:13:47 crc kubenswrapper[4751]: E1203 14:13:47.135553 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 14:14:03.135542175 +0000 UTC m=+50.123897392 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 14:13:47 crc kubenswrapper[4751]: E1203 14:13:47.135599 4751 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 14:13:47 crc kubenswrapper[4751]: E1203 14:13:47.135676 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 14:14:03.135657088 +0000 UTC m=+50.124012305 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.144025 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.157089 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.158369 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.158398 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.158409 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.158424 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.158434 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:47Z","lastTransitionTime":"2025-12-03T14:13:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.173036 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.185685 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.235994 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.236035 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:13:47 crc kubenswrapper[4751]: E1203 14:13:47.236153 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 14:13:47 crc kubenswrapper[4751]: E1203 14:13:47.236169 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 14:13:47 crc kubenswrapper[4751]: E1203 14:13:47.236179 4751 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:13:47 crc kubenswrapper[4751]: E1203 14:13:47.236215 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Dec 03 14:13:47 crc kubenswrapper[4751]: E1203 14:13:47.236241 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 14:14:03.236214008 +0000 UTC m=+50.224569225 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:13:47 crc kubenswrapper[4751]: E1203 14:13:47.236247 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 14:13:47 crc kubenswrapper[4751]: E1203 14:13:47.236261 4751 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:13:47 crc kubenswrapper[4751]: E1203 14:13:47.236313 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 14:14:03.23629652 +0000 UTC m=+50.224651737 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.260601 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.260640 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.260652 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.260667 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.260680 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:47Z","lastTransitionTime":"2025-12-03T14:13:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.313114 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.313157 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:13:47 crc kubenswrapper[4751]: E1203 14:13:47.313243 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.313341 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:13:47 crc kubenswrapper[4751]: E1203 14:13:47.313477 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:13:47 crc kubenswrapper[4751]: E1203 14:13:47.313555 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.362642 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.362687 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.362699 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.362716 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.362727 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:47Z","lastTransitionTime":"2025-12-03T14:13:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.465082 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.465111 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.465120 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.465134 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.465144 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:47Z","lastTransitionTime":"2025-12-03T14:13:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.542674 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwchh_a5526cae-f2a4-4094-a08a-fbf69cb11593/ovnkube-controller/0.log" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.545976 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" event={"ID":"a5526cae-f2a4-4094-a08a-fbf69cb11593","Type":"ContainerStarted","Data":"cb4d0295b670ffaed3b769f0cbee08c5d03f697c93475dafaaa987a54eb3dd74"} Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.546074 4751 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.564960 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.567458 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.567508 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.567525 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:47 crc 
kubenswrapper[4751]: I1203 14:13:47.567548 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.567565 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:47Z","lastTransitionTime":"2025-12-03T14:13:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.583997 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 
14:13:47.598820 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.612598 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.624697 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.636720 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.658546 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8336533a450796683f0bdfccc947c18c05e100d13e7e5a076e4dde52ddbb524a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b693d6eeadc9bf4af3f115fd8646138f7e25675fb84ae92b97a456e3b5a14cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ff7aaf5e6a948fcb50e4c896807b977d2c651eef92011b86b5df32f19fa3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://104d3bfa9a37dce20a737bf9d416395e432196051e96def7ae49d341d2912e3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77fd3bfbc463e6afbd45d112bf46ad8a39f25cd39ebbc0e17dfcd160ca29d7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26dadf0f116a114d6755804ab91736019d0d3e047d14515f513eda2148d706d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4d0295b670ffaed3b769f0cbee08c5d03f697c93475dafaaa987a54eb3dd74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97b607a04ff07f1c875428940d49a03ce6b11276ccc6f9bb53dcb670e11b7ddb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:13:45Z\\\",\\\"message\\\":\\\"4:13:45.698653 6080 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 14:13:45.698658 6080 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 14:13:45.698578 6080 handler.go:190] 
Sending *v1.Node event handler 7 for removal\\\\nI1203 14:13:45.698695 6080 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 14:13:45.698727 6080 factory.go:656] Stopping watch factory\\\\nI1203 14:13:45.698755 6080 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 14:13:45.698633 6080 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 14:13:45.698670 6080 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 14:13:45.698828 6080 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 14:13:45.698844 6080 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 14:13:45.699545 6080 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac293ad67560998b43406e7acd93896d81398996e43a2b7e5df6a4d5794eeb29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\
\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.670252 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.670302 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.670314 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.670353 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.670368 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:47Z","lastTransitionTime":"2025-12-03T14:13:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.674673 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.688702 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.711157 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.723848 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.734000 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.748142 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.762061 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0746b9b3267a0ad85440e9e392d48037c8522981bb736acffabca9661e0a28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-12-03T14:13:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfd
f8dcff619e5a05c7dc3ff77e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
25-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.773081 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.773125 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.773137 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.773154 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.773166 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:47Z","lastTransitionTime":"2025-12-03T14:13:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.776004 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.875444 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.875499 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.875514 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.875533 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.875548 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:47Z","lastTransitionTime":"2025-12-03T14:13:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.977822 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.977880 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.977909 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.977933 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:47 crc kubenswrapper[4751]: I1203 14:13:47.977951 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:47Z","lastTransitionTime":"2025-12-03T14:13:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.080228 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.080270 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.080283 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.080299 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.080311 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:48Z","lastTransitionTime":"2025-12-03T14:13:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.183261 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.183294 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.183303 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.183316 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.183337 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:48Z","lastTransitionTime":"2025-12-03T14:13:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.285674 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.285750 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.285763 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.285789 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.285802 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:48Z","lastTransitionTime":"2025-12-03T14:13:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.388187 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.388267 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.388277 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.388293 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.388306 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:48Z","lastTransitionTime":"2025-12-03T14:13:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.490564 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.490606 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.490614 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.490627 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.490638 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:48Z","lastTransitionTime":"2025-12-03T14:13:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.552260 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwchh_a5526cae-f2a4-4094-a08a-fbf69cb11593/ovnkube-controller/1.log" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.553073 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwchh_a5526cae-f2a4-4094-a08a-fbf69cb11593/ovnkube-controller/0.log" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.556383 4751 generic.go:334] "Generic (PLEG): container finished" podID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerID="cb4d0295b670ffaed3b769f0cbee08c5d03f697c93475dafaaa987a54eb3dd74" exitCode=1 Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.556432 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" event={"ID":"a5526cae-f2a4-4094-a08a-fbf69cb11593","Type":"ContainerDied","Data":"cb4d0295b670ffaed3b769f0cbee08c5d03f697c93475dafaaa987a54eb3dd74"} Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.556564 4751 scope.go:117] "RemoveContainer" containerID="97b607a04ff07f1c875428940d49a03ce6b11276ccc6f9bb53dcb670e11b7ddb" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.557489 4751 scope.go:117] "RemoveContainer" containerID="cb4d0295b670ffaed3b769f0cbee08c5d03f697c93475dafaaa987a54eb3dd74" Dec 03 14:13:48 crc kubenswrapper[4751]: E1203 14:13:48.557785 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-kwchh_openshift-ovn-kubernetes(a5526cae-f2a4-4094-a08a-fbf69cb11593)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.577850 4751 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.592832 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.592885 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.592902 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.592925 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.592946 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:48Z","lastTransitionTime":"2025-12-03T14:13:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.618204 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.637190 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:13:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.656730 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.677205 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8336533a450796683f0bdfccc947c18c05e100d13e7e5a076e4dde52ddbb524a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b693d6eeadc9bf4af3f115fd8646138f7e25675fb84ae92b97a456e3b5a14cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ff7aaf5e6a948fcb50e4c896807b977d2c651eef92011b86b5df32f19fa3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://104d3bfa9a37dce20a737bf9d416395e432196051e96def7ae49d341d2912e3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77fd3bfbc463e6afbd45d112bf46ad8a39f25cd39ebbc0e17dfcd160ca29d7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26dadf0f116a114d6755804ab91736019d0d3e047d14515f513eda2148d706d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4d0295b670ffaed3b769f0cbee08c5d03f697c93475dafaaa987a54eb3dd74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97b607a04ff07f1c875428940d49a03ce6b11276ccc6f9bb53dcb670e11b7ddb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:13:45Z\\\",\\\"message\\\":\\\"4:13:45.698653 6080 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 14:13:45.698658 6080 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 14:13:45.698578 6080 handler.go:190] 
Sending *v1.Node event handler 7 for removal\\\\nI1203 14:13:45.698695 6080 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 14:13:45.698727 6080 factory.go:656] Stopping watch factory\\\\nI1203 14:13:45.698755 6080 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 14:13:45.698633 6080 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 14:13:45.698670 6080 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 14:13:45.698828 6080 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 14:13:45.698844 6080 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 14:13:45.699545 6080 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb4d0295b670ffaed3b769f0cbee08c5d03f697c93475dafaaa987a54eb3dd74\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:13:47Z\\\",\\\"message\\\":\\\"ft-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1203 14:13:47.315381 6214 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1203 14:13:47.315387 6214 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1203 14:13:47.315388 6214 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create 
admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z]\\\\nI1203 14:13:47.315399 6214 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-d\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\
\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac293ad67560998b43406e7acd93896d81398996e43a2b7e5df6a4d5794eeb29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.689749 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2775
3fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":
\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9
c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.695626 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.695666 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.695675 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.695688 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.695697 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:48Z","lastTransitionTime":"2025-12-03T14:13:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.698809 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.709777 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.718583 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.726231 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.739865 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.756872 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-polic
y-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.786640 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.798249 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.798312 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.798343 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.798360 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.798371 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:48Z","lastTransitionTime":"2025-12-03T14:13:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.800087 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.818946 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0746b9b3267a0ad85440e9e392d48037c8522981bb736acffabca9661e0a28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916dc
5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.900133 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.900203 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.900234 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.900253 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:48 crc kubenswrapper[4751]: I1203 14:13:48.900266 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:48Z","lastTransitionTime":"2025-12-03T14:13:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.003173 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.003221 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.003232 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.003251 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.003264 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:49Z","lastTransitionTime":"2025-12-03T14:13:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.106459 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.106512 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.106527 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.106547 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.106562 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:49Z","lastTransitionTime":"2025-12-03T14:13:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.243817 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.243870 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.243880 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.243902 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.243915 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:49Z","lastTransitionTime":"2025-12-03T14:13:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.313888 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.313935 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:13:49 crc kubenswrapper[4751]: E1203 14:13:49.314069 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.314148 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:13:49 crc kubenswrapper[4751]: E1203 14:13:49.314294 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:13:49 crc kubenswrapper[4751]: E1203 14:13:49.314569 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.317700 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz"] Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.318038 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.319761 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.320089 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.337079 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0746b9b3267a0ad85440e9e392d48037c8522981bb736acffabca9661e0a28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819
eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8
d7d74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6
b12873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392625
646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.346967 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.347030 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.347052 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.347081 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.347105 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:49Z","lastTransitionTime":"2025-12-03T14:13:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.353070 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dp4lz\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.366418 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.376943 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.387126 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.398126 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.414438 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.424143 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.434668 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2025-12-03T14:13:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.449603 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.449650 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.449666 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.449687 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.449705 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:49Z","lastTransitionTime":"2025-12-03T14:13:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.452021 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8336533a450796683f0bdfccc947c18c05e100d13e7e5a076e4dde52ddbb524a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b693d6eeadc9bf4af3f115fd8646138f7e25675fb84ae92b97a456e3b5a14cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ff7aaf5e6a948fcb50e4c896807b977d2c651eef92011b86b5df32f19fa3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://104d3bfa9a37dce20a737bf9d416395e432196051e96def7ae49d341d2912e3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77fd3bfbc463e6afbd45d112bf46ad8a39f25cd39ebbc0e17dfcd160ca29d7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26dadf0f116a114d6755804ab91736019d0d3e047d14515f513eda2148d706d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4d0295b670ffaed3b769f0cbee08c5d03f697c93475dafaaa987a54eb3dd74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97b607a04ff07f1c875428940d49a03ce6b11276ccc6f9bb53dcb670e11b7ddb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:13:45Z\\\",\\\"message\\\":\\\"4:13:45.698653 6080 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 14:13:45.698658 6080 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 14:13:45.698578 6080 handler.go:190] 
Sending *v1.Node event handler 7 for removal\\\\nI1203 14:13:45.698695 6080 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 14:13:45.698727 6080 factory.go:656] Stopping watch factory\\\\nI1203 14:13:45.698755 6080 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 14:13:45.698633 6080 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 14:13:45.698670 6080 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 14:13:45.698828 6080 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 14:13:45.698844 6080 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 14:13:45.699545 6080 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb4d0295b670ffaed3b769f0cbee08c5d03f697c93475dafaaa987a54eb3dd74\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:13:47Z\\\",\\\"message\\\":\\\"ft-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1203 14:13:47.315381 6214 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1203 14:13:47.315387 6214 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1203 14:13:47.315388 6214 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create 
admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z]\\\\nI1203 14:13:47.315399 6214 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-d\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\
\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac293ad67560998b43406e7acd93896d81398996e43a2b7e5df6a4d5794eeb29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.456273 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/e0557f3d-915e-4c1c-b9a0-de7b3b8dabea-env-overrides\") pod \"ovnkube-control-plane-749d76644c-dp4lz\" (UID: \"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.456314 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e0557f3d-915e-4c1c-b9a0-de7b3b8dabea-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-dp4lz\" (UID: \"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.456373 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e0557f3d-915e-4c1c-b9a0-de7b3b8dabea-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-dp4lz\" (UID: \"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.456413 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrsqb\" (UniqueName: \"kubernetes.io/projected/e0557f3d-915e-4c1c-b9a0-de7b3b8dabea-kube-api-access-jrsqb\") pod \"ovnkube-control-plane-749d76644c-dp4lz\" (UID: \"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.465558 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96
c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.478459 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.495662 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.505902 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.516507 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.536083 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.552137 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.552201 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.552227 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.552258 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.552281 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:49Z","lastTransitionTime":"2025-12-03T14:13:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.557194 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e0557f3d-915e-4c1c-b9a0-de7b3b8dabea-env-overrides\") pod \"ovnkube-control-plane-749d76644c-dp4lz\" (UID: \"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.557418 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e0557f3d-915e-4c1c-b9a0-de7b3b8dabea-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-dp4lz\" (UID: \"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.557520 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e0557f3d-915e-4c1c-b9a0-de7b3b8dabea-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-dp4lz\" (UID: \"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.557695 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrsqb\" (UniqueName: \"kubernetes.io/projected/e0557f3d-915e-4c1c-b9a0-de7b3b8dabea-kube-api-access-jrsqb\") pod \"ovnkube-control-plane-749d76644c-dp4lz\" (UID: \"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.558412 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/e0557f3d-915e-4c1c-b9a0-de7b3b8dabea-env-overrides\") pod \"ovnkube-control-plane-749d76644c-dp4lz\" (UID: \"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.558797 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e0557f3d-915e-4c1c-b9a0-de7b3b8dabea-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-dp4lz\" (UID: \"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.560651 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwchh_a5526cae-f2a4-4094-a08a-fbf69cb11593/ovnkube-controller/1.log" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.565030 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e0557f3d-915e-4c1c-b9a0-de7b3b8dabea-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-dp4lz\" (UID: \"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.565136 4751 scope.go:117] "RemoveContainer" containerID="cb4d0295b670ffaed3b769f0cbee08c5d03f697c93475dafaaa987a54eb3dd74" Dec 03 14:13:49 crc kubenswrapper[4751]: E1203 14:13:49.565517 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-kwchh_openshift-ovn-kubernetes(a5526cae-f2a4-4094-a08a-fbf69cb11593)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" Dec 03 14:13:49 crc 
kubenswrapper[4751]: I1203 14:13:49.578589 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.589708 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrsqb\" (UniqueName: \"kubernetes.io/projected/e0557f3d-915e-4c1c-b9a0-de7b3b8dabea-kube-api-access-jrsqb\") pod \"ovnkube-control-plane-749d76644c-dp4lz\" (UID: \"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.596543 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.612584 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.627829 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.631140 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.644346 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":
true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:49 crc kubenswrapper[4751]: W1203 14:13:49.645192 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0557f3d_915e_4c1c_b9a0_de7b3b8dabea.slice/crio-a5239f61c3ed20bf6d01d63df747525062feacf79d6ee08d73fd1b01c07bfa16 WatchSource:0}: Error finding container a5239f61c3ed20bf6d01d63df747525062feacf79d6ee08d73fd1b01c07bfa16: Status 404 returned error can't find the container with id a5239f61c3ed20bf6d01d63df747525062feacf79d6ee08d73fd1b01c07bfa16 Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.655241 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.655269 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.655276 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.655288 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.655299 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:49Z","lastTransitionTime":"2025-12-03T14:13:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.663036 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-03T14:13:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.685049 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8336533a450796683f0bdfccc947c18c05e100d13e7e5a076e4dde52ddbb524a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b693d6eeadc9bf4af3f115fd8646138f7e25675fb84ae92b97a456e3b5a14cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ff7aaf5e6a948fcb50e4c896807b977d2c651eef92011b86b5df32f19fa3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://104d3bfa9a37dce20a737bf9d416395e432196051e96def7ae49d341d2912e3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77fd3bfbc463e6afbd45d112bf46ad8a39f25cd39ebbc0e17dfcd160ca29d7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26dadf0f116a114d6755804ab91736019d0d3e047d14515f513eda2148d706d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4d0295b670ffaed3b769f0cbee08c5d03f697c93475dafaaa987a54eb3dd74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb4d0295b670ffaed3b769f0cbee08c5d03f697c93475dafaaa987a54eb3dd74\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:13:47Z\\\",\\\"message\\\":\\\"ft-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1203 14:13:47.315381 6214 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1203 14:13:47.315387 6214 default_network_controller.go:776] Recording success event on pod 
openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1203 14:13:47.315388 6214 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z]\\\\nI1203 14:13:47.315399 6214 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-d\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kwchh_openshift-ovn-kubernetes(a5526cae-f2a4-4094-a08a-fbf69cb11593)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac293ad67560998b43406e7acd93896d81398996e43a2b7e5df6a4d5794eeb29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd
6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.700391 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96
c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.712982 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.734887 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.744118 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.753926 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.757565 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.757601 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.757611 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.757633 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.757652 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:49Z","lastTransitionTime":"2025-12-03T14:13:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.766295 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:49Z 
is after 2025-08-24T17:21:41Z" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.779701 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0746b9b3267a0ad85440e9e392d48037c8522981bb736acffabca9661e0a28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.789723 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dp4lz\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.800798 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.859891 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.859921 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.859931 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.859946 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.859955 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:49Z","lastTransitionTime":"2025-12-03T14:13:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.962629 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.962672 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.962682 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.962696 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:49 crc kubenswrapper[4751]: I1203 14:13:49.962707 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:49Z","lastTransitionTime":"2025-12-03T14:13:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.062006 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-zgqdp"] Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.062712 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:13:50 crc kubenswrapper[4751]: E1203 14:13:50.062801 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.065008 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.065074 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.065089 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.065109 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.065122 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:50Z","lastTransitionTime":"2025-12-03T14:13:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.072785 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.086783 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8f
c26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.099783 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.114026 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.135991 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.149897 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zgqdp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fb8744-4cb9-4138-8310-c02f7c6a2941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zgqdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc 
kubenswrapper[4751]: I1203 14:13:50.162542 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.162631 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgzgd\" (UniqueName: \"kubernetes.io/projected/45fb8744-4cb9-4138-8310-c02f7c6a2941-kube-api-access-sgzgd\") pod \"network-metrics-daemon-zgqdp\" (UID: \"45fb8744-4cb9-4138-8310-c02f7c6a2941\") " pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.162673 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45fb8744-4cb9-4138-8310-c02f7c6a2941-metrics-certs\") pod \"network-metrics-daemon-zgqdp\" (UID: \"45fb8744-4cb9-4138-8310-c02f7c6a2941\") " pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.167860 4751 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.167897 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.167907 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.167922 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.167935 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:50Z","lastTransitionTime":"2025-12-03T14:13:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.178161 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0746b9b3267a0ad85440e9e392d48037c8522981bb736acffabca9661e0a28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.192420 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dp4lz\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.205653 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.219839 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.231770 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.246355 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.263441 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgzgd\" (UniqueName: 
\"kubernetes.io/projected/45fb8744-4cb9-4138-8310-c02f7c6a2941-kube-api-access-sgzgd\") pod \"network-metrics-daemon-zgqdp\" (UID: \"45fb8744-4cb9-4138-8310-c02f7c6a2941\") " pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.263496 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45fb8744-4cb9-4138-8310-c02f7c6a2941-metrics-certs\") pod \"network-metrics-daemon-zgqdp\" (UID: \"45fb8744-4cb9-4138-8310-c02f7c6a2941\") " pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:13:50 crc kubenswrapper[4751]: E1203 14:13:50.263631 4751 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 14:13:50 crc kubenswrapper[4751]: E1203 14:13:50.263690 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45fb8744-4cb9-4138-8310-c02f7c6a2941-metrics-certs podName:45fb8744-4cb9-4138-8310-c02f7c6a2941 nodeName:}" failed. No retries permitted until 2025-12-03 14:13:50.76367493 +0000 UTC m=+37.752030137 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/45fb8744-4cb9-4138-8310-c02f7c6a2941-metrics-certs") pod "network-metrics-daemon-zgqdp" (UID: "45fb8744-4cb9-4138-8310-c02f7c6a2941") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.270903 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.270945 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.270957 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.270989 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.271000 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:50Z","lastTransitionTime":"2025-12-03T14:13:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.276128 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8336533a450796683f0bdfccc947c18c05e100d13e7e5a076e4dde52ddbb524a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b693d6eeadc9bf4af3f115fd8646138f7e25675fb84ae92b97a456e3b5a14cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ff7aaf5e6a948fcb50e4c896807b977d2c651eef92011b86b5df32f19fa3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://104d3bfa9a37dce20a737bf9d416395e432196051e96def7ae49d341d2912e3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77fd3bfbc463e6afbd45d112bf46ad8a39f25cd39ebbc0e17dfcd160ca29d7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26dadf0f116a114d6755804ab91736019d0d3e047d14515f513eda2148d706d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4d0295b670ffaed3b769f0cbee08c5d03f697c93475dafaaa987a54eb3dd74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb4d0295b670ffaed3b769f0cbee08c5d03f697c93475dafaaa987a54eb3dd74\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:13:47Z\\\",\\\"message\\\":\\\"ft-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1203 14:13:47.315381 6214 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1203 14:13:47.315387 6214 default_network_controller.go:776] Recording success event on pod 
openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1203 14:13:47.315388 6214 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z]\\\\nI1203 14:13:47.315399 6214 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-d\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kwchh_openshift-ovn-kubernetes(a5526cae-f2a4-4094-a08a-fbf69cb11593)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac293ad67560998b43406e7acd93896d81398996e43a2b7e5df6a4d5794eeb29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd
6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.289383 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgzgd\" (UniqueName: \"kubernetes.io/projected/45fb8744-4cb9-4138-8310-c02f7c6a2941-kube-api-access-sgzgd\") pod \"network-metrics-daemon-zgqdp\" (UID: \"45fb8744-4cb9-4138-8310-c02f7c6a2941\") " pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.290932 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96
c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.305903 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.321382 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.373932 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.373969 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.373978 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.373993 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.374006 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:50Z","lastTransitionTime":"2025-12-03T14:13:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.476762 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.476820 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.476836 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.476897 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.476915 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:50Z","lastTransitionTime":"2025-12-03T14:13:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.568794 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" event={"ID":"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea","Type":"ContainerStarted","Data":"cf4cfa428c0e9f95347500b09c84e541ed66924885a56a15ed20612fbf2117bc"} Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.568869 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" event={"ID":"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea","Type":"ContainerStarted","Data":"243f7a2a090c20b5f43fd09961193c68689cbdf90a21f23683146f39a61a6fe7"} Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.568884 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" event={"ID":"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea","Type":"ContainerStarted","Data":"a5239f61c3ed20bf6d01d63df747525062feacf79d6ee08d73fd1b01c07bfa16"} Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.579229 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.579288 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.579302 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.579320 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.579355 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:50Z","lastTransitionTime":"2025-12-03T14:13:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.584632 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.607085 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.629688 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8336533a450796683f0bdfccc947c18c05e100d13e7e5a076e4dde52ddbb524a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b693d6eeadc9bf4af3f115fd8646138f7e25675fb84ae92b97a456e3b5a14cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ff7aaf5e6a948fcb50e4c896807b977d2c651eef92011b86b5df32f19fa3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://104d3bfa9a37dce20a737bf9d416395e432196051e96def7ae49d341d2912e3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77fd3bfbc463e6afbd45d112bf46ad8a39f25cd39ebbc0e17dfcd160ca29d7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26dadf0f116a114d6755804ab91736019d0d3e047d14515f513eda2148d706d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4d0295b670ffaed3b769f0cbee08c5d03f697c93475dafaaa987a54eb3dd74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb4d0295b670ffaed3b769f0cbee08c5d03f697c93475dafaaa987a54eb3dd74\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:13:47Z\\\",\\\"message\\\":\\\"ft-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1203 14:13:47.315381 6214 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1203 14:13:47.315387 6214 default_network_controller.go:776] Recording success event on pod 
openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1203 14:13:47.315388 6214 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z]\\\\nI1203 14:13:47.315399 6214 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-d\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kwchh_openshift-ovn-kubernetes(a5526cae-f2a4-4094-a08a-fbf69cb11593)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac293ad67560998b43406e7acd93896d81398996e43a2b7e5df6a4d5794eeb29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd
6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.647929 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96
c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.669859 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.682374 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.682434 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.682451 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 
14:13:50.682473 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.682489 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:50Z","lastTransitionTime":"2025-12-03T14:13:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.691031 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.706678 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.721689 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.736489 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.749452 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.767208 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45fb8744-4cb9-4138-8310-c02f7c6a2941-metrics-certs\") pod \"network-metrics-daemon-zgqdp\" (UID: \"45fb8744-4cb9-4138-8310-c02f7c6a2941\") " pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:13:50 crc kubenswrapper[4751]: E1203 14:13:50.767441 4751 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 14:13:50 crc kubenswrapper[4751]: E1203 14:13:50.767561 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45fb8744-4cb9-4138-8310-c02f7c6a2941-metrics-certs podName:45fb8744-4cb9-4138-8310-c02f7c6a2941 nodeName:}" failed. No retries permitted until 2025-12-03 14:13:51.767537657 +0000 UTC m=+38.755892884 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/45fb8744-4cb9-4138-8310-c02f7c6a2941-metrics-certs") pod "network-metrics-daemon-zgqdp" (UID: "45fb8744-4cb9-4138-8310-c02f7c6a2941") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.770100 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernete
s/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869
469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\
\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.784052 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.785896 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.785974 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.785998 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.786029 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.786049 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:50Z","lastTransitionTime":"2025-12-03T14:13:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.798678 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.810981 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.828688 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0746b9b3267a0ad85440e9e392d48037c8522981bb736acffabca9661e0a28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916dc
5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.839978 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://243f7a2a090c20b5f43fd09961193c68689cbdf90a21f23683146f39a61a6fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4cfa428c0e9f95347500b09c84e541ed66924885a56a15ed20612fbf2117bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dp4lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.849820 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zgqdp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fb8744-4cb9-4138-8310-c02f7c6a2941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zgqdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:50 crc 
kubenswrapper[4751]: I1203 14:13:50.888642 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.888694 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.888710 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.888727 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.888738 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:50Z","lastTransitionTime":"2025-12-03T14:13:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.991428 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.991476 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.991485 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.991499 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:50 crc kubenswrapper[4751]: I1203 14:13:50.991510 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:50Z","lastTransitionTime":"2025-12-03T14:13:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.093965 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.094012 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.094023 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.094038 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.094049 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:51Z","lastTransitionTime":"2025-12-03T14:13:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.196235 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.196283 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.196292 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.196310 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.196343 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:51Z","lastTransitionTime":"2025-12-03T14:13:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.298889 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.298932 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.298942 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.298957 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.298968 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:51Z","lastTransitionTime":"2025-12-03T14:13:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.313397 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.313426 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.313437 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.313462 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:13:51 crc kubenswrapper[4751]: E1203 14:13:51.313515 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:13:51 crc kubenswrapper[4751]: E1203 14:13:51.313624 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:13:51 crc kubenswrapper[4751]: E1203 14:13:51.313712 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:13:51 crc kubenswrapper[4751]: E1203 14:13:51.313775 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.400756 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.400818 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.400834 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.400860 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.400879 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:51Z","lastTransitionTime":"2025-12-03T14:13:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.502758 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.502789 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.502798 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.502811 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.502821 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:51Z","lastTransitionTime":"2025-12-03T14:13:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.605686 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.605729 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.605740 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.605754 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.605765 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:51Z","lastTransitionTime":"2025-12-03T14:13:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.708258 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.708289 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.708298 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.708312 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.708363 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:51Z","lastTransitionTime":"2025-12-03T14:13:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.777006 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45fb8744-4cb9-4138-8310-c02f7c6a2941-metrics-certs\") pod \"network-metrics-daemon-zgqdp\" (UID: \"45fb8744-4cb9-4138-8310-c02f7c6a2941\") " pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:13:51 crc kubenswrapper[4751]: E1203 14:13:51.777204 4751 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 14:13:51 crc kubenswrapper[4751]: E1203 14:13:51.777274 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45fb8744-4cb9-4138-8310-c02f7c6a2941-metrics-certs podName:45fb8744-4cb9-4138-8310-c02f7c6a2941 nodeName:}" failed. No retries permitted until 2025-12-03 14:13:53.777257212 +0000 UTC m=+40.765612429 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/45fb8744-4cb9-4138-8310-c02f7c6a2941-metrics-certs") pod "network-metrics-daemon-zgqdp" (UID: "45fb8744-4cb9-4138-8310-c02f7c6a2941") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.815978 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.816500 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.816586 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.816668 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.816733 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:51Z","lastTransitionTime":"2025-12-03T14:13:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.919116 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.919661 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.919782 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.919884 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:51 crc kubenswrapper[4751]: I1203 14:13:51.919996 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:51Z","lastTransitionTime":"2025-12-03T14:13:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.023992 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.024035 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.024045 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.024062 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.024074 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:52Z","lastTransitionTime":"2025-12-03T14:13:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.126508 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.126940 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.127033 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.127112 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.127178 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:52Z","lastTransitionTime":"2025-12-03T14:13:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.229881 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.230178 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.230241 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.230313 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.230410 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:52Z","lastTransitionTime":"2025-12-03T14:13:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.332868 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.332902 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.332913 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.332928 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.332937 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:52Z","lastTransitionTime":"2025-12-03T14:13:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.435481 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.435526 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.435538 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.435553 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.435564 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:52Z","lastTransitionTime":"2025-12-03T14:13:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.537250 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.537300 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.537309 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.537347 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.537362 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:52Z","lastTransitionTime":"2025-12-03T14:13:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.639420 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.639456 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.639467 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.639482 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.639493 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:52Z","lastTransitionTime":"2025-12-03T14:13:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.741414 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.741459 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.741470 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.741490 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.741502 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:52Z","lastTransitionTime":"2025-12-03T14:13:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.843936 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.843981 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.843994 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.844011 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.844023 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:52Z","lastTransitionTime":"2025-12-03T14:13:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.946183 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.946236 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.946248 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.946266 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:52 crc kubenswrapper[4751]: I1203 14:13:52.946279 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:52Z","lastTransitionTime":"2025-12-03T14:13:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.048753 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.048785 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.048793 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.048806 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.048816 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:53Z","lastTransitionTime":"2025-12-03T14:13:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.150662 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.150705 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.150719 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.150735 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.150745 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:53Z","lastTransitionTime":"2025-12-03T14:13:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.252504 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.252544 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.252554 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.252568 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.252578 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:53Z","lastTransitionTime":"2025-12-03T14:13:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.312998 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.313006 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.313058 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.313071 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:13:53 crc kubenswrapper[4751]: E1203 14:13:53.313638 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:13:53 crc kubenswrapper[4751]: E1203 14:13:53.313662 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:13:53 crc kubenswrapper[4751]: E1203 14:13:53.313675 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:13:53 crc kubenswrapper[4751]: E1203 14:13:53.313681 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.331917 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8336533a450796683f0bdfccc947c18c05e100d13e7e5a076e4dde52ddbb524a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b693d6eeadc9bf4af3f115fd8646138f7e25675fb84ae92b97a456e3b5a14cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ff7aaf5e6a948fcb50e4c896807b977d2c651eef92011b86b5df32f19fa3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://104d3bfa9a37dce20a737bf9d416395e432196051e96def7ae49d341d2912e3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77fd3bfbc463e6afbd45d112bf46ad8a39f25cd39ebbc0e17dfcd160ca29d7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26dadf0f116a114d6755804ab91736019d0d3e047d14515f513eda2148d706d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4d0295b670ffaed3b769f0cbee08c5d03f697c93475dafaaa987a54eb3dd74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb4d0295b670ffaed3b769f0cbee08c5d03f697c93475dafaaa987a54eb3dd74\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:13:47Z\\\",\\\"message\\\":\\\"ft-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1203 14:13:47.315381 6214 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1203 14:13:47.315387 6214 default_network_controller.go:776] Recording success event on pod 
openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1203 14:13:47.315388 6214 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z]\\\\nI1203 14:13:47.315399 6214 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-d\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kwchh_openshift-ovn-kubernetes(a5526cae-f2a4-4094-a08a-fbf69cb11593)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac293ad67560998b43406e7acd93896d81398996e43a2b7e5df6a4d5794eeb29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd
6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.344937 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96
c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.354555 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.354588 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.354599 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.354614 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.354664 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:53Z","lastTransitionTime":"2025-12-03T14:13:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.358994 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.373560 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.385808 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.397032 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2025-12-03T14:13:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.413577 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"}
,{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2025-12-03T14:13:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.426807 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"
containerID\\\":\\\"cri-o://68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube
-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.456716 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.456751 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.456760 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.456773 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.456783 4751 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:53Z","lastTransitionTime":"2025-12-03T14:13:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.458214 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf49235
2e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.471855 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.492895 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.507755 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.524739 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0746b9b3267a0ad85440e9e392d48037c8522981bb736acffabca9661e0a28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916dc
5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.537532 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://243f7a2a090c20b5f43fd09961193c68689cbdf90a21f23683146f39a61a6fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4cfa428c0e9f95347500b09c84e541ed66924885a56a15ed20612fbf2117bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dp4lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-03T14:13:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.551472 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zgqdp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fb8744-4cb9-4138-8310-c02f7c6a2941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zgqdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:53 crc 
kubenswrapper[4751]: I1203 14:13:53.559659 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.559707 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.559718 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.559738 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.559753 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:53Z","lastTransitionTime":"2025-12-03T14:13:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.566273 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.586985 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.663428 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.663912 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.664083 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.664208 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.664308 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:53Z","lastTransitionTime":"2025-12-03T14:13:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.766855 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.766916 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.766928 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.766951 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.766967 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:53Z","lastTransitionTime":"2025-12-03T14:13:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.797749 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45fb8744-4cb9-4138-8310-c02f7c6a2941-metrics-certs\") pod \"network-metrics-daemon-zgqdp\" (UID: \"45fb8744-4cb9-4138-8310-c02f7c6a2941\") " pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:13:53 crc kubenswrapper[4751]: E1203 14:13:53.797976 4751 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 14:13:53 crc kubenswrapper[4751]: E1203 14:13:53.798113 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45fb8744-4cb9-4138-8310-c02f7c6a2941-metrics-certs podName:45fb8744-4cb9-4138-8310-c02f7c6a2941 nodeName:}" failed. No retries permitted until 2025-12-03 14:13:57.79807354 +0000 UTC m=+44.786428827 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/45fb8744-4cb9-4138-8310-c02f7c6a2941-metrics-certs") pod "network-metrics-daemon-zgqdp" (UID: "45fb8744-4cb9-4138-8310-c02f7c6a2941") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.870414 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.870499 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.870523 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.870556 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.870580 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:53Z","lastTransitionTime":"2025-12-03T14:13:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.973511 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.973580 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.973595 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.973618 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:53 crc kubenswrapper[4751]: I1203 14:13:53.973633 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:53Z","lastTransitionTime":"2025-12-03T14:13:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.076784 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.076861 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.076884 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.076914 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.076939 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:54Z","lastTransitionTime":"2025-12-03T14:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.180362 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.180412 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.180433 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.180452 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.180464 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:54Z","lastTransitionTime":"2025-12-03T14:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.283827 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.283948 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.283966 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.283987 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.284007 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:54Z","lastTransitionTime":"2025-12-03T14:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.386925 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.386985 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.387001 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.387023 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.387043 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:54Z","lastTransitionTime":"2025-12-03T14:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.489809 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.489866 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.489885 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.489907 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.489926 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:54Z","lastTransitionTime":"2025-12-03T14:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.509140 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.509187 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.509204 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.509225 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.509242 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:54Z","lastTransitionTime":"2025-12-03T14:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:54 crc kubenswrapper[4751]: E1203 14:13:54.533077 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.538283 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.538380 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.538418 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.538450 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.538471 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:54Z","lastTransitionTime":"2025-12-03T14:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:54 crc kubenswrapper[4751]: E1203 14:13:54.557817 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.562883 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.562943 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.562959 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.562987 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.563005 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:54Z","lastTransitionTime":"2025-12-03T14:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:54 crc kubenswrapper[4751]: E1203 14:13:54.587610 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.592772 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.592876 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.592894 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.592920 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.592936 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:54Z","lastTransitionTime":"2025-12-03T14:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:54 crc kubenswrapper[4751]: E1203 14:13:54.610233 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.614565 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.614616 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.614626 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.614647 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.614660 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:54Z","lastTransitionTime":"2025-12-03T14:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:54 crc kubenswrapper[4751]: E1203 14:13:54.630392 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:54Z is after 2025-08-24T17:21:41Z" Dec 03 14:13:54 crc kubenswrapper[4751]: E1203 14:13:54.630587 4751 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.632686 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.632759 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.632777 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.632803 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.632823 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:54Z","lastTransitionTime":"2025-12-03T14:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.736137 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.736191 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.736206 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.736225 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.736238 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:54Z","lastTransitionTime":"2025-12-03T14:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.839554 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.839641 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.839660 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.839688 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.839713 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:54Z","lastTransitionTime":"2025-12-03T14:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.942709 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.942777 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.942794 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.942820 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:54 crc kubenswrapper[4751]: I1203 14:13:54.942839 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:54Z","lastTransitionTime":"2025-12-03T14:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.045410 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.045487 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.045511 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.045541 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.045563 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:55Z","lastTransitionTime":"2025-12-03T14:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.147748 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.147799 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.147811 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.147829 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.147841 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:55Z","lastTransitionTime":"2025-12-03T14:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.251097 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.251158 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.251169 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.251187 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.251201 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:55Z","lastTransitionTime":"2025-12-03T14:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.313491 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:13:55 crc kubenswrapper[4751]: E1203 14:13:55.313646 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.313823 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.313823 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:13:55 crc kubenswrapper[4751]: E1203 14:13:55.314116 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.313843 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:13:55 crc kubenswrapper[4751]: E1203 14:13:55.314303 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:13:55 crc kubenswrapper[4751]: E1203 14:13:55.314370 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.354833 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.354882 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.354896 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.354914 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.354924 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:55Z","lastTransitionTime":"2025-12-03T14:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.458024 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.458076 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.458087 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.458103 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.458114 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:55Z","lastTransitionTime":"2025-12-03T14:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.561913 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.561977 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.561999 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.562025 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.562042 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:55Z","lastTransitionTime":"2025-12-03T14:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.665591 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.665669 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.665691 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.665720 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.665742 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:55Z","lastTransitionTime":"2025-12-03T14:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.768667 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.768722 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.768739 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.768767 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.768784 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:55Z","lastTransitionTime":"2025-12-03T14:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.871522 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.871582 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.871598 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.871622 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.871640 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:55Z","lastTransitionTime":"2025-12-03T14:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.973905 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.973938 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.973948 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.973964 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:55 crc kubenswrapper[4751]: I1203 14:13:55.973974 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:55Z","lastTransitionTime":"2025-12-03T14:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.075853 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.075898 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.075909 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.075924 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.075934 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:56Z","lastTransitionTime":"2025-12-03T14:13:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.179155 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.179210 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.179229 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.179252 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.179269 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:56Z","lastTransitionTime":"2025-12-03T14:13:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.282340 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.282390 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.282401 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.282419 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.282430 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:56Z","lastTransitionTime":"2025-12-03T14:13:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.384367 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.384417 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.384433 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.384450 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.384461 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:56Z","lastTransitionTime":"2025-12-03T14:13:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.487113 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.487602 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.487811 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.488075 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.488266 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:56Z","lastTransitionTime":"2025-12-03T14:13:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.591258 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.591320 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.591373 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.591397 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.591414 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:56Z","lastTransitionTime":"2025-12-03T14:13:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.694075 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.694147 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.694159 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.694182 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.694196 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:56Z","lastTransitionTime":"2025-12-03T14:13:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.797250 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.797320 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.797377 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.797402 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.797422 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:56Z","lastTransitionTime":"2025-12-03T14:13:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.900705 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.900765 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.900782 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.900804 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:56 crc kubenswrapper[4751]: I1203 14:13:56.900821 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:56Z","lastTransitionTime":"2025-12-03T14:13:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.004185 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.004245 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.004262 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.004285 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.004304 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:57Z","lastTransitionTime":"2025-12-03T14:13:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.107167 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.107250 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.107272 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.107300 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.107318 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:57Z","lastTransitionTime":"2025-12-03T14:13:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.209752 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.210052 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.210063 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.210075 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.210084 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:57Z","lastTransitionTime":"2025-12-03T14:13:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.312030 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.312060 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.312068 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.312082 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.312091 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:57Z","lastTransitionTime":"2025-12-03T14:13:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.313257 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.313277 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:13:57 crc kubenswrapper[4751]: E1203 14:13:57.313350 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.313434 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.313466 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:13:57 crc kubenswrapper[4751]: E1203 14:13:57.313536 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:13:57 crc kubenswrapper[4751]: E1203 14:13:57.313677 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:13:57 crc kubenswrapper[4751]: E1203 14:13:57.313726 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.414987 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.415257 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.415374 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.415483 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.415569 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:57Z","lastTransitionTime":"2025-12-03T14:13:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.518667 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.518732 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.518746 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.518761 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.518776 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:57Z","lastTransitionTime":"2025-12-03T14:13:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.621914 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.621981 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.622001 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.622026 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.622049 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:57Z","lastTransitionTime":"2025-12-03T14:13:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.725816 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.725887 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.725900 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.725924 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.725948 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:57Z","lastTransitionTime":"2025-12-03T14:13:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.829816 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.829901 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.829955 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.829990 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.830010 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:57Z","lastTransitionTime":"2025-12-03T14:13:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.839516 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45fb8744-4cb9-4138-8310-c02f7c6a2941-metrics-certs\") pod \"network-metrics-daemon-zgqdp\" (UID: \"45fb8744-4cb9-4138-8310-c02f7c6a2941\") " pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:13:57 crc kubenswrapper[4751]: E1203 14:13:57.839728 4751 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 14:13:57 crc kubenswrapper[4751]: E1203 14:13:57.839866 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45fb8744-4cb9-4138-8310-c02f7c6a2941-metrics-certs podName:45fb8744-4cb9-4138-8310-c02f7c6a2941 nodeName:}" failed. No retries permitted until 2025-12-03 14:14:05.839840618 +0000 UTC m=+52.828195835 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/45fb8744-4cb9-4138-8310-c02f7c6a2941-metrics-certs") pod "network-metrics-daemon-zgqdp" (UID: "45fb8744-4cb9-4138-8310-c02f7c6a2941") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.933801 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.933862 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.933872 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.933890 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:57 crc kubenswrapper[4751]: I1203 14:13:57.933903 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:57Z","lastTransitionTime":"2025-12-03T14:13:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.036840 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.036956 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.036973 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.036998 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.037014 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:58Z","lastTransitionTime":"2025-12-03T14:13:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.139848 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.139923 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.139938 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.139963 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.139977 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:58Z","lastTransitionTime":"2025-12-03T14:13:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.242827 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.242889 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.242909 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.242979 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.242999 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:58Z","lastTransitionTime":"2025-12-03T14:13:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.349533 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.349628 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.349647 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.349676 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.349696 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:58Z","lastTransitionTime":"2025-12-03T14:13:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.452049 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.452106 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.452121 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.452143 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.452161 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:58Z","lastTransitionTime":"2025-12-03T14:13:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.554898 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.554940 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.554952 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.554970 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.554979 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:58Z","lastTransitionTime":"2025-12-03T14:13:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.657775 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.658137 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.658352 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.658636 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.658765 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:58Z","lastTransitionTime":"2025-12-03T14:13:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.761870 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.761908 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.761918 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.761935 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.761982 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:58Z","lastTransitionTime":"2025-12-03T14:13:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.865231 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.865271 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.865282 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.865298 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.865309 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:58Z","lastTransitionTime":"2025-12-03T14:13:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.969463 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.969553 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.969580 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.969612 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:58 crc kubenswrapper[4751]: I1203 14:13:58.969647 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:58Z","lastTransitionTime":"2025-12-03T14:13:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.073313 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.073366 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.073375 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.073392 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.073403 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:59Z","lastTransitionTime":"2025-12-03T14:13:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.175699 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.175740 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.175753 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.175768 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.175779 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:59Z","lastTransitionTime":"2025-12-03T14:13:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.278386 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.278445 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.278461 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.278485 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.278502 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:59Z","lastTransitionTime":"2025-12-03T14:13:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.314153 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.314240 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.314183 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:13:59 crc kubenswrapper[4751]: E1203 14:13:59.314410 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:13:59 crc kubenswrapper[4751]: E1203 14:13:59.314536 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:13:59 crc kubenswrapper[4751]: E1203 14:13:59.314705 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.314812 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:13:59 crc kubenswrapper[4751]: E1203 14:13:59.314897 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.381367 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.381419 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.381428 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.381443 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.381454 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:59Z","lastTransitionTime":"2025-12-03T14:13:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.483734 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.483774 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.483783 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.483801 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.483812 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:59Z","lastTransitionTime":"2025-12-03T14:13:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.586156 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.586211 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.586227 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.586246 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.586258 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:59Z","lastTransitionTime":"2025-12-03T14:13:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.687888 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.687944 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.687953 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.687967 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.687976 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:59Z","lastTransitionTime":"2025-12-03T14:13:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.790915 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.790971 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.790991 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.791013 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.791030 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:59Z","lastTransitionTime":"2025-12-03T14:13:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.894180 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.894224 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.894235 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.894250 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.894261 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:59Z","lastTransitionTime":"2025-12-03T14:13:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.996501 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.996568 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.996581 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.996597 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:13:59 crc kubenswrapper[4751]: I1203 14:13:59.996609 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:13:59Z","lastTransitionTime":"2025-12-03T14:13:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.099140 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.099179 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.099189 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.099203 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.099215 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:00Z","lastTransitionTime":"2025-12-03T14:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.201180 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.201229 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.201242 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.201259 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.201271 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:00Z","lastTransitionTime":"2025-12-03T14:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.303566 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.303693 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.303715 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.303737 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.303754 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:00Z","lastTransitionTime":"2025-12-03T14:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.406074 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.406107 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.406115 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.406127 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.406156 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:00Z","lastTransitionTime":"2025-12-03T14:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.508580 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.508647 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.508662 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.508680 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.508691 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:00Z","lastTransitionTime":"2025-12-03T14:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.610977 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.611030 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.611042 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.611058 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.611073 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:00Z","lastTransitionTime":"2025-12-03T14:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.713623 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.713681 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.713699 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.713721 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.713738 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:00Z","lastTransitionTime":"2025-12-03T14:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.816154 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.816245 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.816264 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.816289 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.816306 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:00Z","lastTransitionTime":"2025-12-03T14:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.918724 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.918762 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.918773 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.918790 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:00 crc kubenswrapper[4751]: I1203 14:14:00.918802 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:00Z","lastTransitionTime":"2025-12-03T14:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.021259 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.021299 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.021310 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.021346 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.021357 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:01Z","lastTransitionTime":"2025-12-03T14:14:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.123824 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.123864 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.123876 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.123892 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.123904 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:01Z","lastTransitionTime":"2025-12-03T14:14:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.226072 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.226118 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.226131 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.226148 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.226161 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:01Z","lastTransitionTime":"2025-12-03T14:14:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.313529 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.313649 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:14:01 crc kubenswrapper[4751]: E1203 14:14:01.313675 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.313723 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:14:01 crc kubenswrapper[4751]: E1203 14:14:01.313850 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.313700 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:14:01 crc kubenswrapper[4751]: E1203 14:14:01.313960 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:14:01 crc kubenswrapper[4751]: E1203 14:14:01.314021 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.327901 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.327938 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.327949 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.327964 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.327976 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:01Z","lastTransitionTime":"2025-12-03T14:14:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.430511 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.430549 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.430560 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.430573 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.430583 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:01Z","lastTransitionTime":"2025-12-03T14:14:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.532976 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.533015 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.533026 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.533042 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.533054 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:01Z","lastTransitionTime":"2025-12-03T14:14:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.635832 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.635876 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.635889 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.635906 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.635916 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:01Z","lastTransitionTime":"2025-12-03T14:14:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.738387 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.738460 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.738478 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.738502 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.738520 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:01Z","lastTransitionTime":"2025-12-03T14:14:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.841057 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.841093 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.841101 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.841116 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.841125 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:01Z","lastTransitionTime":"2025-12-03T14:14:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.943646 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.943718 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.943733 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.943750 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:01 crc kubenswrapper[4751]: I1203 14:14:01.943761 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:01Z","lastTransitionTime":"2025-12-03T14:14:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.046188 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.046252 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.046270 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.046294 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.046312 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:02Z","lastTransitionTime":"2025-12-03T14:14:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.149092 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.149139 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.149149 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.149190 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.149201 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:02Z","lastTransitionTime":"2025-12-03T14:14:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.251857 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.251897 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.251909 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.251925 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.251938 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:02Z","lastTransitionTime":"2025-12-03T14:14:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.314551 4751 scope.go:117] "RemoveContainer" containerID="cb4d0295b670ffaed3b769f0cbee08c5d03f697c93475dafaaa987a54eb3dd74" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.354084 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.354123 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.354133 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.354148 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.354160 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:02Z","lastTransitionTime":"2025-12-03T14:14:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.456815 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.456855 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.456868 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.456887 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.456905 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:02Z","lastTransitionTime":"2025-12-03T14:14:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.559847 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.559890 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.559902 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.559921 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.559933 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:02Z","lastTransitionTime":"2025-12-03T14:14:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.609553 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwchh_a5526cae-f2a4-4094-a08a-fbf69cb11593/ovnkube-controller/1.log" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.612831 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" event={"ID":"a5526cae-f2a4-4094-a08a-fbf69cb11593","Type":"ContainerStarted","Data":"0ea030d283d61459315d72845f04d34c1ad21974731a82902ce65b328c0f7fa8"} Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.612979 4751 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.626558 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b
154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:02Z is 
after 2025-08-24T17:21:41Z" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.638831 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.653707 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.662670 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.662698 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.662707 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.662721 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.662731 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:02Z","lastTransitionTime":"2025-12-03T14:14:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.665143 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.679830 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-03T14:14:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.703290 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8336533a450796683f0bdfccc947c18c05e100d13e7e5a076e4dde52ddbb524a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b693d6eeadc9bf4af3f115fd8646138f7e25675fb84ae92b97a456e3b5a14cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ff7aaf5e6a948fcb50e4c896807b977d2c651eef92011b86b5df32f19fa3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://104d3bfa9a37dce20a737bf9d416395e432196051e96def7ae49d341d2912e3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77fd3bfbc463e6afbd45d112bf46ad8a39f25cd39ebbc0e17dfcd160ca29d7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26dadf0f116a114d6755804ab91736019d0d3e047d14515f513eda2148d706d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea030d283d61459315d72845f04d34c1ad21974731a82902ce65b328c0f7fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb4d0295b670ffaed3b769f0cbee08c5d03f697c93475dafaaa987a54eb3dd74\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:13:47Z\\\",\\\"message\\\":\\\"ft-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1203 14:13:47.315381 6214 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1203 14:13:47.315387 6214 default_network_controller.go:776] Recording success event on pod 
openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1203 14:13:47.315388 6214 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z]\\\\nI1203 14:13:47.315399 6214 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-machine-config-operator/machine-config-d\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:14:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountP
ath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac293ad67560998b43406e7acd93896d81398996e43a2b7e5df6a4d5794eeb29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.719767 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96
c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.735507 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.764138 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.766039 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.766071 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.766080 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.766092 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.766101 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:02Z","lastTransitionTime":"2025-12-03T14:14:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.774960 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.785092 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8f
c26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.798543 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.811277 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.824433 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://243f7a2a090c20b5f43fd09961193c68689cbdf90a21f23683146f39a61a6fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4cfa428c0e9f95347500b09c84e541ed669
24885a56a15ed20612fbf2117bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dp4lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.834916 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zgqdp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fb8744-4cb9-4138-8310-c02f7c6a2941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zgqdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:02 crc 
kubenswrapper[4751]: I1203 14:14:02.846486 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.860541 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0746b9b3267a0ad85440e9e392d48037c8522981bb736acffabca9661e0a28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916dc
5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.868283 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.868322 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.868358 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.868375 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.868386 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:02Z","lastTransitionTime":"2025-12-03T14:14:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.970748 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.970780 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.970790 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.970802 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:02 crc kubenswrapper[4751]: I1203 14:14:02.970810 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:02Z","lastTransitionTime":"2025-12-03T14:14:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.073552 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.073589 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.073597 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.073615 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.073627 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:03Z","lastTransitionTime":"2025-12-03T14:14:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.176609 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.176662 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.176673 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.176690 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.176701 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:03Z","lastTransitionTime":"2025-12-03T14:14:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.197508 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:14:03 crc kubenswrapper[4751]: E1203 14:14:03.197646 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 14:14:35.197620842 +0000 UTC m=+82.185976059 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.197704 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.197734 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:14:03 crc kubenswrapper[4751]: E1203 14:14:03.197790 4751 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 14:14:03 crc kubenswrapper[4751]: E1203 14:14:03.197840 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-03 14:14:35.197830268 +0000 UTC m=+82.186185485 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 14:14:03 crc kubenswrapper[4751]: E1203 14:14:03.197840 4751 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 14:14:03 crc kubenswrapper[4751]: E1203 14:14:03.197937 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 14:14:35.19792642 +0000 UTC m=+82.186281637 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.279489 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.279539 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.279551 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.279567 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.279579 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:03Z","lastTransitionTime":"2025-12-03T14:14:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.298445 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.298505 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:14:03 crc kubenswrapper[4751]: E1203 14:14:03.298664 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 14:14:03 crc kubenswrapper[4751]: E1203 14:14:03.298682 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 14:14:03 crc kubenswrapper[4751]: E1203 14:14:03.298694 4751 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:14:03 crc kubenswrapper[4751]: E1203 14:14:03.298730 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 14:14:35.298718116 +0000 UTC m=+82.287073333 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:14:03 crc kubenswrapper[4751]: E1203 14:14:03.298894 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 14:14:03 crc kubenswrapper[4751]: E1203 14:14:03.298949 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 14:14:03 crc kubenswrapper[4751]: E1203 14:14:03.298983 4751 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:14:03 crc kubenswrapper[4751]: E1203 14:14:03.299074 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 14:14:35.299048855 +0000 UTC m=+82.287404102 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.313121 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.313141 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.313262 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:14:03 crc kubenswrapper[4751]: E1203 14:14:03.313372 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.313405 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:14:03 crc kubenswrapper[4751]: E1203 14:14:03.313509 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:14:03 crc kubenswrapper[4751]: E1203 14:14:03.313578 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:14:03 crc kubenswrapper[4751]: E1203 14:14:03.313686 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.327055 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.339553 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.355729 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96
c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.374637 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.381782 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.381812 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.381822 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 
14:14:03.381836 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.381845 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:03Z","lastTransitionTime":"2025-12-03T14:14:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.396144 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.412116 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.426516 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.447052 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8336533a450796683f0bdfccc947c18c05e100d13e7e5a076e4dde52ddbb524a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b693d6eeadc9bf4af3f115fd8646138f7e25675fb84ae92b97a456e3b5a14cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ff7aaf5e6a948fcb50e4c896807b977d2c651eef92011b86b5df32f19fa3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://104d3bfa9a37dce20a737bf9d416395e432196051e96def7ae49d341d2912e3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77fd3bfbc463e6afbd45d112bf46ad8a39f25cd39ebbc0e17dfcd160ca29d7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26dadf0f116a114d6755804ab91736019d0d3e047d14515f513eda2148d706d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea030d283d61459315d72845f04d34c1ad21974731a82902ce65b328c0f7fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb4d0295b670ffaed3b769f0cbee08c5d03f697c93475dafaaa987a54eb3dd74\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:13:47Z\\\",\\\"message\\\":\\\"ft-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1203 14:13:47.315381 6214 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1203 14:13:47.315387 6214 default_network_controller.go:776] Recording success event on pod 
openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1203 14:13:47.315388 6214 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z]\\\\nI1203 14:13:47.315399 6214 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-machine-config-operator/machine-config-d\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:14:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountP
ath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac293ad67560998b43406e7acd93896d81398996e43a2b7e5df6a4d5794eeb29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.461688 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.481626 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.484447 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.484501 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.484513 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.484532 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.484545 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:03Z","lastTransitionTime":"2025-12-03T14:14:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.493752 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.503389 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8f
c26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.514651 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.525924 4751 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.546984 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0746b9b3267a0ad85440e9e392d48037c8522981bb736acffabca9661e0a28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916dc
5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.559219 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://243f7a2a090c20b5f43fd09961193c68689cbdf90a21f23683146f39a61a6fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4cfa428c0e9f95347500b09c84e541ed66924885a56a15ed20612fbf2117bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dp4lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.572040 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zgqdp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fb8744-4cb9-4138-8310-c02f7c6a2941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zgqdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc 
kubenswrapper[4751]: I1203 14:14:03.587215 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.587295 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.587309 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.587380 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.587399 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:03Z","lastTransitionTime":"2025-12-03T14:14:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.618376 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwchh_a5526cae-f2a4-4094-a08a-fbf69cb11593/ovnkube-controller/2.log" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.619376 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwchh_a5526cae-f2a4-4094-a08a-fbf69cb11593/ovnkube-controller/1.log" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.622109 4751 generic.go:334] "Generic (PLEG): container finished" podID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerID="0ea030d283d61459315d72845f04d34c1ad21974731a82902ce65b328c0f7fa8" exitCode=1 Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.622161 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" event={"ID":"a5526cae-f2a4-4094-a08a-fbf69cb11593","Type":"ContainerDied","Data":"0ea030d283d61459315d72845f04d34c1ad21974731a82902ce65b328c0f7fa8"} Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.622240 4751 scope.go:117] "RemoveContainer" containerID="cb4d0295b670ffaed3b769f0cbee08c5d03f697c93475dafaaa987a54eb3dd74" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.623203 4751 scope.go:117] "RemoveContainer" containerID="0ea030d283d61459315d72845f04d34c1ad21974731a82902ce65b328c0f7fa8" Dec 03 14:14:03 crc kubenswrapper[4751]: E1203 14:14:03.623434 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kwchh_openshift-ovn-kubernetes(a5526cae-f2a4-4094-a08a-fbf69cb11593)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.637452 4751 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.638472 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert
-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift
-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.647108 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.661734 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.673857 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.687549 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.690296 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.690369 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.690390 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.690414 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.690429 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:03Z","lastTransitionTime":"2025-12-03T14:14:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.704389 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z 
is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.725485 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0746b9b3267a0ad85440e9e392d48037c8522981bb736acffabca9661e0a28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.739689 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://243f7a2a090c20b5f43fd09961193c68689cbdf90a21f23683146f39a61a6fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4cfa428c0e9f95347500b09c84e541ed66924885a56a15ed20612fbf2117bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dp4lz\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.751741 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zgqdp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fb8744-4cb9-4138-8310-c02f7c6a2941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zgqdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc 
kubenswrapper[4751]: I1203 14:14:03.763606 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.777266 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.785773 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.790373 4751 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.793116 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.793149 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.793158 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.793173 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.793183 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:03Z","lastTransitionTime":"2025-12-03T14:14:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.809496 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.824885 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.837969 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.848993 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.868251 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8336533a450796683f0bdfccc947c18c05e100d13e7e5a076e4dde52ddbb524a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b693d6eeadc9bf4af3f115fd8646138f7e25675fb84ae92b97a456e3b5a14cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ff7aaf5e6a948fcb50e4c896807b977d2c651eef92011b86b5df32f19fa3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://104d3bfa9a37dce20a737bf9d416395e432196051e96def7ae49d341d2912e3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77fd3bfbc463e6afbd45d112bf46ad8a39f25cd39ebbc0e17dfcd160ca29d7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26dadf0f116a114d6755804ab91736019d0d3e047d14515f513eda2148d706d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea030d283d61459315d72845f04d34c1ad21974731a82902ce65b328c0f7fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb4d0295b670ffaed3b769f0cbee08c5d03f697c93475dafaaa987a54eb3dd74\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:13:47Z\\\",\\\"message\\\":\\\"ft-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1203 14:13:47.315381 6214 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1203 14:13:47.315387 6214 default_network_controller.go:776] Recording success event on pod 
openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1203 14:13:47.315388 6214 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z]\\\\nI1203 14:13:47.315399 6214 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-d\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea030d283d61459315d72845f04d34c1ad21974731a82902ce65b328c0f7fa8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:14:03Z\\\",\\\"message\\\":\\\"od openshift-machine-config-operator/machine-config-daemon-djf67\\\\nI1203 14:14:03.114803 6433 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-djf67\\\\nI1203 14:14:03.114811 6433 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-djf67 in node crc\\\\nI1203 14:14:03.114812 6433 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-t8q27\\\\nI1203 14:14:03.114830 6433 
obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-t8q27\\\\nI1203 14:14:03.114824 6433 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-zgqdp] creating logical port openshift-multus_network-metrics-daemon-zgqdp for pod on switch crc\\\\nF1203 14:14:03.114833 6433 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:14:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\
\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac293ad67560998b43406e7acd93896d81398996e43a2b7e5df6a4d5794eeb29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.881960 4751 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec
7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"s
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.894161 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://243f7a2a090c20b5f43fd09961193c68689cbdf90a21f23683146f39a61a6fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4cfa428c0e9f95347500b09c84e541ed669
24885a56a15ed20612fbf2117bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dp4lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.895852 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.895880 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.895888 4751 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.895901 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.895910 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:03Z","lastTransitionTime":"2025-12-03T14:14:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.906820 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zgqdp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fb8744-4cb9-4138-8310-c02f7c6a2941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zgqdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 
03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.918870 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.935068 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0746b9b3267a0ad85440e9e392d48037c8522981bb736acffabca9661e0a28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916dc
5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.948910 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.959915 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.971725 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.982263 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.993294 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2025-12-03T14:14:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.998524 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.998547 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.998555 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.998572 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:03 crc kubenswrapper[4751]: I1203 14:14:03.998582 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:03Z","lastTransitionTime":"2025-12-03T14:14:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.011414 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8336533a450796683f0bdfccc947c18c05e100d13e7e5a076e4dde52ddbb524a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b693d6eeadc9bf4af3f115fd8646138f7e25675fb84ae92b97a456e3b5a14cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ff7aaf5e6a948fcb50e4c896807b977d2c651eef92011b86b5df32f19fa3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://104d3bfa9a37dce20a737bf9d416395e432196051e96def7ae49d341d2912e3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77fd3bfbc463e6afbd45d112bf46ad8a39f25cd39ebbc0e17dfcd160ca29d7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26dadf0f116a114d6755804ab91736019d0d3e047d14515f513eda2148d706d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea030d283d61459315d72845f04d34c1ad21974731a82902ce65b328c0f7fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb4d0295b670ffaed3b769f0cbee08c5d03f697c93475dafaaa987a54eb3dd74\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:13:47Z\\\",\\\"message\\\":\\\"ft-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1203 14:13:47.315381 6214 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1203 14:13:47.315387 6214 default_network_controller.go:776] Recording success event on pod 
openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1203 14:13:47.315388 6214 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:13:47Z is after 2025-08-24T17:21:41Z]\\\\nI1203 14:13:47.315399 6214 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-d\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea030d283d61459315d72845f04d34c1ad21974731a82902ce65b328c0f7fa8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:14:03Z\\\",\\\"message\\\":\\\"od openshift-machine-config-operator/machine-config-daemon-djf67\\\\nI1203 14:14:03.114803 6433 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-djf67\\\\nI1203 14:14:03.114811 6433 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-djf67 in node crc\\\\nI1203 14:14:03.114812 6433 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-t8q27\\\\nI1203 14:14:03.114830 6433 
obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-t8q27\\\\nI1203 14:14:03.114824 6433 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-zgqdp] creating logical port openshift-multus_network-metrics-daemon-zgqdp for pod on switch crc\\\\nF1203 14:14:03.114833 6433 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:14:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\
\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac293ad67560998b43406e7acd93896d81398996e43a2b7e5df6a4d5794eeb29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.024601 4751 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec
7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"s
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.035144 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.053089 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.062511 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.070415 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.081777 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.094445 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea448b5e-d751-42cd-ba62-94bfe104c8dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86c32c58671beb01e05f0fc2e0bb0dd3852c731d547e7e00f68d996e4b1c82b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f517a25f354bfdf8d4ca6a11b1fc689e25630c8f82d80f869ee30be87335091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c995df1ca20af2a26b10cc901a61d16a1530c1fd574fb463cc26eb8907ce0d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd289137933c564b4650302ce3a8be46d6d08aa5fedbd5c4d6b54e6b2cef712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd289137933c564b4650302ce3a8be46d6d08aa5fedbd5c4d6b54e6b2cef712\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.100832 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.100892 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.100910 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 
14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.100935 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.100951 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:04Z","lastTransitionTime":"2025-12-03T14:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.109155 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85
aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.203452 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:04 crc 
kubenswrapper[4751]: I1203 14:14:04.203523 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.203534 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.203550 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.203660 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:04Z","lastTransitionTime":"2025-12-03T14:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.305838 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.305870 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.305880 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.305894 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.305904 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:04Z","lastTransitionTime":"2025-12-03T14:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.407986 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.408017 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.408025 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.408038 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.408047 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:04Z","lastTransitionTime":"2025-12-03T14:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.509987 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.510027 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.510038 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.510053 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.510063 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:04Z","lastTransitionTime":"2025-12-03T14:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.612231 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.612672 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.612686 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.612703 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.612740 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:04Z","lastTransitionTime":"2025-12-03T14:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.626233 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwchh_a5526cae-f2a4-4094-a08a-fbf69cb11593/ovnkube-controller/2.log" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.629265 4751 scope.go:117] "RemoveContainer" containerID="0ea030d283d61459315d72845f04d34c1ad21974731a82902ce65b328c0f7fa8" Dec 03 14:14:04 crc kubenswrapper[4751]: E1203 14:14:04.629431 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kwchh_openshift-ovn-kubernetes(a5526cae-f2a4-4094-a08a-fbf69cb11593)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.646116 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.658636 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.671250 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.682655 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.693307 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2025-12-03T14:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.709960 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8336533a450796683f0bdfccc947c18c05e100d13e7e5a076e4dde52ddbb524a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b693d6eeadc9bf4af3f115fd8646138f7e25675fb84ae92b97a456e3b5a14cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ff7aaf5e6a948fcb50e4c896807b977d2c651eef92011b86b5df32f19fa3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://104d3bfa9a37dce20a737bf9d416395e432196051e96def7ae49d341d2912e3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77fd3bfbc463e6afbd45d112bf46ad8a39f25cd39ebbc0e17dfcd160ca29d7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26dadf0f116a114d6755804ab91736019d0d3e047d14515f513eda2148d706d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea030d283d61459315d72845f04d34c1ad21974731a82902ce65b328c0f7fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea030d283d61459315d72845f04d34c1ad21974731a82902ce65b328c0f7fa8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:14:03Z\\\",\\\"message\\\":\\\"od openshift-machine-config-operator/machine-config-daemon-djf67\\\\nI1203 14:14:03.114803 6433 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-djf67\\\\nI1203 14:14:03.114811 6433 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-djf67 in node 
crc\\\\nI1203 14:14:03.114812 6433 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-t8q27\\\\nI1203 14:14:03.114830 6433 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-t8q27\\\\nI1203 14:14:03.114824 6433 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-zgqdp] creating logical port openshift-multus_network-metrics-daemon-zgqdp for pod on switch crc\\\\nF1203 14:14:03.114833 6433 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:14:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kwchh_openshift-ovn-kubernetes(a5526cae-f2a4-4094-a08a-fbf69cb11593)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac293ad67560998b43406e7acd93896d81398996e43a2b7e5df6a4d5794eeb29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd
6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.714387 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.714424 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.714434 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.714448 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.714459 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:04Z","lastTransitionTime":"2025-12-03T14:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.720931 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b
b81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.731838 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.743642 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.743689 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.743701 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.743719 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.743730 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:04Z","lastTransitionTime":"2025-12-03T14:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.748048 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:04 crc kubenswrapper[4751]: E1203 14:14:04.755589 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.757165 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.759394 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.759426 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.759437 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.759452 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.759463 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:04Z","lastTransitionTime":"2025-12-03T14:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.766467 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:04 crc kubenswrapper[4751]: E1203 14:14:04.772179 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.776053 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.776093 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.776102 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.776117 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.776127 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:04Z","lastTransitionTime":"2025-12-03T14:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.782255 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:04Z 
is after 2025-08-24T17:21:41Z" Dec 03 14:14:04 crc kubenswrapper[4751]: E1203 14:14:04.788647 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.792601 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.792653 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.792664 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.792713 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.792725 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:04Z","lastTransitionTime":"2025-12-03T14:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.795832 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea448b5e-d751-42cd-ba62-94bfe104c8dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86c32c58671beb01e05f0fc2e0bb0dd3852c731d547e7e00f68d996e4b1c82b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f517a25f354bfdf8d4ca6a11b1fc6
89e25630c8f82d80f869ee30be87335091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c995df1ca20af2a26b10cc901a61d16a1530c1fd574fb463cc26eb8907ce0d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd289137933c564b4650302ce3a8be46d6d08aa5fedbd5c4d6b54e6b2cef712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd289137933c564b4650302ce3a8be46d6d08aa5fedbd5c4d6b54e6b2cef712\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:04 crc kubenswrapper[4751]: E1203 14:14:04.803515 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.807313 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.807376 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.807386 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.807403 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.807414 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:04Z","lastTransitionTime":"2025-12-03T14:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.809300 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68bddf32d41
7f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.823128 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://243f7a2a090c20b5f43fd09961193c68689cbdf90a21f23683146f39a61a6fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4cfa428c0e9f95347500b09c84e541ed669
24885a56a15ed20612fbf2117bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dp4lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:04 crc kubenswrapper[4751]: E1203 14:14:04.824481 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:04 crc kubenswrapper[4751]: E1203 14:14:04.824620 4751 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.826603 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.826641 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.826651 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.826670 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.826686 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:04Z","lastTransitionTime":"2025-12-03T14:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.833475 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zgqdp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fb8744-4cb9-4138-8310-c02f7c6a2941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zgqdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:04 crc 
kubenswrapper[4751]: I1203 14:14:04.846644 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.860947 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0746b9b3267a0ad85440e9e392d48037c8522981bb736acffabca9661e0a28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916dc
5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.929468 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.929529 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.929547 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.929571 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:04 crc kubenswrapper[4751]: I1203 14:14:04.929587 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:04Z","lastTransitionTime":"2025-12-03T14:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.033149 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.033194 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.033205 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.033221 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.033231 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:05Z","lastTransitionTime":"2025-12-03T14:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.135472 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.135520 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.135540 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.135562 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.135578 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:05Z","lastTransitionTime":"2025-12-03T14:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.237793 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.237828 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.237838 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.237851 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.237861 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:05Z","lastTransitionTime":"2025-12-03T14:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.313922 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.313957 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.313995 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.313913 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:14:05 crc kubenswrapper[4751]: E1203 14:14:05.314068 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:14:05 crc kubenswrapper[4751]: E1203 14:14:05.314136 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:14:05 crc kubenswrapper[4751]: E1203 14:14:05.314281 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:14:05 crc kubenswrapper[4751]: E1203 14:14:05.314391 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.341360 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.341406 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.341419 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.341435 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.341447 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:05Z","lastTransitionTime":"2025-12-03T14:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.444174 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.444253 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.444267 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.444282 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.444295 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:05Z","lastTransitionTime":"2025-12-03T14:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.547063 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.547108 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.547119 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.547134 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.547145 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:05Z","lastTransitionTime":"2025-12-03T14:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.649403 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.649453 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.649466 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.649483 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.649496 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:05Z","lastTransitionTime":"2025-12-03T14:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.751928 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.751972 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.751980 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.751995 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.752005 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:05Z","lastTransitionTime":"2025-12-03T14:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.855187 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.855258 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.855283 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.855313 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.855362 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:05Z","lastTransitionTime":"2025-12-03T14:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.922246 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45fb8744-4cb9-4138-8310-c02f7c6a2941-metrics-certs\") pod \"network-metrics-daemon-zgqdp\" (UID: \"45fb8744-4cb9-4138-8310-c02f7c6a2941\") " pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:14:05 crc kubenswrapper[4751]: E1203 14:14:05.922645 4751 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 14:14:05 crc kubenswrapper[4751]: E1203 14:14:05.922812 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45fb8744-4cb9-4138-8310-c02f7c6a2941-metrics-certs podName:45fb8744-4cb9-4138-8310-c02f7c6a2941 nodeName:}" failed. No retries permitted until 2025-12-03 14:14:21.922772519 +0000 UTC m=+68.911127776 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/45fb8744-4cb9-4138-8310-c02f7c6a2941-metrics-certs") pod "network-metrics-daemon-zgqdp" (UID: "45fb8744-4cb9-4138-8310-c02f7c6a2941") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.958750 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.958825 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.958846 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.958874 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:05 crc kubenswrapper[4751]: I1203 14:14:05.958892 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:05Z","lastTransitionTime":"2025-12-03T14:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.061412 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.061467 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.061482 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.061502 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.061518 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:06Z","lastTransitionTime":"2025-12-03T14:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.164024 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.164070 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.164079 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.164093 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.164105 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:06Z","lastTransitionTime":"2025-12-03T14:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.267828 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.267904 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.267929 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.267984 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.268012 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:06Z","lastTransitionTime":"2025-12-03T14:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.371224 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.371297 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.371307 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.371355 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.371370 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:06Z","lastTransitionTime":"2025-12-03T14:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.486617 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.486646 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.486654 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.486667 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.486675 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:06Z","lastTransitionTime":"2025-12-03T14:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.588802 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.588864 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.588881 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.588904 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.588922 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:06Z","lastTransitionTime":"2025-12-03T14:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.690713 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.690737 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.690745 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.690758 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.690766 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:06Z","lastTransitionTime":"2025-12-03T14:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.792840 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.793439 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.793547 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.793634 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.793710 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:06Z","lastTransitionTime":"2025-12-03T14:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.897489 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.897558 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.897570 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.897595 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:06 crc kubenswrapper[4751]: I1203 14:14:06.897612 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:06Z","lastTransitionTime":"2025-12-03T14:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.000061 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.000113 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.000128 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.000146 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.000161 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:07Z","lastTransitionTime":"2025-12-03T14:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.102784 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.102846 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.102866 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.102889 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.102908 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:07Z","lastTransitionTime":"2025-12-03T14:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.205203 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.205242 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.205253 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.205268 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.205280 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:07Z","lastTransitionTime":"2025-12-03T14:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.308239 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.308280 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.308291 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.308316 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.308371 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:07Z","lastTransitionTime":"2025-12-03T14:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.313930 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.313978 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.313944 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.313930 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:14:07 crc kubenswrapper[4751]: E1203 14:14:07.314099 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:14:07 crc kubenswrapper[4751]: E1203 14:14:07.314199 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:14:07 crc kubenswrapper[4751]: E1203 14:14:07.314314 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:14:07 crc kubenswrapper[4751]: E1203 14:14:07.314426 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.411483 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.411554 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.411573 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.411601 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.411621 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:07Z","lastTransitionTime":"2025-12-03T14:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.514037 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.514077 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.514088 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.514103 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.514114 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:07Z","lastTransitionTime":"2025-12-03T14:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.617997 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.618160 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.618194 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.618227 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.618254 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:07Z","lastTransitionTime":"2025-12-03T14:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.720843 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.720929 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.720944 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.720963 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.720976 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:07Z","lastTransitionTime":"2025-12-03T14:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.826371 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.826456 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.826474 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.826527 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.826544 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:07Z","lastTransitionTime":"2025-12-03T14:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.929732 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.930001 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.930086 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.930205 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:07 crc kubenswrapper[4751]: I1203 14:14:07.930298 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:07Z","lastTransitionTime":"2025-12-03T14:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.034256 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.034314 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.034361 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.034392 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.034411 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:08Z","lastTransitionTime":"2025-12-03T14:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.138825 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.138879 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.138895 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.138921 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.138937 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:08Z","lastTransitionTime":"2025-12-03T14:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.242506 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.242559 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.242570 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.242586 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.242599 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:08Z","lastTransitionTime":"2025-12-03T14:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.346446 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.346509 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.346531 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.346558 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.346578 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:08Z","lastTransitionTime":"2025-12-03T14:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.449971 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.450055 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.450070 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.450092 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.450106 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:08Z","lastTransitionTime":"2025-12-03T14:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.554214 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.554269 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.554288 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.554313 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.554346 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:08Z","lastTransitionTime":"2025-12-03T14:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.657296 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.657375 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.657389 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.657408 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.657421 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:08Z","lastTransitionTime":"2025-12-03T14:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.759810 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.759881 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.759902 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.759940 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.759959 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:08Z","lastTransitionTime":"2025-12-03T14:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.862638 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.862706 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.862725 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.862751 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.862769 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:08Z","lastTransitionTime":"2025-12-03T14:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.965747 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.965783 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.965793 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.965806 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:08 crc kubenswrapper[4751]: I1203 14:14:08.965815 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:08Z","lastTransitionTime":"2025-12-03T14:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.068233 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.068285 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.068295 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.068307 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.068342 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:09Z","lastTransitionTime":"2025-12-03T14:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.170611 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.170647 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.170655 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.170668 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.170676 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:09Z","lastTransitionTime":"2025-12-03T14:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.273207 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.273312 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.273365 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.273407 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.273428 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:09Z","lastTransitionTime":"2025-12-03T14:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.313075 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.313075 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.313120 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:14:09 crc kubenswrapper[4751]: E1203 14:14:09.313450 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.313140 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:14:09 crc kubenswrapper[4751]: E1203 14:14:09.313686 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:14:09 crc kubenswrapper[4751]: E1203 14:14:09.313567 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:14:09 crc kubenswrapper[4751]: E1203 14:14:09.314017 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.376445 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.376483 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.376492 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.376507 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.376517 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:09Z","lastTransitionTime":"2025-12-03T14:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.479560 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.479637 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.479655 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.479683 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.479703 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:09Z","lastTransitionTime":"2025-12-03T14:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.583043 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.583134 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.583160 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.583191 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.583215 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:09Z","lastTransitionTime":"2025-12-03T14:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.685518 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.685583 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.685593 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.685608 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.685622 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:09Z","lastTransitionTime":"2025-12-03T14:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.787833 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.787895 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.787904 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.787926 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.787938 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:09Z","lastTransitionTime":"2025-12-03T14:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.890661 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.890737 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.890746 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.890760 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.890769 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:09Z","lastTransitionTime":"2025-12-03T14:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.993097 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.993151 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.993163 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.993178 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:09 crc kubenswrapper[4751]: I1203 14:14:09.993186 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:09Z","lastTransitionTime":"2025-12-03T14:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.096713 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.096774 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.096789 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.096811 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.096826 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:10Z","lastTransitionTime":"2025-12-03T14:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.200534 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.200598 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.200615 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.200639 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.200660 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:10Z","lastTransitionTime":"2025-12-03T14:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.304229 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.304358 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.304390 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.304425 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.304453 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:10Z","lastTransitionTime":"2025-12-03T14:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.408367 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.408436 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.408456 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.408481 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.408502 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:10Z","lastTransitionTime":"2025-12-03T14:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.510952 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.511028 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.511046 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.511073 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.511090 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:10Z","lastTransitionTime":"2025-12-03T14:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.614123 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.614237 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.614259 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.614286 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.614304 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:10Z","lastTransitionTime":"2025-12-03T14:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.716861 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.716913 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.716930 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.716951 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.716967 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:10Z","lastTransitionTime":"2025-12-03T14:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.820182 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.820257 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.820275 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.820300 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.820320 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:10Z","lastTransitionTime":"2025-12-03T14:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.923611 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.923672 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.923689 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.923715 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:10 crc kubenswrapper[4751]: I1203 14:14:10.923737 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:10Z","lastTransitionTime":"2025-12-03T14:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.026802 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.026852 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.026863 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.026880 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.026899 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:11Z","lastTransitionTime":"2025-12-03T14:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.130281 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.130365 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.130384 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.130407 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.130423 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:11Z","lastTransitionTime":"2025-12-03T14:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.233485 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.233535 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.233546 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.233563 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.233574 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:11Z","lastTransitionTime":"2025-12-03T14:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.313662 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.313702 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.313765 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:14:11 crc kubenswrapper[4751]: E1203 14:14:11.313805 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.313860 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:14:11 crc kubenswrapper[4751]: E1203 14:14:11.313897 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:14:11 crc kubenswrapper[4751]: E1203 14:14:11.313970 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:14:11 crc kubenswrapper[4751]: E1203 14:14:11.314111 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.335425 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.335463 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.335474 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.335487 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.335496 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:11Z","lastTransitionTime":"2025-12-03T14:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.437790 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.437885 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.437898 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.437916 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.437927 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:11Z","lastTransitionTime":"2025-12-03T14:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.540608 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.540673 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.540691 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.540722 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.540747 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:11Z","lastTransitionTime":"2025-12-03T14:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.643519 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.643563 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.643577 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.643595 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.643606 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:11Z","lastTransitionTime":"2025-12-03T14:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.746279 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.746317 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.746350 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.746367 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.746377 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:11Z","lastTransitionTime":"2025-12-03T14:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.849408 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.849456 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.849471 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.849492 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.849506 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:11Z","lastTransitionTime":"2025-12-03T14:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.952038 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.952107 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.952126 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.952154 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:11 crc kubenswrapper[4751]: I1203 14:14:11.952187 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:11Z","lastTransitionTime":"2025-12-03T14:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.055027 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.055070 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.055078 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.055095 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.055105 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:12Z","lastTransitionTime":"2025-12-03T14:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.157612 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.157669 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.157686 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.157707 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.157722 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:12Z","lastTransitionTime":"2025-12-03T14:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.260168 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.260210 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.260220 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.260238 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.260250 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:12Z","lastTransitionTime":"2025-12-03T14:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.362110 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.362143 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.362153 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.362167 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.362178 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:12Z","lastTransitionTime":"2025-12-03T14:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.464979 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.465012 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.465021 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.465034 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.465044 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:12Z","lastTransitionTime":"2025-12-03T14:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.567689 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.567733 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.567745 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.567761 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.567773 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:12Z","lastTransitionTime":"2025-12-03T14:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.669590 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.669651 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.669668 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.669696 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.669713 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:12Z","lastTransitionTime":"2025-12-03T14:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.771836 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.772289 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.772494 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.772659 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.772799 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:12Z","lastTransitionTime":"2025-12-03T14:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.875584 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.875644 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.875659 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.875684 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.875707 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:12Z","lastTransitionTime":"2025-12-03T14:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.978540 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.978585 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.978601 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.978623 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:12 crc kubenswrapper[4751]: I1203 14:14:12.978641 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:12Z","lastTransitionTime":"2025-12-03T14:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.080752 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.080794 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.080806 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.080821 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.080831 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:13Z","lastTransitionTime":"2025-12-03T14:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.183595 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.183969 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.184157 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.184426 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.184657 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:13Z","lastTransitionTime":"2025-12-03T14:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.286209 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.286238 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.286246 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.286257 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.286266 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:13Z","lastTransitionTime":"2025-12-03T14:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.313341 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.313382 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.313438 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:14:13 crc kubenswrapper[4751]: E1203 14:14:13.313642 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.313698 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:14:13 crc kubenswrapper[4751]: E1203 14:14:13.313883 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:14:13 crc kubenswrapper[4751]: E1203 14:14:13.313916 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:14:13 crc kubenswrapper[4751]: E1203 14:14:13.313985 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.327071 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not 
be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:13Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.343952 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0746b9b3267a0ad85440e9e392d48037c8522981bb736acffabca9661e0a28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916dc
5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:13Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.358864 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://243f7a2a090c20b5f43fd09961193c68689cbdf90a21f23683146f39a61a6fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4cfa428c0e9f95347500b09c84e541ed66924885a56a15ed20612fbf2117bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dp4lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-03T14:14:13Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.369954 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zgqdp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fb8744-4cb9-4138-8310-c02f7c6a2941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zgqdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:13Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:13 crc 
kubenswrapper[4751]: I1203 14:14:13.383016 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:13Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.390092 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.390155 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.390179 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.390210 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.390278 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:13Z","lastTransitionTime":"2025-12-03T14:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.398847 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:13Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.417865 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96
c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:13Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.428808 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:13Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.445552 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:13Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.460644 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:13Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.477869 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2025-12-03T14:14:13Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.493966 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.494018 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.494036 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.494060 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.494080 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:13Z","lastTransitionTime":"2025-12-03T14:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.503403 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8336533a450796683f0bdfccc947c18c05e100d13e7e5a076e4dde52ddbb524a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b693d6eeadc9bf4af3f115fd8646138f7e25675fb84ae92b97a456e3b5a14cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ff7aaf5e6a948fcb50e4c896807b977d2c651eef92011b86b5df32f19fa3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://104d3bfa9a37dce20a737bf9d416395e432196051e96def7ae49d341d2912e3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77fd3bfbc463e6afbd45d112bf46ad8a39f25cd39ebbc0e17dfcd160ca29d7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26dadf0f116a114d6755804ab91736019d0d3e047d14515f513eda2148d706d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea030d283d61459315d72845f04d34c1ad21974731a82902ce65b328c0f7fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea030d283d61459315d72845f04d34c1ad21974731a82902ce65b328c0f7fa8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:14:03Z\\\",\\\"message\\\":\\\"od openshift-machine-config-operator/machine-config-daemon-djf67\\\\nI1203 14:14:03.114803 6433 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-djf67\\\\nI1203 14:14:03.114811 6433 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-djf67 in node 
crc\\\\nI1203 14:14:03.114812 6433 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-t8q27\\\\nI1203 14:14:03.114830 6433 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-t8q27\\\\nI1203 14:14:03.114824 6433 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-zgqdp] creating logical port openshift-multus_network-metrics-daemon-zgqdp for pod on switch crc\\\\nF1203 14:14:03.114833 6433 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:14:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kwchh_openshift-ovn-kubernetes(a5526cae-f2a4-4094-a08a-fbf69cb11593)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac293ad67560998b43406e7acd93896d81398996e43a2b7e5df6a4d5794eeb29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd
6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:13Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.515871 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea448b5e-d751-42cd-ba62-94bfe104c8dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86c32c58671beb01e05f0fc2e0bb0dd3852c731d547e7e00f68d996e4b1c82b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f517a25f354bfdf8d4ca6a11b1fc689e25630c8f82d80f869ee30be87335091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c995df1ca20af2a26b10cc901a61d16a1530c1fd574fb463cc26eb8907ce0d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd289137933c564b4650302ce3a8be46d6d08aa5fedbd5c4d6b54e6b2cef712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ddd289137933c564b4650302ce3a8be46d6d08aa5fedbd5c4d6b54e6b2cef712\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:13Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.533832 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:13Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.557217 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:13Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.575184 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:13Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.588524 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:13Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.596280 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.596432 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.596530 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.596618 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.596705 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:13Z","lastTransitionTime":"2025-12-03T14:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.612295 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:13Z 
is after 2025-08-24T17:21:41Z" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.700036 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.700070 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.700098 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.700111 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.700122 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:13Z","lastTransitionTime":"2025-12-03T14:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.802710 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.802791 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.802818 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.802847 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.802868 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:13Z","lastTransitionTime":"2025-12-03T14:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.905405 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.905467 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.905490 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.905521 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:13 crc kubenswrapper[4751]: I1203 14:14:13.905543 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:13Z","lastTransitionTime":"2025-12-03T14:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.008343 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.008379 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.008388 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.008400 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.008410 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:14Z","lastTransitionTime":"2025-12-03T14:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.111109 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.111375 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.111408 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.111437 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.111454 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:14Z","lastTransitionTime":"2025-12-03T14:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.214797 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.214869 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.214893 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.214922 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.214944 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:14Z","lastTransitionTime":"2025-12-03T14:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.317532 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.317583 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.317600 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.317622 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.317634 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:14Z","lastTransitionTime":"2025-12-03T14:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.419944 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.419981 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.419990 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.420009 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.420017 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:14Z","lastTransitionTime":"2025-12-03T14:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.522282 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.522314 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.522333 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.522346 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.522355 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:14Z","lastTransitionTime":"2025-12-03T14:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.624703 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.624760 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.624775 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.624795 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.624811 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:14Z","lastTransitionTime":"2025-12-03T14:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.727144 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.727172 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.727180 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.727194 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.727206 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:14Z","lastTransitionTime":"2025-12-03T14:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.829577 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.829884 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.829896 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.829910 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.829920 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:14Z","lastTransitionTime":"2025-12-03T14:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.932582 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.932621 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.932632 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.932650 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:14 crc kubenswrapper[4751]: I1203 14:14:14.932661 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:14Z","lastTransitionTime":"2025-12-03T14:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.037044 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.037089 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.037101 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.037117 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.037132 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:15Z","lastTransitionTime":"2025-12-03T14:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.140053 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.140097 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.140109 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.140125 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.140136 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:15Z","lastTransitionTime":"2025-12-03T14:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.141046 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.141103 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.141114 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.141132 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.141143 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:15Z","lastTransitionTime":"2025-12-03T14:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:15 crc kubenswrapper[4751]: E1203 14:14:15.155238 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:15Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.159377 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.159410 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.159420 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.159434 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.159443 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:15Z","lastTransitionTime":"2025-12-03T14:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:15 crc kubenswrapper[4751]: E1203 14:14:15.175595 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:15Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.179744 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.179798 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.179807 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.179819 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.179829 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:15Z","lastTransitionTime":"2025-12-03T14:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:15 crc kubenswrapper[4751]: E1203 14:14:15.195512 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:15Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.200633 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.200672 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.200688 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.200710 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.200727 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:15Z","lastTransitionTime":"2025-12-03T14:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:15 crc kubenswrapper[4751]: E1203 14:14:15.213391 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:15Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.217591 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.217642 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.217652 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.217665 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.217675 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:15Z","lastTransitionTime":"2025-12-03T14:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:15 crc kubenswrapper[4751]: E1203 14:14:15.230404 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:15Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:15 crc kubenswrapper[4751]: E1203 14:14:15.230522 4751 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.242652 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.242704 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.242724 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.242750 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.242773 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:15Z","lastTransitionTime":"2025-12-03T14:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.313968 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:14:15 crc kubenswrapper[4751]: E1203 14:14:15.314073 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.314216 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:14:15 crc kubenswrapper[4751]: E1203 14:14:15.314257 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.314374 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:14:15 crc kubenswrapper[4751]: E1203 14:14:15.314416 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.314986 4751 scope.go:117] "RemoveContainer" containerID="0ea030d283d61459315d72845f04d34c1ad21974731a82902ce65b328c0f7fa8" Dec 03 14:14:15 crc kubenswrapper[4751]: E1203 14:14:15.315172 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kwchh_openshift-ovn-kubernetes(a5526cae-f2a4-4094-a08a-fbf69cb11593)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.315279 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:14:15 crc kubenswrapper[4751]: E1203 14:14:15.315346 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.345487 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.345530 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.345539 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.345557 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.345567 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:15Z","lastTransitionTime":"2025-12-03T14:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.449011 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.449064 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.449082 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.449108 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.449128 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:15Z","lastTransitionTime":"2025-12-03T14:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.551981 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.552029 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.552042 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.552062 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.552076 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:15Z","lastTransitionTime":"2025-12-03T14:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.655991 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.656033 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.656043 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.656063 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.656077 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:15Z","lastTransitionTime":"2025-12-03T14:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.757790 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.757819 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.757827 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.757839 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.757847 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:15Z","lastTransitionTime":"2025-12-03T14:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.860397 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.860436 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.860446 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.860462 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.860474 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:15Z","lastTransitionTime":"2025-12-03T14:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.962646 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.962690 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.962703 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.962739 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:15 crc kubenswrapper[4751]: I1203 14:14:15.962751 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:15Z","lastTransitionTime":"2025-12-03T14:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.065715 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.066206 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.066609 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.066799 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.066967 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:16Z","lastTransitionTime":"2025-12-03T14:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.170004 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.170419 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.170639 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.170774 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.170891 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:16Z","lastTransitionTime":"2025-12-03T14:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.273504 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.273888 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.273968 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.274033 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.274096 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:16Z","lastTransitionTime":"2025-12-03T14:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.377398 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.377484 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.377528 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.377559 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.377582 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:16Z","lastTransitionTime":"2025-12-03T14:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.480641 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.480692 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.480704 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.480722 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.480734 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:16Z","lastTransitionTime":"2025-12-03T14:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.583221 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.583259 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.583270 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.583287 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.583300 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:16Z","lastTransitionTime":"2025-12-03T14:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.685403 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.685440 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.685449 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.685463 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.685472 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:16Z","lastTransitionTime":"2025-12-03T14:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.788522 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.788578 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.788595 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.788623 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.788644 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:16Z","lastTransitionTime":"2025-12-03T14:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.890980 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.891032 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.891046 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.891070 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.891088 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:16Z","lastTransitionTime":"2025-12-03T14:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.993781 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.993837 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.993850 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.993869 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:16 crc kubenswrapper[4751]: I1203 14:14:16.993879 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:16Z","lastTransitionTime":"2025-12-03T14:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.096089 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.096155 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.096173 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.096202 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.096224 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:17Z","lastTransitionTime":"2025-12-03T14:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.199260 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.199409 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.199437 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.199461 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.199476 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:17Z","lastTransitionTime":"2025-12-03T14:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.302593 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.302888 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.302963 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.303032 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.303086 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:17Z","lastTransitionTime":"2025-12-03T14:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.313955 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.313999 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.314033 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:14:17 crc kubenswrapper[4751]: E1203 14:14:17.314090 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:14:17 crc kubenswrapper[4751]: E1203 14:14:17.314193 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.314201 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:14:17 crc kubenswrapper[4751]: E1203 14:14:17.314272 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:14:17 crc kubenswrapper[4751]: E1203 14:14:17.314449 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.406647 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.406735 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.406760 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.406794 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.406818 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:17Z","lastTransitionTime":"2025-12-03T14:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.509218 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.509276 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.509299 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.509376 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.509404 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:17Z","lastTransitionTime":"2025-12-03T14:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.612724 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.612774 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.612785 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.612801 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.612811 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:17Z","lastTransitionTime":"2025-12-03T14:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.714294 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.714352 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.714361 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.714374 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.714384 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:17Z","lastTransitionTime":"2025-12-03T14:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.834356 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.834419 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.834432 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.834454 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.834470 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:17Z","lastTransitionTime":"2025-12-03T14:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.937590 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.937635 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.937647 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.937662 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:17 crc kubenswrapper[4751]: I1203 14:14:17.937673 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:17Z","lastTransitionTime":"2025-12-03T14:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.040977 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.041031 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.041041 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.041062 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.041074 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:18Z","lastTransitionTime":"2025-12-03T14:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.145039 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.145469 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.145604 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.145697 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.145795 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:18Z","lastTransitionTime":"2025-12-03T14:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.248830 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.248884 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.248897 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.248916 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.248927 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:18Z","lastTransitionTime":"2025-12-03T14:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.351130 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.351199 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.351221 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.351250 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.351270 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:18Z","lastTransitionTime":"2025-12-03T14:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.454137 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.454208 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.454226 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.454254 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.454272 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:18Z","lastTransitionTime":"2025-12-03T14:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.558183 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.558271 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.558293 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.558322 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.558437 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:18Z","lastTransitionTime":"2025-12-03T14:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.661467 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.661503 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.661513 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.661527 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.661537 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:18Z","lastTransitionTime":"2025-12-03T14:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.763688 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.763747 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.763795 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.763816 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.763831 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:18Z","lastTransitionTime":"2025-12-03T14:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.865371 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.865419 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.865428 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.865440 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.865449 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:18Z","lastTransitionTime":"2025-12-03T14:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.968578 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.968630 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.968643 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.968661 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:18 crc kubenswrapper[4751]: I1203 14:14:18.968672 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:18Z","lastTransitionTime":"2025-12-03T14:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.071669 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.071744 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.071809 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.071845 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.071869 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:19Z","lastTransitionTime":"2025-12-03T14:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.173875 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.173904 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.173912 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.173925 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.173934 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:19Z","lastTransitionTime":"2025-12-03T14:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.276317 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.276366 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.276385 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.276406 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.276417 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:19Z","lastTransitionTime":"2025-12-03T14:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.313273 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.313370 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.313415 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:14:19 crc kubenswrapper[4751]: E1203 14:14:19.313413 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.313285 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:14:19 crc kubenswrapper[4751]: E1203 14:14:19.313496 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:14:19 crc kubenswrapper[4751]: E1203 14:14:19.313584 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:14:19 crc kubenswrapper[4751]: E1203 14:14:19.313649 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.378483 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.378518 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.378529 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.378544 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.378556 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:19Z","lastTransitionTime":"2025-12-03T14:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.480840 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.480878 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.480892 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.480907 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.480920 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:19Z","lastTransitionTime":"2025-12-03T14:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.583032 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.583067 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.583075 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.583088 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.583098 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:19Z","lastTransitionTime":"2025-12-03T14:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.685339 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.685381 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.685392 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.685406 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.685415 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:19Z","lastTransitionTime":"2025-12-03T14:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.788283 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.788346 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.788359 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.788378 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.788390 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:19Z","lastTransitionTime":"2025-12-03T14:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.890259 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.890298 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.890309 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.890345 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.890359 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:19Z","lastTransitionTime":"2025-12-03T14:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.992866 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.992911 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.992923 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.992941 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:19 crc kubenswrapper[4751]: I1203 14:14:19.992953 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:19Z","lastTransitionTime":"2025-12-03T14:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.096356 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.096443 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.096453 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.096469 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.096481 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:20Z","lastTransitionTime":"2025-12-03T14:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.198958 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.199021 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.199035 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.199057 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.199073 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:20Z","lastTransitionTime":"2025-12-03T14:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.302598 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.302642 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.302661 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.302678 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.302688 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:20Z","lastTransitionTime":"2025-12-03T14:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.406966 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.407005 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.407015 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.407030 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.407040 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:20Z","lastTransitionTime":"2025-12-03T14:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.509645 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.509674 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.509683 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.509696 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.509706 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:20Z","lastTransitionTime":"2025-12-03T14:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.612358 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.612385 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.612394 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.612407 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.612415 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:20Z","lastTransitionTime":"2025-12-03T14:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.714648 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.714680 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.714693 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.714709 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.714720 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:20Z","lastTransitionTime":"2025-12-03T14:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.820251 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.820285 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.820294 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.820310 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.820322 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:20Z","lastTransitionTime":"2025-12-03T14:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.923066 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.923129 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.923150 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.923182 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:20 crc kubenswrapper[4751]: I1203 14:14:20.923206 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:20Z","lastTransitionTime":"2025-12-03T14:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.026572 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.026636 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.026650 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.026672 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.026687 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:21Z","lastTransitionTime":"2025-12-03T14:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.130406 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.130485 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.130504 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.130528 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.130546 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:21Z","lastTransitionTime":"2025-12-03T14:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.234448 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.234491 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.234502 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.234529 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.234545 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:21Z","lastTransitionTime":"2025-12-03T14:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.313076 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp"
Dec 03 14:14:21 crc kubenswrapper[4751]: E1203 14:14:21.313599 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.313658 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 14:14:21 crc kubenswrapper[4751]: E1203 14:14:21.313905 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.314043 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 14:14:21 crc kubenswrapper[4751]: E1203 14:14:21.314161 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.314178 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 14:14:21 crc kubenswrapper[4751]: E1203 14:14:21.314361 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.337964 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.338019 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.338036 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.338058 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.338073 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:21Z","lastTransitionTime":"2025-12-03T14:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.441045 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.441126 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.441138 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.441154 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.441164 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:21Z","lastTransitionTime":"2025-12-03T14:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.545749 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.545838 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.545861 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.545888 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.545906 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:21Z","lastTransitionTime":"2025-12-03T14:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.647906 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.647936 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.647945 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.647960 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.647969 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:21Z","lastTransitionTime":"2025-12-03T14:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.751467 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.751521 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.751531 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.751549 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.751559 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:21Z","lastTransitionTime":"2025-12-03T14:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.854101 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.854144 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.854156 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.854174 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.854185 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:21Z","lastTransitionTime":"2025-12-03T14:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.956748 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.956795 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.956805 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.956824 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:14:21 crc kubenswrapper[4751]: I1203 14:14:21.956836 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:21Z","lastTransitionTime":"2025-12-03T14:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.000867 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45fb8744-4cb9-4138-8310-c02f7c6a2941-metrics-certs\") pod \"network-metrics-daemon-zgqdp\" (UID: \"45fb8744-4cb9-4138-8310-c02f7c6a2941\") " pod="openshift-multus/network-metrics-daemon-zgqdp"
Dec 03 14:14:22 crc kubenswrapper[4751]: E1203 14:14:22.001077 4751 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 03 14:14:22 crc kubenswrapper[4751]: E1203 14:14:22.001215 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45fb8744-4cb9-4138-8310-c02f7c6a2941-metrics-certs podName:45fb8744-4cb9-4138-8310-c02f7c6a2941 nodeName:}" failed. No retries permitted until 2025-12-03 14:14:54.001179918 +0000 UTC m=+100.989535195 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/45fb8744-4cb9-4138-8310-c02f7c6a2941-metrics-certs") pod "network-metrics-daemon-zgqdp" (UID: "45fb8744-4cb9-4138-8310-c02f7c6a2941") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.059421 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.059463 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.059474 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.059492 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.059505 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:22Z","lastTransitionTime":"2025-12-03T14:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.162241 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.162283 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.162301 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.162358 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.162375 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:22Z","lastTransitionTime":"2025-12-03T14:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.267524 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.267595 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.267604 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.267620 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.267630 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:22Z","lastTransitionTime":"2025-12-03T14:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.370570 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.370606 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.370615 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.370628 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.370638 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:22Z","lastTransitionTime":"2025-12-03T14:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.473413 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.473450 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.473463 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.473480 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.473492 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:22Z","lastTransitionTime":"2025-12-03T14:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.576738 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.576782 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.576791 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.576808 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.576816 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:22Z","lastTransitionTime":"2025-12-03T14:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.678901 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.678947 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.678960 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.678977 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.678988 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:22Z","lastTransitionTime":"2025-12-03T14:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.781308 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.781432 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.781456 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.781479 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.781496 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:22Z","lastTransitionTime":"2025-12-03T14:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.884475 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.884520 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.884529 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.884546 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.884557 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:22Z","lastTransitionTime":"2025-12-03T14:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.987054 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.987093 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.987102 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.987115 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:14:22 crc kubenswrapper[4751]: I1203 14:14:22.987126 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:22Z","lastTransitionTime":"2025-12-03T14:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.090383 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.090536 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.090552 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.090571 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.090583 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:23Z","lastTransitionTime":"2025-12-03T14:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.192719 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.192757 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.192764 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.192778 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.192787 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:23Z","lastTransitionTime":"2025-12-03T14:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.295705 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.295785 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.295802 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.295826 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.295844 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:23Z","lastTransitionTime":"2025-12-03T14:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.313111 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp"
Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.313116 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.313170 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.313207 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 14:14:23 crc kubenswrapper[4751]: E1203 14:14:23.313392 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941"
Dec 03 14:14:23 crc kubenswrapper[4751]: E1203 14:14:23.313823 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 03 14:14:23 crc kubenswrapper[4751]: E1203 14:14:23.313957 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 03 14:14:23 crc kubenswrapper[4751]: E1203 14:14:23.314045 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.323358 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z"
Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.337736 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z"
Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.349044 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea448b5e-d751-42cd-ba62-94bfe104c8dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86c32c58671beb01e05f0fc2e0bb0dd3852c731d547e7e00f68d996e4b1c82b4\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f517a25f354bfdf8d4ca6a11b1fc689e25630c8f82d80f869ee30be87335091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c995df1ca20af2a26b10cc901a61d16a1530c1fd574fb463cc26eb8907ce0d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd289137933c564b4650302ce3a8be46d6d08aa5fedbd5c4d6b54e6b2cef712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd289137933c564b4650302ce3a8be46d6d08aa5fedbd5c4d6b54e6b2cef712\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.360884 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.388546 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.397612 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.397839 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.397962 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.398094 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.398225 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:23Z","lastTransitionTime":"2025-12-03T14:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.406885 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.422341 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.439388 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0746b9b3267a0ad85440e9e392d48037c8522981bb736acffabca9661e0a28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916dc
5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.450160 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://243f7a2a090c20b5f43fd09961193c68689cbdf90a21f23683146f39a61a6fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4cfa428c0e9f95347500b09c84e541ed66924885a56a15ed20612fbf2117bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dp4lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.460202 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zgqdp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fb8744-4cb9-4138-8310-c02f7c6a2941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zgqdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:23 crc 
kubenswrapper[4751]: I1203 14:14:23.471028 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.486297 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.499191 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0
a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.501090 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.501179 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.501194 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:23 crc 
kubenswrapper[4751]: I1203 14:14:23.501227 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.501238 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:23Z","lastTransitionTime":"2025-12-03T14:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.527478 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8336533a450796683f0bdfccc947c18c05e100d13e7e5a076e4dde52ddbb524a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b693d6eeadc9bf4af3f115fd8646138f7e25675fb84ae92b97a456e3b5a14cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ff7aaf5e6a948fcb50e4c896807b977d2c651eef92011b86b5df32f19fa3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://104d3bfa9a37dce20a737bf9d416395e432196051e96def7ae49d341d2912e3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77fd3bfbc463e6afbd45d112bf46ad8a39f25cd39ebbc0e17dfcd160ca29d7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26dadf0f116a114d6755804ab91736019d0d3e047d14515f513eda2148d706d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea030d283d61459315d72845f04d34c1ad21974731a82902ce65b328c0f7fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea030d283d61459315d72845f04d34c1ad21974731a82902ce65b328c0f7fa8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:14:03Z\\\",\\\"message\\\":\\\"od openshift-machine-config-operator/machine-config-daemon-djf67\\\\nI1203 14:14:03.114803 6433 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-djf67\\\\nI1203 14:14:03.114811 6433 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-djf67 in node 
crc\\\\nI1203 14:14:03.114812 6433 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-t8q27\\\\nI1203 14:14:03.114830 6433 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-t8q27\\\\nI1203 14:14:03.114824 6433 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-zgqdp] creating logical port openshift-multus_network-metrics-daemon-zgqdp for pod on switch crc\\\\nF1203 14:14:03.114833 6433 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:14:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kwchh_openshift-ovn-kubernetes(a5526cae-f2a4-4094-a08a-fbf69cb11593)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac293ad67560998b43406e7acd93896d81398996e43a2b7e5df6a4d5794eeb29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd
6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.548925 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96
c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.568697 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.586036 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.604404 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.604456 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.604465 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.604481 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.604492 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:23Z","lastTransitionTime":"2025-12-03T14:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.604909 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.688209 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-98mjq_6a216adb-632d-4134-8c61-61fe6b8c5f71/kube-multus/0.log" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.688276 4751 generic.go:334] "Generic (PLEG): container finished" podID="6a216adb-632d-4134-8c61-61fe6b8c5f71" containerID="32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815" exitCode=1 Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.688312 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-98mjq" event={"ID":"6a216adb-632d-4134-8c61-61fe6b8c5f71","Type":"ContainerDied","Data":"32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815"} Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.688764 4751 scope.go:117] "RemoveContainer" containerID="32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.702811 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.706233 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.706283 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.706293 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.706314 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.706344 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:23Z","lastTransitionTime":"2025-12-03T14:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.720374 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0746b9b3267a0ad85440e9e392d48037c8522981bb736acffabca9661e0a28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:39Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.734305 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://243f7a2a090c20b5f43fd09961193c68689cbdf90a21f23683146f39a61a6fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4cfa428c0e9f95347500b09c84e541ed669
24885a56a15ed20612fbf2117bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dp4lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.746752 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zgqdp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fb8744-4cb9-4138-8310-c02f7c6a2941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zgqdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:23 crc 
kubenswrapper[4751]: I1203 14:14:23.760866 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.773388 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.790192 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96
c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.801863 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.808449 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.808490 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.808501 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 
14:14:23.808520 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.808532 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:23Z","lastTransitionTime":"2025-12-03T14:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.814221 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.824433 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.836317 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.853985 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8336533a450796683f0bdfccc947c18c05e100d13e7e5a076e4dde52ddbb524a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b693d6eeadc9bf4af3f115fd8646138f7e25675fb84ae92b97a456e3b5a14cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ff7aaf5e6a948fcb50e4c896807b977d2c651eef92011b86b5df32f19fa3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://104d3bfa9a37dce20a737bf9d416395e432196051e96def7ae49d341d2912e3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77fd3bfbc463e6afbd45d112bf46ad8a39f25cd39ebbc0e17dfcd160ca29d7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26dadf0f116a114d6755804ab91736019d0d3e047d14515f513eda2148d706d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea030d283d61459315d72845f04d34c1ad21974731a82902ce65b328c0f7fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea030d283d61459315d72845f04d34c1ad21974731a82902ce65b328c0f7fa8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:14:03Z\\\",\\\"message\\\":\\\"od openshift-machine-config-operator/machine-config-daemon-djf67\\\\nI1203 14:14:03.114803 6433 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-djf67\\\\nI1203 14:14:03.114811 6433 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-djf67 in node 
crc\\\\nI1203 14:14:03.114812 6433 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-t8q27\\\\nI1203 14:14:03.114830 6433 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-t8q27\\\\nI1203 14:14:03.114824 6433 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-zgqdp] creating logical port openshift-multus_network-metrics-daemon-zgqdp for pod on switch crc\\\\nF1203 14:14:03.114833 6433 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:14:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kwchh_openshift-ovn-kubernetes(a5526cae-f2a4-4094-a08a-fbf69cb11593)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac293ad67560998b43406e7acd93896d81398996e43a2b7e5df6a4d5794eeb29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd
6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.865775 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea448b5e-d751-42cd-ba62-94bfe104c8dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86c32c58671beb01e05f0fc2e0bb0dd3852c731d547e7e00f68d996e4b1c82b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f517a25f354bfdf8d4ca6a11b1fc689e25630c8f82d80f869ee30be87335091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c995df1ca20af2a26b10cc901a61d16a1530c1fd574fb463cc26eb8907ce0d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd289137933c564b4650302ce3a8be46d6d08aa5fedbd5c4d6b54e6b2cef712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ddd289137933c564b4650302ce3a8be46d6d08aa5fedbd5c4d6b54e6b2cef712\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.888013 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.911101 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.911134 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.911143 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.911156 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.911166 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:23Z","lastTransitionTime":"2025-12-03T14:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.921505 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be
30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:
15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.934053 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.946843 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:23 crc kubenswrapper[4751]: I1203 14:14:23.961816 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:14:22Z\\\",\\\"message\\\":\\\"2025-12-03T14:13:37+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dd7656b5-d26d-4b39-ae35-3c093468466e\\\\n2025-12-03T14:13:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dd7656b5-d26d-4b39-ae35-3c093468466e to /host/opt/cni/bin/\\\\n2025-12-03T14:13:37Z [verbose] multus-daemon started\\\\n2025-12-03T14:13:37Z [verbose] Readiness Indicator file check\\\\n2025-12-03T14:14:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:23Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.013388 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.013632 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.013706 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.013769 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.013823 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:24Z","lastTransitionTime":"2025-12-03T14:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.116934 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.116972 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.116981 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.116996 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.117006 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:24Z","lastTransitionTime":"2025-12-03T14:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.219024 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.219055 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.219063 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.219078 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.219086 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:24Z","lastTransitionTime":"2025-12-03T14:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.320779 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.320806 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.320813 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.320825 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.320834 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:24Z","lastTransitionTime":"2025-12-03T14:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.422960 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.422998 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.423009 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.423024 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.423035 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:24Z","lastTransitionTime":"2025-12-03T14:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.525425 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.525466 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.525477 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.525494 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.525504 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:24Z","lastTransitionTime":"2025-12-03T14:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.627887 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.627931 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.627941 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.627977 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.627989 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:24Z","lastTransitionTime":"2025-12-03T14:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.693563 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-98mjq_6a216adb-632d-4134-8c61-61fe6b8c5f71/kube-multus/0.log" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.693640 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-98mjq" event={"ID":"6a216adb-632d-4134-8c61-61fe6b8c5f71","Type":"ContainerStarted","Data":"f26c96913955bb014b1ac71389acea8daeb976964dea908c649f382d5e688801"} Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.707392 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:24Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.719711 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0746b9b3267a0ad85440e9e392d48037c8522981bb736acffabca9661e0a28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916dc
5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:24Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.730178 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.730230 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.730244 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.730262 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.730275 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:24Z","lastTransitionTime":"2025-12-03T14:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.732571 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://243f7a2a090c20b5f43fd09961193c68689cbdf90a21f23683146f39a61a6fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4cfa428c0e9f95347500b09c84e541ed66924885a56a15ed20612fbf2117bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dp4lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:24Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.745117 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zgqdp" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fb8744-4cb9-4138-8310-c02f7c6a2941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zgqdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:24Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:24 crc 
kubenswrapper[4751]: I1203 14:14:24.760638 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:24Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.773792 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:24Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.786144 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0
a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:24Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.805930 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8336533a450796683f0bdfccc947c18c05e100d13e7e5a076e4dde52ddbb524a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b693d6eeadc9bf4af3f115fd8646138f7e25675fb84ae92b97a456e3b5a14cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ff7aaf5e6a948fcb50e4c896807b977d2c651eef92011b86b5df32f19fa3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://104d3bfa9a37dce20a737bf9d416395e432196051e96def7ae49d341d2912e3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77fd3bfbc463e6afbd45d112bf46ad8a39f25cd39ebbc0e17dfcd160ca29d7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26dadf0f116a114d6755804ab91736019d0d3e047d14515f513eda2148d706d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea030d283d61459315d72845f04d34c1ad21974731a82902ce65b328c0f7fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea030d283d61459315d72845f04d34c1ad21974731a82902ce65b328c0f7fa8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:14:03Z\\\",\\\"message\\\":\\\"od openshift-machine-config-operator/machine-config-daemon-djf67\\\\nI1203 14:14:03.114803 6433 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-djf67\\\\nI1203 14:14:03.114811 6433 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-djf67 in node 
crc\\\\nI1203 14:14:03.114812 6433 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-t8q27\\\\nI1203 14:14:03.114830 6433 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-t8q27\\\\nI1203 14:14:03.114824 6433 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-zgqdp] creating logical port openshift-multus_network-metrics-daemon-zgqdp for pod on switch crc\\\\nF1203 14:14:03.114833 6433 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:14:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kwchh_openshift-ovn-kubernetes(a5526cae-f2a4-4094-a08a-fbf69cb11593)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac293ad67560998b43406e7acd93896d81398996e43a2b7e5df6a4d5794eeb29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd
6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:24Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.819397 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96
c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:24Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.831441 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:24Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.832429 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.832453 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.832462 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 
14:14:24.832477 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.832487 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:24Z","lastTransitionTime":"2025-12-03T14:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.842613 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:24Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.853561 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:14:24Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.863317 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:24Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.876230 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f26c96913955bb014b1ac71389acea8daeb976964dea908c649f382d5e68880
1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:14:22Z\\\",\\\"message\\\":\\\"2025-12-03T14:13:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dd7656b5-d26d-4b39-ae35-3c093468466e\\\\n2025-12-03T14:13:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dd7656b5-d26d-4b39-ae35-3c093468466e to /host/opt/cni/bin/\\\\n2025-12-03T14:13:37Z [verbose] multus-daemon started\\\\n2025-12-03T14:13:37Z [verbose] Readiness Indicator file check\\\\n2025-12-03T14:14:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:24Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.887045 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea448b5e-d751-42cd-ba62-94bfe104c8dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86c32c58671beb01e05f0fc2e0bb0dd3852c731d547e7e00f68d996e4b1c82b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc3582
5771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f517a25f354bfdf8d4ca6a11b1fc689e25630c8f82d80f869ee30be87335091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c995df1ca20af2a26b10cc901a61d16a1530c1fd574fb463cc26eb8907ce0d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"r
esource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd289137933c564b4650302ce3a8be46d6d08aa5fedbd5c4d6b54e6b2cef712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd289137933c564b4650302ce3a8be46d6d08aa5fedbd5c4d6b54e6b2cef712\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:24Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.900252 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:24Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.926811 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:24Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.935557 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.935612 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.935632 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.935654 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.935673 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:24Z","lastTransitionTime":"2025-12-03T14:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:24 crc kubenswrapper[4751]: I1203 14:14:24.940948 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:24Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.038227 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.038275 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.038291 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.038308 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.038337 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:25Z","lastTransitionTime":"2025-12-03T14:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.140576 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.140631 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.140643 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.140660 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.140674 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:25Z","lastTransitionTime":"2025-12-03T14:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.242975 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.243042 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.243065 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.243096 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.243117 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:25Z","lastTransitionTime":"2025-12-03T14:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.313685 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.313734 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.313738 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:14:25 crc kubenswrapper[4751]: E1203 14:14:25.313892 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.313966 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:14:25 crc kubenswrapper[4751]: E1203 14:14:25.314136 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:14:25 crc kubenswrapper[4751]: E1203 14:14:25.314196 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:14:25 crc kubenswrapper[4751]: E1203 14:14:25.314653 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.346058 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.346093 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.346122 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.346140 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.346153 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:25Z","lastTransitionTime":"2025-12-03T14:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.448932 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.448980 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.448992 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.449010 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.449022 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:25Z","lastTransitionTime":"2025-12-03T14:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.551478 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.551542 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.551561 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.551585 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.551604 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:25Z","lastTransitionTime":"2025-12-03T14:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.591871 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.591937 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.591950 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.591966 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.591978 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:25Z","lastTransitionTime":"2025-12-03T14:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:25 crc kubenswrapper[4751]: E1203 14:14:25.607632 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:25Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.611666 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.611719 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.611732 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.611753 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.611765 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:25Z","lastTransitionTime":"2025-12-03T14:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:25 crc kubenswrapper[4751]: E1203 14:14:25.625542 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:25Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.629648 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.629689 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.629701 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.629718 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.629732 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:25Z","lastTransitionTime":"2025-12-03T14:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:25 crc kubenswrapper[4751]: E1203 14:14:25.644319 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:25Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.653360 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.653576 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.653641 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.653701 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.653761 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:25Z","lastTransitionTime":"2025-12-03T14:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:25 crc kubenswrapper[4751]: E1203 14:14:25.671107 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:25Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.674903 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.674953 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.674971 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.674994 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.675012 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:25Z","lastTransitionTime":"2025-12-03T14:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:25 crc kubenswrapper[4751]: E1203 14:14:25.692731 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:25Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:25 crc kubenswrapper[4751]: E1203 14:14:25.693160 4751 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.695071 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.695098 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.695107 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.695121 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.695132 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:25Z","lastTransitionTime":"2025-12-03T14:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.797424 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.797464 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.797474 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.797493 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.797505 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:25Z","lastTransitionTime":"2025-12-03T14:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.899467 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.899511 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.899525 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.899541 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:25 crc kubenswrapper[4751]: I1203 14:14:25.899554 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:25Z","lastTransitionTime":"2025-12-03T14:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.001948 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.002022 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.002044 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.002071 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.002092 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:26Z","lastTransitionTime":"2025-12-03T14:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.105272 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.105349 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.105367 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.105393 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.105411 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:26Z","lastTransitionTime":"2025-12-03T14:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.208451 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.208520 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.208537 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.208560 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.208577 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:26Z","lastTransitionTime":"2025-12-03T14:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.310960 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.310995 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.311006 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.311022 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.311034 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:26Z","lastTransitionTime":"2025-12-03T14:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.413259 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.413352 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.413372 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.413396 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.413414 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:26Z","lastTransitionTime":"2025-12-03T14:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.516573 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.516649 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.516701 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.516723 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.516740 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:26Z","lastTransitionTime":"2025-12-03T14:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.619556 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.619588 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.619596 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.619608 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.619617 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:26Z","lastTransitionTime":"2025-12-03T14:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.721571 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.721623 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.721636 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.721654 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.721666 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:26Z","lastTransitionTime":"2025-12-03T14:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.824180 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.824241 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.824298 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.824370 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.824390 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:26Z","lastTransitionTime":"2025-12-03T14:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.927788 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.927862 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.927895 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.927925 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:26 crc kubenswrapper[4751]: I1203 14:14:26.927948 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:26Z","lastTransitionTime":"2025-12-03T14:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.030761 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.030898 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.030917 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.030944 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.030965 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:27Z","lastTransitionTime":"2025-12-03T14:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.134303 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.134362 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.134373 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.134389 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.134400 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:27Z","lastTransitionTime":"2025-12-03T14:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.238605 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.238649 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.238662 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.238683 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.238696 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:27Z","lastTransitionTime":"2025-12-03T14:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.313237 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.313242 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.313321 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:14:27 crc kubenswrapper[4751]: E1203 14:14:27.313458 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.313586 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:14:27 crc kubenswrapper[4751]: E1203 14:14:27.313626 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:14:27 crc kubenswrapper[4751]: E1203 14:14:27.313781 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:14:27 crc kubenswrapper[4751]: E1203 14:14:27.313813 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.341565 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.341595 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.341603 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.341617 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.341628 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:27Z","lastTransitionTime":"2025-12-03T14:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.444598 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.444658 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.444681 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.444709 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.444731 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:27Z","lastTransitionTime":"2025-12-03T14:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.547004 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.547071 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.547089 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.547114 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.547135 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:27Z","lastTransitionTime":"2025-12-03T14:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.650246 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.650285 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.650294 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.650307 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.650316 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:27Z","lastTransitionTime":"2025-12-03T14:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.753213 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.753243 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.753253 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.753271 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.753283 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:27Z","lastTransitionTime":"2025-12-03T14:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.856547 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.856587 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.856600 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.856617 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.856629 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:27Z","lastTransitionTime":"2025-12-03T14:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.959625 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.959728 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.959757 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.959795 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:27 crc kubenswrapper[4751]: I1203 14:14:27.959823 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:27Z","lastTransitionTime":"2025-12-03T14:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.062504 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.062586 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.062602 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.062623 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.062637 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:28Z","lastTransitionTime":"2025-12-03T14:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.164735 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.164776 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.164787 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.164803 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.164816 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:28Z","lastTransitionTime":"2025-12-03T14:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.267114 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.267154 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.267183 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.267200 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.267209 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:28Z","lastTransitionTime":"2025-12-03T14:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.314572 4751 scope.go:117] "RemoveContainer" containerID="0ea030d283d61459315d72845f04d34c1ad21974731a82902ce65b328c0f7fa8" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.371174 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.371229 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.371246 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.371270 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.371290 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:28Z","lastTransitionTime":"2025-12-03T14:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.474255 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.474577 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.474668 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.474750 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.474823 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:28Z","lastTransitionTime":"2025-12-03T14:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.576985 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.577022 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.577031 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.577046 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.577055 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:28Z","lastTransitionTime":"2025-12-03T14:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.679738 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.679776 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.679789 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.679806 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.679817 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:28Z","lastTransitionTime":"2025-12-03T14:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.707272 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwchh_a5526cae-f2a4-4094-a08a-fbf69cb11593/ovnkube-controller/2.log" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.709206 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" event={"ID":"a5526cae-f2a4-4094-a08a-fbf69cb11593","Type":"ContainerStarted","Data":"fa18a539c068f5e82a4f3159bd0857ba914bb6213cd1c585b6eebf22aeb7be3f"} Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.709924 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.728269 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:28Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.757615 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:28Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.769067 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:28Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.781546 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2025-12-03T14:14:28Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.781715 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.781752 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.781762 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.781778 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.781788 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:28Z","lastTransitionTime":"2025-12-03T14:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.800535 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8336533a450796683f0bdfccc947c18c05e100d13e7e5a076e4dde52ddbb524a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b693d6eeadc9bf4af3f115fd8646138f7e25675fb84ae92b97a456e3b5a14cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ff7aaf5e6a948fcb50e4c896807b977d2c651eef92011b86b5df32f19fa3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://104d3bfa9a37dce20a737bf9d416395e432196051e96def7ae49d341d2912e3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77fd3bfbc463e6afbd45d112bf46ad8a39f25cd39ebbc0e17dfcd160ca29d7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26dadf0f116a114d6755804ab91736019d0d3e047d14515f513eda2148d706d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa18a539c068f5e82a4f3159bd0857ba914bb6213cd1c585b6eebf22aeb7be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea030d283d61459315d72845f04d34c1ad21974731a82902ce65b328c0f7fa8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:14:03Z\\\",\\\"message\\\":\\\"od openshift-machine-config-operator/machine-config-daemon-djf67\\\\nI1203 14:14:03.114803 6433 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-djf67\\\\nI1203 14:14:03.114811 6433 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-djf67 in node 
crc\\\\nI1203 14:14:03.114812 6433 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-t8q27\\\\nI1203 14:14:03.114830 6433 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-t8q27\\\\nI1203 14:14:03.114824 6433 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-zgqdp] creating logical port openshift-multus_network-metrics-daemon-zgqdp for pod on switch crc\\\\nF1203 14:14:03.114833 6433 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal 
error\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:14:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name
\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac293ad67560998b43406e7acd93896d81398996e43a2b7e5df6a4d5794eeb29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:28Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.813987 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96
c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:28Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.823502 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:28Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.841372 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:28Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.851117 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:28Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.860582 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:28Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.873387 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f26c96913955bb014b1ac71389acea8daeb976964dea908c649f382d5e688801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:14:22Z\\\",\\\"message\\\":\\\"2025-12-03T14:13:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dd7656b5-d26d-4b39-ae35-3c093468466e\\\\n2025-12-03T14:13:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dd7656b5-d26d-4b39-ae35-3c093468466e to /host/opt/cni/bin/\\\\n2025-12-03T14:13:37Z [verbose] multus-daemon started\\\\n2025-12-03T14:13:37Z [verbose] Readiness Indicator file check\\\\n2025-12-03T14:14:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:28Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.883708 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.883753 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.883765 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 
14:14:28.883783 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.883795 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:28Z","lastTransitionTime":"2025-12-03T14:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.884210 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea448b5e-d751-42cd-ba62-94bfe104c8dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86c32c58671beb01e05f0fc2e0bb0dd3852c731d547e7e00f68d996e4b1c82b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f517a25f354bfdf8d4ca6a11b1fc689e25630c8f82d80f869ee30be87335091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c995df1ca20af2a26b10cc901a61d16a1530c1fd574fb463cc26eb8907ce0d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd289137933c564b4650302ce3a8be46d6d08aa5fedbd5c4d6b54e6b2cef712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd289137933c564b4650302ce3a8be46d6d08aa5fedbd5c4d6b54e6b2cef712\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:28Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.897095 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0746b9b3267a0ad85440e9e392d48037c8522981bb736acffabca9661e0a28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916dc
5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:28Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.907195 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://243f7a2a090c20b5f43fd09961193c68689cbdf90a21f23683146f39a61a6fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4cfa428c0e9f95347500b09c84e541ed66924885a56a15ed20612fbf2117bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dp4lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-03T14:14:28Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.916798 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zgqdp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fb8744-4cb9-4138-8310-c02f7c6a2941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zgqdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:28Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:28 crc 
kubenswrapper[4751]: I1203 14:14:28.926729 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:28Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.938344 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:28Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.950197 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:28Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.986468 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.986510 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.986520 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.986537 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:28 crc kubenswrapper[4751]: I1203 14:14:28.986549 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:28Z","lastTransitionTime":"2025-12-03T14:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.089496 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.089565 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.089582 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.089605 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.089623 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:29Z","lastTransitionTime":"2025-12-03T14:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.192658 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.192731 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.192757 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.192786 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.192807 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:29Z","lastTransitionTime":"2025-12-03T14:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.295291 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.295398 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.295410 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.295423 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.295432 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:29Z","lastTransitionTime":"2025-12-03T14:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.313766 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.313848 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:14:29 crc kubenswrapper[4751]: E1203 14:14:29.313902 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.313971 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.313855 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:14:29 crc kubenswrapper[4751]: E1203 14:14:29.314051 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:14:29 crc kubenswrapper[4751]: E1203 14:14:29.314201 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:14:29 crc kubenswrapper[4751]: E1203 14:14:29.314353 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.398543 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.398601 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.398620 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.398644 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.398664 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:29Z","lastTransitionTime":"2025-12-03T14:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.501613 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.501662 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.501680 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.501704 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.501723 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:29Z","lastTransitionTime":"2025-12-03T14:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.604377 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.604446 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.604501 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.604529 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.604546 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:29Z","lastTransitionTime":"2025-12-03T14:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.706959 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.707031 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.707048 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.707072 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.707089 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:29Z","lastTransitionTime":"2025-12-03T14:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.714988 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwchh_a5526cae-f2a4-4094-a08a-fbf69cb11593/ovnkube-controller/3.log" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.715886 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwchh_a5526cae-f2a4-4094-a08a-fbf69cb11593/ovnkube-controller/2.log" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.719410 4751 generic.go:334] "Generic (PLEG): container finished" podID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerID="fa18a539c068f5e82a4f3159bd0857ba914bb6213cd1c585b6eebf22aeb7be3f" exitCode=1 Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.719452 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" event={"ID":"a5526cae-f2a4-4094-a08a-fbf69cb11593","Type":"ContainerDied","Data":"fa18a539c068f5e82a4f3159bd0857ba914bb6213cd1c585b6eebf22aeb7be3f"} Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.719487 4751 scope.go:117] "RemoveContainer" containerID="0ea030d283d61459315d72845f04d34c1ad21974731a82902ce65b328c0f7fa8" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.720553 4751 scope.go:117] "RemoveContainer" containerID="fa18a539c068f5e82a4f3159bd0857ba914bb6213cd1c585b6eebf22aeb7be3f" Dec 03 14:14:29 crc kubenswrapper[4751]: E1203 14:14:29.720824 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kwchh_openshift-ovn-kubernetes(a5526cae-f2a4-4094-a08a-fbf69cb11593)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.737091 4751 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:29Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.761100 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0746b9b3267a0ad85440e9e392d48037c8522981bb736acffabca9661e0a28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916dc
5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:29Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.773876 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://243f7a2a090c20b5f43fd09961193c68689cbdf90a21f23683146f39a61a6fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4cfa428c0e9f95347500b09c84e541ed66924885a56a15ed20612fbf2117bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dp4lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-03T14:14:29Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.787471 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zgqdp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fb8744-4cb9-4138-8310-c02f7c6a2941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zgqdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:29Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:29 crc 
kubenswrapper[4751]: I1203 14:14:29.803300 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:29Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.809833 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.809878 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.809893 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.809913 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.809929 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:29Z","lastTransitionTime":"2025-12-03T14:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.818950 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:29Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.841037 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96
c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:29Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.858952 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:29Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.876120 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:29Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.889371 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:29Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.900851 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2025-12-03T14:14:29Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.912992 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.913063 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.913092 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.913127 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.913153 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:29Z","lastTransitionTime":"2025-12-03T14:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.922139 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8336533a450796683f0bdfccc947c18c05e100d13e7e5a076e4dde52ddbb524a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b693d6eeadc9bf4af3f115fd8646138f7e25675fb84ae92b97a456e3b5a14cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ff7aaf5e6a948fcb50e4c896807b977d2c651eef92011b86b5df32f19fa3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://104d3bfa9a37dce20a737bf9d416395e432196051e96def7ae49d341d2912e3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77fd3bfbc463e6afbd45d112bf46ad8a39f25cd39ebbc0e17dfcd160ca29d7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26dadf0f116a114d6755804ab91736019d0d3e047d14515f513eda2148d706d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa18a539c068f5e82a4f3159bd0857ba914bb6213cd1c585b6eebf22aeb7be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea030d283d61459315d72845f04d34c1ad21974731a82902ce65b328c0f7fa8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:14:03Z\\\",\\\"message\\\":\\\"od openshift-machine-config-operator/machine-config-daemon-djf67\\\\nI1203 14:14:03.114803 6433 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-djf67\\\\nI1203 14:14:03.114811 6433 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-djf67 in node 
crc\\\\nI1203 14:14:03.114812 6433 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-t8q27\\\\nI1203 14:14:03.114830 6433 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-t8q27\\\\nI1203 14:14:03.114824 6433 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-zgqdp] creating logical port openshift-multus_network-metrics-daemon-zgqdp for pod on switch crc\\\\nF1203 14:14:03.114833 6433 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:14:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa18a539c068f5e82a4f3159bd0857ba914bb6213cd1c585b6eebf22aeb7be3f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:14:29Z\\\",\\\"message\\\":\\\"to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:29Z is after 2025-08-24T17:21:41Z]\\\\nI1203 14:14:29.102796 6794 services_controller.go:434] Service openshift-kube-scheduler-operator/metrics retrieved from lister for 
network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-kube-scheduler-operator 760c7338-f39e-4136-9d29-d6fccbd607c1 4364 0 2025-02-23 05:12:18 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:openshift-kube-scheduler-operator] map[exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:kube-scheduler-operator-serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00756b94b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",
\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac293ad67560998b43406e7acd93896d81398996e43a2b7e5df6a4d5794eeb29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:29Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.934490 4751 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea448b5e-d751-42cd-ba62-94bfe104c8dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86c32c58671beb01e05f0fc2e0bb0dd3852c731d547e7e00f68d996e4b1c82b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f517a25f354bfdf8d4ca6a11b1fc689e25630c8f82d80f869ee30be87335091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd
8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c995df1ca20af2a26b10cc901a61d16a1530c1fd574fb463cc26eb8907ce0d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd289137933c564b4650302ce3a8be46d6d08aa5fedbd5c4d6b54e6b2cef712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd289137933c564b4650302ce3a8be46d6d08aa5fedbd5c4d6b54e6b2cef712\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:29Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.947578 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:29Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.968909 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:29Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.981766 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:29Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:29 crc kubenswrapper[4751]: I1203 14:14:29.995508 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:29Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.010876 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f26c96913955bb014b1ac71389acea8daeb976964dea908c649f382d5e688801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:14:22Z\\\",\\\"message\\\":\\\"2025-12-03T14:13:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dd7656b5-d26d-4b39-ae35-3c093468466e\\\\n2025-12-03T14:13:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dd7656b5-d26d-4b39-ae35-3c093468466e to /host/opt/cni/bin/\\\\n2025-12-03T14:13:37Z [verbose] multus-daemon started\\\\n2025-12-03T14:13:37Z [verbose] Readiness Indicator file check\\\\n2025-12-03T14:14:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:30Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.015728 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.015774 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.015785 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 
14:14:30.015803 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.015815 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:30Z","lastTransitionTime":"2025-12-03T14:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.118909 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.118971 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.118993 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.119021 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.119040 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:30Z","lastTransitionTime":"2025-12-03T14:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.221435 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.221482 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.221492 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.221507 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.221517 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:30Z","lastTransitionTime":"2025-12-03T14:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.323918 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.323948 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.323957 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.323970 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.323979 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:30Z","lastTransitionTime":"2025-12-03T14:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.426609 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.426650 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.426662 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.426677 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.426691 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:30Z","lastTransitionTime":"2025-12-03T14:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.529779 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.529837 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.529854 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.529875 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.529888 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:30Z","lastTransitionTime":"2025-12-03T14:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.633249 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.633305 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.633360 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.633384 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.633402 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:30Z","lastTransitionTime":"2025-12-03T14:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.724453 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwchh_a5526cae-f2a4-4094-a08a-fbf69cb11593/ovnkube-controller/3.log" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.728774 4751 scope.go:117] "RemoveContainer" containerID="fa18a539c068f5e82a4f3159bd0857ba914bb6213cd1c585b6eebf22aeb7be3f" Dec 03 14:14:30 crc kubenswrapper[4751]: E1203 14:14:30.729010 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kwchh_openshift-ovn-kubernetes(a5526cae-f2a4-4094-a08a-fbf69cb11593)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.736501 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.736571 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.736631 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.736661 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.736685 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:30Z","lastTransitionTime":"2025-12-03T14:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.756034 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" 
for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:30Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.774404 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rb
ac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-03T14:14:30Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.804409 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8336533a450796683f0bdfccc947c18c05e100d13e7e5a076e4dde52ddbb524a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b693d6eeadc9bf4af3f115fd8646138f7e25675fb84ae92b97a456e3b5a14cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ff7aaf5e6a948fcb50e4c896807b977d2c651eef92011b86b5df32f19fa3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://104d3bfa9a37dce20a737bf9d416395e432196051e96def7ae49d341d2912e3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77fd3bfbc463e6afbd45d112bf46ad8a39f25cd39ebbc0e17dfcd160ca29d7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26dadf0f116a114d6755804ab91736019d0d3e047d14515f513eda2148d706d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa18a539c068f5e82a4f3159bd0857ba914bb6213cd1c585b6eebf22aeb7be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa18a539c068f5e82a4f3159bd0857ba914bb6213cd1c585b6eebf22aeb7be3f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:14:29Z\\\",\\\"message\\\":\\\"to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-12-03T14:14:29Z is after 2025-08-24T17:21:41Z]\\\\nI1203 14:14:29.102796 6794 services_controller.go:434] Service openshift-kube-scheduler-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-kube-scheduler-operator 760c7338-f39e-4136-9d29-d6fccbd607c1 4364 0 2025-02-23 05:12:18 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:openshift-kube-scheduler-operator] map[exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:kube-scheduler-operator-serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00756b94b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:14:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kwchh_openshift-ovn-kubernetes(a5526cae-f2a4-4094-a08a-fbf69cb11593)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac293ad67560998b43406e7acd93896d81398996e43a2b7e5df6a4d5794eeb29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd
6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:30Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.826445 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96
c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:30Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.840935 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.840988 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.841002 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.841023 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.841042 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:30Z","lastTransitionTime":"2025-12-03T14:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.847593 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:30Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.868764 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:30Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.877662 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:30Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.886210 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:30Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.899235 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f26c96913955bb014b1ac71389acea8daeb976964dea908c649f382d5e688801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:14:22Z\\\",\\\"message\\\":\\\"2025-12-03T14:13:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dd7656b5-d26d-4b39-ae35-3c093468466e\\\\n2025-12-03T14:13:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dd7656b5-d26d-4b39-ae35-3c093468466e to /host/opt/cni/bin/\\\\n2025-12-03T14:13:37Z [verbose] multus-daemon started\\\\n2025-12-03T14:13:37Z [verbose] Readiness Indicator file check\\\\n2025-12-03T14:14:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:30Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.911222 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea448b5e-d751-42cd-ba62-94bfe104c8dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86c32c58671beb01e05f0fc2e0bb0dd3852c731d547e7e00f68d996e4b1c82b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f517a25f354bfdf8d4ca6a11b1fc689e25630c8f82d80f869ee30be87335091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c995df1ca20af2a26b10cc901a61d16a1530c1fd574fb463cc26eb8907ce0d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd289137933c564b4650302ce3a8be46d6d08aa5fedbd5c4d6b54e6b2cef712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ddd289137933c564b4650302ce3a8be46d6d08aa5fedbd5c4d6b54e6b2cef712\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:30Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.927606 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:30Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.943142 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.943181 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.943192 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.943209 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.943220 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:30Z","lastTransitionTime":"2025-12-03T14:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.947863 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be
30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:
15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:30Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.960484 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zgqdp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fb8744-4cb9-4138-8310-c02f7c6a2941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zgqdp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:30Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.975176 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:30Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:30 crc kubenswrapper[4751]: I1203 14:14:30.993530 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0746b9b3267a0ad85440e9e392d48037c8522981bb736acffabca9661e0a28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916dc
5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:30Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.004529 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://243f7a2a090c20b5f43fd09961193c68689cbdf90a21f23683146f39a61a6fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4cfa428c0e9f95347500b09c84e541ed66924885a56a15ed20612fbf2117bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dp4lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-03T14:14:31Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.015613 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:31Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.028236 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:31Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.046271 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.046313 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.046359 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.046379 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.046392 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:31Z","lastTransitionTime":"2025-12-03T14:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.148971 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.149406 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.149581 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.149738 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.149879 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:31Z","lastTransitionTime":"2025-12-03T14:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.252923 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.253672 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.253823 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.254012 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.254144 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:31Z","lastTransitionTime":"2025-12-03T14:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.313734 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:14:31 crc kubenswrapper[4751]: E1203 14:14:31.313898 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.313946 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.313976 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.313734 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:14:31 crc kubenswrapper[4751]: E1203 14:14:31.314040 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:14:31 crc kubenswrapper[4751]: E1203 14:14:31.314139 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:14:31 crc kubenswrapper[4751]: E1203 14:14:31.314234 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.356605 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.356654 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.356700 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.356720 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.356733 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:31Z","lastTransitionTime":"2025-12-03T14:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.459300 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.459368 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.459384 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.459405 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.459420 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:31Z","lastTransitionTime":"2025-12-03T14:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.561956 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.561994 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.562004 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.562019 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.562029 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:31Z","lastTransitionTime":"2025-12-03T14:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.666123 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.666201 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.666228 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.666259 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.666284 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:31Z","lastTransitionTime":"2025-12-03T14:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.769918 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.769979 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.770002 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.770030 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.770052 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:31Z","lastTransitionTime":"2025-12-03T14:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.873656 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.873718 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.873742 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.873773 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.873796 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:31Z","lastTransitionTime":"2025-12-03T14:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.977420 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.977503 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.977529 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.977563 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:31 crc kubenswrapper[4751]: I1203 14:14:31.977589 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:31Z","lastTransitionTime":"2025-12-03T14:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.079971 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.080023 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.080040 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.080061 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.080077 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:32Z","lastTransitionTime":"2025-12-03T14:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.182734 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.182988 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.183111 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.183271 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.183432 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:32Z","lastTransitionTime":"2025-12-03T14:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.286065 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.286319 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.286448 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.286548 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.286633 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:32Z","lastTransitionTime":"2025-12-03T14:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.389152 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.389182 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.389191 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.389204 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.389216 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:32Z","lastTransitionTime":"2025-12-03T14:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.492608 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.492649 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.492662 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.492683 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.492697 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:32Z","lastTransitionTime":"2025-12-03T14:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.595162 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.595208 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.595232 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.595246 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.595260 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:32Z","lastTransitionTime":"2025-12-03T14:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.698371 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.698420 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.698433 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.698450 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.698463 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:32Z","lastTransitionTime":"2025-12-03T14:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.801563 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.801654 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.801679 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.801710 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.801734 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:32Z","lastTransitionTime":"2025-12-03T14:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.905382 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.905441 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.905459 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.905487 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:32 crc kubenswrapper[4751]: I1203 14:14:32.905508 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:32Z","lastTransitionTime":"2025-12-03T14:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.008280 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.008378 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.008402 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.008436 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.008461 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:33Z","lastTransitionTime":"2025-12-03T14:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.111662 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.111742 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.111766 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.111795 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.111818 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:33Z","lastTransitionTime":"2025-12-03T14:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.215052 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.215090 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.215099 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.215115 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.215125 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:33Z","lastTransitionTime":"2025-12-03T14:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.313252 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.313310 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.313431 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:14:33 crc kubenswrapper[4751]: E1203 14:14:33.313567 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.313654 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:14:33 crc kubenswrapper[4751]: E1203 14:14:33.313855 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:14:33 crc kubenswrapper[4751]: E1203 14:14:33.313977 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:14:33 crc kubenswrapper[4751]: E1203 14:14:33.314224 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.319864 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.319933 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.319953 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.319976 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.319993 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:33Z","lastTransitionTime":"2025-12-03T14:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.337000 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.358019 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.375064 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.396456 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.422319 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 
14:14:33.422454 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.422483 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.422516 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.422540 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:33Z","lastTransitionTime":"2025-12-03T14:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.427764 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8336533a450796683f0bdfccc947c18c05e100d13e7e5a076e4dde52ddbb524a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b693d6eeadc9bf4af3f115fd8646138f7e25675fb84ae92b97a456e3b5a14cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ff7aaf5e6a948fcb50e4c896807b977d2c651eef92011b86b5df32f19fa3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://104d3bfa9a37dce20a737bf9d416395e432196051e96def7ae49d341d2912e3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77fd3bfbc463e6afbd45d112bf46ad8a39f25cd39ebbc0e17dfcd160ca29d7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26dadf0f116a114d6755804ab91736019d0d3e047d14515f513eda2148d706d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa18a539c068f5e82a4f3159bd0857ba914bb6213cd1c585b6eebf22aeb7be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa18a539c068f5e82a4f3159bd0857ba914bb6213cd1c585b6eebf22aeb7be3f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:14:29Z\\\",\\\"message\\\":\\\"to start default node network controller: failed to set node crc annotations: Internal 
error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:29Z is after 2025-08-24T17:21:41Z]\\\\nI1203 14:14:29.102796 6794 services_controller.go:434] Service openshift-kube-scheduler-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-kube-scheduler-operator 760c7338-f39e-4136-9d29-d6fccbd607c1 4364 0 2025-02-23 05:12:18 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:openshift-kube-scheduler-operator] map[exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:kube-scheduler-operator-serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00756b94b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:14:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kwchh_openshift-ovn-kubernetes(a5526cae-f2a4-4094-a08a-fbf69cb11593)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac293ad67560998b43406e7acd93896d81398996e43a2b7e5df6a4d5794eeb29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd
6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.447067 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96
c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.468694 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.493839 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.509173 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.525848 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.526017 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.526067 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.526287 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.526373 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.526440 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:33Z","lastTransitionTime":"2025-12-03T14:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.540213 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f26c96913955bb014b1ac71389acea8daeb976964dea908c649f382d5e688801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:14:22Z\\\",\\\"message\\\":\\\"2025-12-03T14:13:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dd7656b5-d26d-4b39-ae35-3c093468466e\\\\n2025-12-03T14:13:37+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dd7656b5-d26d-4b39-ae35-3c093468466e to /host/opt/cni/bin/\\\\n2025-12-03T14:13:37Z [verbose] multus-daemon started\\\\n2025-12-03T14:13:37Z [verbose] Readiness Indicator file check\\\\n2025-12-03T14:14:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.553650 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea448b5e-d751-42cd-ba62-94bfe104c8dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86c32c58671beb01e05f0fc2e0bb0dd3852c731d547e7e00f68d996e4b1c82b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f517a25f354bfdf8d4ca6a11b1fc689e25630c8f82d80f869ee30be87335091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c995df1ca20af2a26b10cc901a61d16a1530c1fd574fb463cc26eb8907ce0d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd289137933c564b4650302ce3a8be46d6d08aa5fedbd5c4d6b54e6b2cef712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ddd289137933c564b4650302ce3a8be46d6d08aa5fedbd5c4d6b54e6b2cef712\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.573790 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.595905 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.609934 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zgqdp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fb8744-4cb9-4138-8310-c02f7c6a2941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zgqdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:33 crc 
kubenswrapper[4751]: I1203 14:14:33.628684 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.629569 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.629618 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.629630 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.629649 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.629662 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:33Z","lastTransitionTime":"2025-12-03T14:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.647399 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0746b9b3267a0ad85440e9e392d48037c8522981bb736acffabca9661e0a28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:39Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.663269 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://243f7a2a090c20b5f43fd09961193c68689cbdf90a21f23683146f39a61a6fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4cfa428c0e9f95347500b09c84e541ed669
24885a56a15ed20612fbf2117bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dp4lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:33Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.733135 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.733212 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.733236 4751 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.733266 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.733288 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:33Z","lastTransitionTime":"2025-12-03T14:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.836442 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.836513 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.836534 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.836563 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.836580 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:33Z","lastTransitionTime":"2025-12-03T14:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.939454 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.939525 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.939548 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.939574 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:33 crc kubenswrapper[4751]: I1203 14:14:33.939592 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:33Z","lastTransitionTime":"2025-12-03T14:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.042131 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.042172 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.042184 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.042202 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.042215 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:34Z","lastTransitionTime":"2025-12-03T14:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.145484 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.145535 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.145552 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.145575 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.145594 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:34Z","lastTransitionTime":"2025-12-03T14:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.249085 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.249154 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.249183 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.249215 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.249240 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:34Z","lastTransitionTime":"2025-12-03T14:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.351995 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.352052 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.352070 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.352094 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.352114 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:34Z","lastTransitionTime":"2025-12-03T14:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.455539 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.455605 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.455623 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.455651 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.455669 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:34Z","lastTransitionTime":"2025-12-03T14:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.558993 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.559043 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.559054 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.559071 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.559084 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:34Z","lastTransitionTime":"2025-12-03T14:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.662046 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.662105 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.662126 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.662152 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.662170 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:34Z","lastTransitionTime":"2025-12-03T14:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.764924 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.764978 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.764996 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.765019 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.765036 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:34Z","lastTransitionTime":"2025-12-03T14:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.867126 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.867246 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.867266 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.867292 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.867313 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:34Z","lastTransitionTime":"2025-12-03T14:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.970216 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.970304 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.970398 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.970431 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:34 crc kubenswrapper[4751]: I1203 14:14:34.970453 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:34Z","lastTransitionTime":"2025-12-03T14:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.073616 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.073685 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.073706 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.073729 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.073746 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:35Z","lastTransitionTime":"2025-12-03T14:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.176600 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.176667 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.176685 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.176711 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.176730 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:35Z","lastTransitionTime":"2025-12-03T14:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.239618 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.239806 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:14:35 crc kubenswrapper[4751]: E1203 14:14:35.239844 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:15:39.239806929 +0000 UTC m=+146.228162146 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.239953 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:14:35 crc kubenswrapper[4751]: E1203 14:14:35.239973 4751 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 14:14:35 crc kubenswrapper[4751]: E1203 14:14:35.240029 4751 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 14:14:35 crc kubenswrapper[4751]: E1203 14:14:35.240070 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 14:15:39.240046185 +0000 UTC m=+146.228401442 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 14:14:35 crc kubenswrapper[4751]: E1203 14:14:35.240100 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 14:15:39.240088276 +0000 UTC m=+146.228443523 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.280044 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.280317 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.280476 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.280595 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.280709 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:35Z","lastTransitionTime":"2025-12-03T14:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.313786 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.313786 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.313850 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.314251 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:14:35 crc kubenswrapper[4751]: E1203 14:14:35.314547 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:14:35 crc kubenswrapper[4751]: E1203 14:14:35.314788 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:14:35 crc kubenswrapper[4751]: E1203 14:14:35.314929 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:14:35 crc kubenswrapper[4751]: E1203 14:14:35.315128 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.340726 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:14:35 crc kubenswrapper[4751]: E1203 14:14:35.340916 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 14:14:35 crc kubenswrapper[4751]: E1203 14:14:35.341165 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered Dec 03 14:14:35 crc kubenswrapper[4751]: E1203 14:14:35.341224 4751 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.341115 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:14:35 crc kubenswrapper[4751]: E1203 14:14:35.341301 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 14:15:39.341276682 +0000 UTC m=+146.329631929 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:14:35 crc kubenswrapper[4751]: E1203 14:14:35.342119 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 14:14:35 crc kubenswrapper[4751]: E1203 14:14:35.342154 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 14:14:35 crc kubenswrapper[4751]: E1203 14:14:35.342169 4751 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:14:35 crc kubenswrapper[4751]: E1203 14:14:35.342232 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 14:15:39.342214177 +0000 UTC m=+146.330569404 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.384394 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.384432 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.384441 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.384456 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.384467 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:35Z","lastTransitionTime":"2025-12-03T14:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.487150 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.487223 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.487236 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.487256 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.487616 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:35Z","lastTransitionTime":"2025-12-03T14:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.592035 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.592506 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.592608 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.592734 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.592922 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:35Z","lastTransitionTime":"2025-12-03T14:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.695433 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.695733 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.695823 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.695900 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.695979 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:35Z","lastTransitionTime":"2025-12-03T14:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.798630 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.798664 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.798674 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.798688 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.798700 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:35Z","lastTransitionTime":"2025-12-03T14:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.901227 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.901297 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.901309 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.901341 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.901353 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:35Z","lastTransitionTime":"2025-12-03T14:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.990093 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.990131 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.990140 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.990154 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:35 crc kubenswrapper[4751]: I1203 14:14:35.990164 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:35Z","lastTransitionTime":"2025-12-03T14:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:36 crc kubenswrapper[4751]: E1203 14:14:36.004757 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.009644 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.009672 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.009680 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.009692 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.009702 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:36Z","lastTransitionTime":"2025-12-03T14:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:36 crc kubenswrapper[4751]: E1203 14:14:36.030889 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.036064 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.036120 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.036134 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.036156 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.036172 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:36Z","lastTransitionTime":"2025-12-03T14:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:36 crc kubenswrapper[4751]: E1203 14:14:36.051638 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.055380 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.055418 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.055432 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.055450 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.055461 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:36Z","lastTransitionTime":"2025-12-03T14:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:36 crc kubenswrapper[4751]: E1203 14:14:36.067197 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.071072 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.071302 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.071508 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.071674 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.071861 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:36Z","lastTransitionTime":"2025-12-03T14:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:36 crc kubenswrapper[4751]: E1203 14:14:36.085733 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:36 crc kubenswrapper[4751]: E1203 14:14:36.085856 4751 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.087303 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.087380 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.087393 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.087415 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.087428 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:36Z","lastTransitionTime":"2025-12-03T14:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.190176 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.190236 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.190257 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.190281 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.190298 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:36Z","lastTransitionTime":"2025-12-03T14:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.292776 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.292815 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.292826 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.292870 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.292891 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:36Z","lastTransitionTime":"2025-12-03T14:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.395885 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.395927 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.395938 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.395953 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.395964 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:36Z","lastTransitionTime":"2025-12-03T14:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.498517 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.498575 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.498592 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.498617 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.498635 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:36Z","lastTransitionTime":"2025-12-03T14:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.601542 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.601596 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.601613 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.601637 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.601655 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:36Z","lastTransitionTime":"2025-12-03T14:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.704886 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.704919 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.704928 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.704943 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.704955 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:36Z","lastTransitionTime":"2025-12-03T14:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.807833 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.807874 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.807885 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.807902 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.807914 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:36Z","lastTransitionTime":"2025-12-03T14:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.910775 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.910842 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.910860 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.910884 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:36 crc kubenswrapper[4751]: I1203 14:14:36.910903 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:36Z","lastTransitionTime":"2025-12-03T14:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.013759 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.013795 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.013804 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.013818 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.013829 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:37Z","lastTransitionTime":"2025-12-03T14:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.116739 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.116808 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.116826 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.116852 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.116871 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:37Z","lastTransitionTime":"2025-12-03T14:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.219409 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.219472 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.219489 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.219511 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.219529 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:37Z","lastTransitionTime":"2025-12-03T14:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.313963 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.314529 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.314016 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.314905 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:14:37 crc kubenswrapper[4751]: E1203 14:14:37.319161 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:14:37 crc kubenswrapper[4751]: E1203 14:14:37.315103 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:14:37 crc kubenswrapper[4751]: E1203 14:14:37.319422 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:14:37 crc kubenswrapper[4751]: E1203 14:14:37.319468 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.322633 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.322680 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.322696 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.322718 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.322736 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:37Z","lastTransitionTime":"2025-12-03T14:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.328540 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.425792 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.426119 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.426228 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.426365 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.426486 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:37Z","lastTransitionTime":"2025-12-03T14:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.528535 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.528595 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.528606 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.528620 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.528630 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:37Z","lastTransitionTime":"2025-12-03T14:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.631923 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.631963 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.631972 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.631987 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.631996 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:37Z","lastTransitionTime":"2025-12-03T14:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.735112 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.735190 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.735217 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.735248 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.735271 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:37Z","lastTransitionTime":"2025-12-03T14:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.839119 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.839169 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.839183 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.839206 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.839223 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:37Z","lastTransitionTime":"2025-12-03T14:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.943005 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.943401 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.943570 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.943724 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:37 crc kubenswrapper[4751]: I1203 14:14:37.943858 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:37Z","lastTransitionTime":"2025-12-03T14:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.047262 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.047366 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.047392 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.047424 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.047447 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:38Z","lastTransitionTime":"2025-12-03T14:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.150834 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.150900 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.150917 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.150942 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.150960 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:38Z","lastTransitionTime":"2025-12-03T14:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.254212 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.254270 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.254287 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.254308 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.254364 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:38Z","lastTransitionTime":"2025-12-03T14:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.357595 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.357709 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.357728 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.357754 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.357773 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:38Z","lastTransitionTime":"2025-12-03T14:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.461427 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.461725 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.461852 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.461965 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.462082 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:38Z","lastTransitionTime":"2025-12-03T14:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.564662 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.564722 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.564741 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.564764 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.564785 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:38Z","lastTransitionTime":"2025-12-03T14:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.668217 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.668308 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.668365 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.668394 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.668424 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:38Z","lastTransitionTime":"2025-12-03T14:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.771404 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.771465 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.771488 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.771521 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.771543 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:38Z","lastTransitionTime":"2025-12-03T14:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.874381 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.874473 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.874493 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.874515 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.874531 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:38Z","lastTransitionTime":"2025-12-03T14:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.977071 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.977117 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.977134 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.977157 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:38 crc kubenswrapper[4751]: I1203 14:14:38.977176 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:38Z","lastTransitionTime":"2025-12-03T14:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.080257 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.081096 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.081284 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.081484 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.081639 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:39Z","lastTransitionTime":"2025-12-03T14:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.184676 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.184750 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.184773 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.184801 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.184825 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:39Z","lastTransitionTime":"2025-12-03T14:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.288727 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.288800 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.288825 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.288852 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.288873 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:39Z","lastTransitionTime":"2025-12-03T14:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.313500 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.313613 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:14:39 crc kubenswrapper[4751]: E1203 14:14:39.313689 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:14:39 crc kubenswrapper[4751]: E1203 14:14:39.313750 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.313847 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:14:39 crc kubenswrapper[4751]: E1203 14:14:39.313913 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.313950 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:14:39 crc kubenswrapper[4751]: E1203 14:14:39.314005 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.392229 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.392895 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.393050 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.393183 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.393310 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:39Z","lastTransitionTime":"2025-12-03T14:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.496820 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.496926 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.496950 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.496975 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.496994 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:39Z","lastTransitionTime":"2025-12-03T14:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.599889 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.600280 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.600515 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.600732 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.600884 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:39Z","lastTransitionTime":"2025-12-03T14:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.708206 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.708568 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.708693 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.708805 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.708894 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:39Z","lastTransitionTime":"2025-12-03T14:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.811766 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.812026 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.812090 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.812167 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.812243 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:39Z","lastTransitionTime":"2025-12-03T14:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.915646 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.916064 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.916291 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.916572 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:39 crc kubenswrapper[4751]: I1203 14:14:39.916877 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:39Z","lastTransitionTime":"2025-12-03T14:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.020716 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.020770 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.020787 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.020815 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.020833 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:40Z","lastTransitionTime":"2025-12-03T14:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.124435 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.124496 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.124517 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.124541 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.124559 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:40Z","lastTransitionTime":"2025-12-03T14:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.228105 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.228170 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.228192 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.228222 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.228244 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:40Z","lastTransitionTime":"2025-12-03T14:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.330964 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.331004 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.331013 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.331028 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.331043 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:40Z","lastTransitionTime":"2025-12-03T14:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.433657 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.433683 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.433691 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.433703 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.433712 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:40Z","lastTransitionTime":"2025-12-03T14:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.535843 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.535873 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.535883 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.535896 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.535905 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:40Z","lastTransitionTime":"2025-12-03T14:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.638914 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.638955 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.638968 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.638985 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.638996 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:40Z","lastTransitionTime":"2025-12-03T14:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.741509 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.741553 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.741567 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.741584 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.741596 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:40Z","lastTransitionTime":"2025-12-03T14:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.844424 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.844457 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.844467 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.844480 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.844506 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:40Z","lastTransitionTime":"2025-12-03T14:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.946762 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.946833 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.946844 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.946859 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:40 crc kubenswrapper[4751]: I1203 14:14:40.946868 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:40Z","lastTransitionTime":"2025-12-03T14:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.049894 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.049962 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.049981 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.050006 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.050025 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:41Z","lastTransitionTime":"2025-12-03T14:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.152641 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.152678 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.152709 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.152726 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.152744 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:41Z","lastTransitionTime":"2025-12-03T14:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.254733 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.254779 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.254798 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.254821 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.254838 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:41Z","lastTransitionTime":"2025-12-03T14:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.313637 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.313700 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:14:41 crc kubenswrapper[4751]: E1203 14:14:41.313909 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.313947 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.314013 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:14:41 crc kubenswrapper[4751]: E1203 14:14:41.314497 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:14:41 crc kubenswrapper[4751]: E1203 14:14:41.314636 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.314681 4751 scope.go:117] "RemoveContainer" containerID="fa18a539c068f5e82a4f3159bd0857ba914bb6213cd1c585b6eebf22aeb7be3f" Dec 03 14:14:41 crc kubenswrapper[4751]: E1203 14:14:41.314767 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:14:41 crc kubenswrapper[4751]: E1203 14:14:41.314815 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kwchh_openshift-ovn-kubernetes(a5526cae-f2a4-4094-a08a-fbf69cb11593)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.357922 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.357998 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.358021 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.358056 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.358080 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:41Z","lastTransitionTime":"2025-12-03T14:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.461023 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.461059 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.461070 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.461089 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.461100 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:41Z","lastTransitionTime":"2025-12-03T14:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.563685 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.563730 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.563740 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.563755 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.563767 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:41Z","lastTransitionTime":"2025-12-03T14:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.666446 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.666501 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.666519 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.666542 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.666558 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:41Z","lastTransitionTime":"2025-12-03T14:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.768442 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.768483 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.768493 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.768509 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.768519 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:41Z","lastTransitionTime":"2025-12-03T14:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.871798 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.871839 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.871852 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.871868 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.871880 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:41Z","lastTransitionTime":"2025-12-03T14:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.974117 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.974445 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.974520 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.974595 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:41 crc kubenswrapper[4751]: I1203 14:14:41.974678 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:41Z","lastTransitionTime":"2025-12-03T14:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.077704 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.077978 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.078074 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.078169 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.078251 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:42Z","lastTransitionTime":"2025-12-03T14:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.185816 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.186074 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.186088 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.186107 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.186121 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:42Z","lastTransitionTime":"2025-12-03T14:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.289904 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.289959 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.289977 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.290001 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.290019 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:42Z","lastTransitionTime":"2025-12-03T14:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.392488 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.392547 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.392565 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.392588 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.392607 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:42Z","lastTransitionTime":"2025-12-03T14:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.495448 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.495516 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.495534 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.495561 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.495580 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:42Z","lastTransitionTime":"2025-12-03T14:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.598369 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.598443 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.598452 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.598466 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.598476 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:42Z","lastTransitionTime":"2025-12-03T14:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.700946 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.700996 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.701011 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.701031 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.701044 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:42Z","lastTransitionTime":"2025-12-03T14:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.803155 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.803204 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.803219 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.803240 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.803256 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:42Z","lastTransitionTime":"2025-12-03T14:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.906011 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.906064 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.906075 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.906093 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:42 crc kubenswrapper[4751]: I1203 14:14:42.906109 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:42Z","lastTransitionTime":"2025-12-03T14:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.008790 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.008835 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.008854 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.008878 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.008896 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:43Z","lastTransitionTime":"2025-12-03T14:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.112358 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.112440 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.112458 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.112486 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.112505 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:43Z","lastTransitionTime":"2025-12-03T14:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.215710 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.215759 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.215776 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.215799 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.215818 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:43Z","lastTransitionTime":"2025-12-03T14:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.313395 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.313454 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.313538 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:14:43 crc kubenswrapper[4751]: E1203 14:14:43.313673 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.313689 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:14:43 crc kubenswrapper[4751]: E1203 14:14:43.313808 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:14:43 crc kubenswrapper[4751]: E1203 14:14:43.313832 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:14:43 crc kubenswrapper[4751]: E1203 14:14:43.313877 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.319354 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.319375 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.319382 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.319394 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.319405 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:43Z","lastTransitionTime":"2025-12-03T14:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.331378 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.345473 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.367070 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.384160 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.407442 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2025-12-03T14:14:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.421713 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.421806 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.421829 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.421854 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.421906 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:43Z","lastTransitionTime":"2025-12-03T14:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.440881 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8336533a450796683f0bdfccc947c18c05e100d13e7e5a076e4dde52ddbb524a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b693d6eeadc9bf4af3f115fd8646138f7e25675fb84ae92b97a456e3b5a14cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ff7aaf5e6a948fcb50e4c896807b977d2c651eef92011b86b5df32f19fa3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://104d3bfa9a37dce20a737bf9d416395e432196051e96def7ae49d341d2912e3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77fd3bfbc463e6afbd45d112bf46ad8a39f25cd39ebbc0e17dfcd160ca29d7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26dadf0f116a114d6755804ab91736019d0d3e047d14515f513eda2148d706d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa18a539c068f5e82a4f3159bd0857ba914bb6213cd1c585b6eebf22aeb7be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa18a539c068f5e82a4f3159bd0857ba914bb6213cd1c585b6eebf22aeb7be3f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:14:29Z\\\",\\\"message\\\":\\\"to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-12-03T14:14:29Z is after 2025-08-24T17:21:41Z]\\\\nI1203 14:14:29.102796 6794 services_controller.go:434] Service openshift-kube-scheduler-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-kube-scheduler-operator 760c7338-f39e-4136-9d29-d6fccbd607c1 4364 0 2025-02-23 05:12:18 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:openshift-kube-scheduler-operator] map[exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:kube-scheduler-operator-serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00756b94b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:14:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kwchh_openshift-ovn-kubernetes(a5526cae-f2a4-4094-a08a-fbf69cb11593)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac293ad67560998b43406e7acd93896d81398996e43a2b7e5df6a4d5794eeb29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd
6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.458874 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea448b5e-d751-42cd-ba62-94bfe104c8dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86c32c58671beb01e05f0fc2e0bb0dd3852c731d547e7e00f68d996e4b1c82b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f517a25f354bfdf8d4ca6a11b1fc689e25630c8f82d80f869ee30be87335091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c995df1ca20af2a26b10cc901a61d16a1530c1fd574fb463cc26eb8907ce0d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd289137933c564b4650302ce3a8be46d6d08aa5fedbd5c4d6b54e6b2cef712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ddd289137933c564b4650302ce3a8be46d6d08aa5fedbd5c4d6b54e6b2cef712\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.474159 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.507692 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.522714 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.525059 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.525142 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.525153 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.525166 4751 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.525175 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:43Z","lastTransitionTime":"2025-12-03T14:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.537990 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.556602 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f26c96913955bb014b1ac71389acea8daeb976964dea908c649f382d5e688801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:14:22Z\\\",\\\"message\\\":\\\"2025-12-03T14:13:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dd7656b5-d26d-4b39-ae35-3c093468466e\\\\n2025-12-03T14:13:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dd7656b5-d26d-4b39-ae35-3c093468466e to /host/opt/cni/bin/\\\\n2025-12-03T14:13:37Z [verbose] multus-daemon started\\\\n2025-12-03T14:13:37Z [verbose] 
Readiness Indicator file check\\\\n2025-12-03T14:14:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.575721 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.596608 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0746b9b3267a0ad85440e9e392d48037c8522981bb736acffabca9661e0a28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916dc
5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.608757 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://243f7a2a090c20b5f43fd09961193c68689cbdf90a21f23683146f39a61a6fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4cfa428c0e9f95347500b09c84e541ed66924885a56a15ed20612fbf2117bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dp4lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-03T14:14:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.623839 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zgqdp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fb8744-4cb9-4138-8310-c02f7c6a2941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zgqdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:43 crc 
kubenswrapper[4751]: I1203 14:14:43.629445 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.629498 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.629978 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.630046 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.630070 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:43Z","lastTransitionTime":"2025-12-03T14:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.640995 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d4e051c-5737-4df8-99ff-0a2356b66154\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd748cc363ad0792250139554344aae2d5ec6765a783614b02301d23a050afaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f2337f6883ef24084bfd3e6fac8e9bf3ff282d60aceb0ac325aca587cc4953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f2337f6883ef24084bfd3e6fac8e9bf3ff282d60aceb0ac325aca587cc4953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.661961 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.682170 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.734042 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.734087 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.734110 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.734142 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.734165 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:43Z","lastTransitionTime":"2025-12-03T14:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.836922 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.837022 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.837048 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.837081 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.837104 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:43Z","lastTransitionTime":"2025-12-03T14:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.940130 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.940208 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.940223 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.940244 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:43 crc kubenswrapper[4751]: I1203 14:14:43.940258 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:43Z","lastTransitionTime":"2025-12-03T14:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.043430 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.043476 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.043488 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.043503 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.043514 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:44Z","lastTransitionTime":"2025-12-03T14:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.146551 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.146617 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.146640 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.146670 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.146728 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:44Z","lastTransitionTime":"2025-12-03T14:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.249880 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.249940 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.249953 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.249975 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.249990 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:44Z","lastTransitionTime":"2025-12-03T14:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.353653 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.353700 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.353713 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.353731 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.353747 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:44Z","lastTransitionTime":"2025-12-03T14:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.456853 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.456913 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.456928 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.456952 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.456970 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:44Z","lastTransitionTime":"2025-12-03T14:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.560280 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.560315 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.560353 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.560375 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.560390 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:44Z","lastTransitionTime":"2025-12-03T14:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.662737 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.662805 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.662817 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.662835 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.662849 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:44Z","lastTransitionTime":"2025-12-03T14:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.765609 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.765674 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.765692 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.765716 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.765734 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:44Z","lastTransitionTime":"2025-12-03T14:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.867479 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.867530 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.867541 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.867561 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.867577 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:44Z","lastTransitionTime":"2025-12-03T14:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.971032 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.971109 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.971140 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.971170 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:44 crc kubenswrapper[4751]: I1203 14:14:44.971193 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:44Z","lastTransitionTime":"2025-12-03T14:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.073866 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.073902 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.073917 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.073935 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.073947 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:45Z","lastTransitionTime":"2025-12-03T14:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.176430 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.176478 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.176489 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.176508 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.176520 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:45Z","lastTransitionTime":"2025-12-03T14:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.278388 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.278436 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.278449 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.278466 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.278477 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:45Z","lastTransitionTime":"2025-12-03T14:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.313804 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.313818 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:14:45 crc kubenswrapper[4751]: E1203 14:14:45.314008 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.313819 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.313818 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:14:45 crc kubenswrapper[4751]: E1203 14:14:45.314158 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:14:45 crc kubenswrapper[4751]: E1203 14:14:45.314260 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:14:45 crc kubenswrapper[4751]: E1203 14:14:45.314485 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.380798 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.380856 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.380876 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.380900 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.380918 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:45Z","lastTransitionTime":"2025-12-03T14:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.483730 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.483784 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.483805 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.483834 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.483855 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:45Z","lastTransitionTime":"2025-12-03T14:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.586646 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.586707 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.586738 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.586756 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.586796 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:45Z","lastTransitionTime":"2025-12-03T14:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.691025 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.691276 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.691285 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.691300 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.691309 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:45Z","lastTransitionTime":"2025-12-03T14:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.793850 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.793910 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.793927 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.793948 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.793968 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:45Z","lastTransitionTime":"2025-12-03T14:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.896472 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.896500 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.896508 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.896520 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.896530 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:45Z","lastTransitionTime":"2025-12-03T14:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.998920 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.999229 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.999306 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.999420 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:45 crc kubenswrapper[4751]: I1203 14:14:45.999515 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:45Z","lastTransitionTime":"2025-12-03T14:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.102162 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.102209 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.102219 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.102237 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.102249 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:46Z","lastTransitionTime":"2025-12-03T14:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.205060 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.205109 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.205118 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.205133 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.205142 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:46Z","lastTransitionTime":"2025-12-03T14:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.308507 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.308581 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.308619 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.308657 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.308680 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:46Z","lastTransitionTime":"2025-12-03T14:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.314122 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.314181 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.314205 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.314231 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.314253 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:46Z","lastTransitionTime":"2025-12-03T14:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:46 crc kubenswrapper[4751]: E1203 14:14:46.328362 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:46Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.332315 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.332365 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.332375 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.332389 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.332399 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:46Z","lastTransitionTime":"2025-12-03T14:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:46 crc kubenswrapper[4751]: E1203 14:14:46.346373 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:46Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.351773 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.351808 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.351819 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.351833 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.351844 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:46Z","lastTransitionTime":"2025-12-03T14:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:46 crc kubenswrapper[4751]: E1203 14:14:46.371069 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:46Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.377237 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.377287 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.377297 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.377313 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.377341 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:46Z","lastTransitionTime":"2025-12-03T14:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:46 crc kubenswrapper[4751]: E1203 14:14:46.398713 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:46Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.402995 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.403049 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.403065 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.403086 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.403100 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:46Z","lastTransitionTime":"2025-12-03T14:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:46 crc kubenswrapper[4751]: E1203 14:14:46.420848 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:46Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:46 crc kubenswrapper[4751]: E1203 14:14:46.421099 4751 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.423107 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.423187 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.423201 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.423220 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.423236 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:46Z","lastTransitionTime":"2025-12-03T14:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.525863 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.525920 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.525938 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.525954 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.525965 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:46Z","lastTransitionTime":"2025-12-03T14:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.628272 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.628313 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.628346 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.628365 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.628376 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:46Z","lastTransitionTime":"2025-12-03T14:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.730899 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.730957 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.730974 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.730996 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.731014 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:46Z","lastTransitionTime":"2025-12-03T14:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.833768 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.833917 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.833940 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.833969 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.833992 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:46Z","lastTransitionTime":"2025-12-03T14:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.936180 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.936211 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.936220 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.936232 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:46 crc kubenswrapper[4751]: I1203 14:14:46.936240 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:46Z","lastTransitionTime":"2025-12-03T14:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.038317 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.038394 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.038409 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.038430 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.038447 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:47Z","lastTransitionTime":"2025-12-03T14:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.141394 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.141436 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.141448 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.141464 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.141477 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:47Z","lastTransitionTime":"2025-12-03T14:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.244250 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.244559 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.244625 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.244699 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.244758 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:47Z","lastTransitionTime":"2025-12-03T14:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.313548 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.313600 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.313657 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.313554 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:14:47 crc kubenswrapper[4751]: E1203 14:14:47.313697 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:14:47 crc kubenswrapper[4751]: E1203 14:14:47.313791 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:14:47 crc kubenswrapper[4751]: E1203 14:14:47.313959 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:14:47 crc kubenswrapper[4751]: E1203 14:14:47.313973 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.347250 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.347294 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.347305 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.347367 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.347382 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:47Z","lastTransitionTime":"2025-12-03T14:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.449918 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.449962 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.449971 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.449984 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.449996 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:47Z","lastTransitionTime":"2025-12-03T14:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.552703 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.552761 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.552777 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.552806 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.552826 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:47Z","lastTransitionTime":"2025-12-03T14:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.655070 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.655140 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.655159 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.655188 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.655209 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:47Z","lastTransitionTime":"2025-12-03T14:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.758038 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.758152 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.758175 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.758244 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.758265 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:47Z","lastTransitionTime":"2025-12-03T14:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.862087 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.862161 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.862184 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.862214 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.862239 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:47Z","lastTransitionTime":"2025-12-03T14:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.966848 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.966897 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.966931 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.966965 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:47 crc kubenswrapper[4751]: I1203 14:14:47.966976 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:47Z","lastTransitionTime":"2025-12-03T14:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.069400 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.069460 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.069470 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.069484 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.069510 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:48Z","lastTransitionTime":"2025-12-03T14:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.171236 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.171279 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.171287 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.171302 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.171313 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:48Z","lastTransitionTime":"2025-12-03T14:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.274119 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.274164 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.274175 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.274187 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.274196 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:48Z","lastTransitionTime":"2025-12-03T14:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.377539 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.377614 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.377646 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.377672 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.377690 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:48Z","lastTransitionTime":"2025-12-03T14:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.481127 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.481172 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.481188 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.481212 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.481230 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:48Z","lastTransitionTime":"2025-12-03T14:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.583674 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.583751 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.583773 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.583803 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.583821 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:48Z","lastTransitionTime":"2025-12-03T14:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.686547 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.686606 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.686623 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.686648 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.686665 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:48Z","lastTransitionTime":"2025-12-03T14:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.788806 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.788869 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.788886 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.788913 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.788931 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:48Z","lastTransitionTime":"2025-12-03T14:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.891734 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.891782 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.891794 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.891809 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.891818 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:48Z","lastTransitionTime":"2025-12-03T14:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.994424 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.994479 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.994492 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.994510 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:48 crc kubenswrapper[4751]: I1203 14:14:48.994523 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:48Z","lastTransitionTime":"2025-12-03T14:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.097894 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.097949 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.097961 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.097982 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.097994 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:49Z","lastTransitionTime":"2025-12-03T14:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.201726 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.201767 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.201775 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.201793 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.201802 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:49Z","lastTransitionTime":"2025-12-03T14:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.303925 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.303965 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.303973 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.303987 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.303997 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:49Z","lastTransitionTime":"2025-12-03T14:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.313666 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.313723 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.313785 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:14:49 crc kubenswrapper[4751]: E1203 14:14:49.313823 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.313751 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:14:49 crc kubenswrapper[4751]: E1203 14:14:49.313978 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:14:49 crc kubenswrapper[4751]: E1203 14:14:49.314074 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:14:49 crc kubenswrapper[4751]: E1203 14:14:49.314162 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.407018 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.407113 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.407136 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.407184 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.407211 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:49Z","lastTransitionTime":"2025-12-03T14:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.510294 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.510413 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.510507 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.510542 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.510596 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:49Z","lastTransitionTime":"2025-12-03T14:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.613500 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.613864 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.613933 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.613998 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.614067 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:49Z","lastTransitionTime":"2025-12-03T14:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.716948 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.717263 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.717392 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.717485 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.717569 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:49Z","lastTransitionTime":"2025-12-03T14:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.820526 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.820813 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.820906 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.821118 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.821219 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:49Z","lastTransitionTime":"2025-12-03T14:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.925405 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.925487 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.925507 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.925545 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:49 crc kubenswrapper[4751]: I1203 14:14:49.925572 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:49Z","lastTransitionTime":"2025-12-03T14:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.029388 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.029758 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.029902 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.030029 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.030143 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:50Z","lastTransitionTime":"2025-12-03T14:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.133507 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.133858 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.133888 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.133914 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.133932 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:50Z","lastTransitionTime":"2025-12-03T14:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.237418 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.237775 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.237923 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.238067 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.238218 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:50Z","lastTransitionTime":"2025-12-03T14:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.341661 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.341717 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.341734 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.341759 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.341785 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:50Z","lastTransitionTime":"2025-12-03T14:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.444723 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.444781 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.444799 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.444825 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.444844 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:50Z","lastTransitionTime":"2025-12-03T14:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.547761 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.547798 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.547810 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.547825 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.547835 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:50Z","lastTransitionTime":"2025-12-03T14:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.650668 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.650744 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.650762 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.650788 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.650806 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:50Z","lastTransitionTime":"2025-12-03T14:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.754588 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.755068 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.755240 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.755438 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.755600 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:50Z","lastTransitionTime":"2025-12-03T14:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.858609 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.859049 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.859275 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.859605 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.859829 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:50Z","lastTransitionTime":"2025-12-03T14:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.964012 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.964068 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.964117 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.964142 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:50 crc kubenswrapper[4751]: I1203 14:14:50.964159 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:50Z","lastTransitionTime":"2025-12-03T14:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.065970 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.066000 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.066008 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.066022 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.066031 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:51Z","lastTransitionTime":"2025-12-03T14:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.168294 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.168358 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.168372 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.168389 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.168399 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:51Z","lastTransitionTime":"2025-12-03T14:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.270121 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.270453 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.270547 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.270621 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.270690 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:51Z","lastTransitionTime":"2025-12-03T14:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.313633 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:14:51 crc kubenswrapper[4751]: E1203 14:14:51.313764 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.313807 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.313807 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:14:51 crc kubenswrapper[4751]: E1203 14:14:51.313971 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:14:51 crc kubenswrapper[4751]: E1203 14:14:51.314079 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.314088 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:14:51 crc kubenswrapper[4751]: E1203 14:14:51.314152 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.372956 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.373182 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.373281 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.373400 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.373540 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:51Z","lastTransitionTime":"2025-12-03T14:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.476600 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.476637 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.476649 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.476667 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.476680 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:51Z","lastTransitionTime":"2025-12-03T14:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.578404 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.578440 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.578451 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.578467 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.578477 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:51Z","lastTransitionTime":"2025-12-03T14:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.680394 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.680665 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.680756 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.680830 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.680892 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:51Z","lastTransitionTime":"2025-12-03T14:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.783141 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.783176 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.783185 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.783200 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.783210 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:51Z","lastTransitionTime":"2025-12-03T14:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.885798 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.885859 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.885872 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.885896 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.885909 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:51Z","lastTransitionTime":"2025-12-03T14:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.989544 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.989873 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.990019 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.990176 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:51 crc kubenswrapper[4751]: I1203 14:14:51.990359 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:51Z","lastTransitionTime":"2025-12-03T14:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.093045 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.093083 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.093096 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.093112 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.093124 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:52Z","lastTransitionTime":"2025-12-03T14:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.195879 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.195931 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.195944 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.195961 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.195973 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:52Z","lastTransitionTime":"2025-12-03T14:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.298560 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.298806 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.298873 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.298945 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.299017 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:52Z","lastTransitionTime":"2025-12-03T14:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.313912 4751 scope.go:117] "RemoveContainer" containerID="fa18a539c068f5e82a4f3159bd0857ba914bb6213cd1c585b6eebf22aeb7be3f" Dec 03 14:14:52 crc kubenswrapper[4751]: E1203 14:14:52.314177 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kwchh_openshift-ovn-kubernetes(a5526cae-f2a4-4094-a08a-fbf69cb11593)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.401680 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.401718 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.401726 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.401740 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.401751 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:52Z","lastTransitionTime":"2025-12-03T14:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.504316 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.504374 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.504385 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.504401 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.504442 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:52Z","lastTransitionTime":"2025-12-03T14:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.606554 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.606595 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.606656 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.606676 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.606689 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:52Z","lastTransitionTime":"2025-12-03T14:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.708807 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.709052 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.709116 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.709184 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.709240 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:52Z","lastTransitionTime":"2025-12-03T14:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.810939 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.810978 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.810988 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.811006 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.811021 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:52Z","lastTransitionTime":"2025-12-03T14:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.912688 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.912729 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.912741 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.912756 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:52 crc kubenswrapper[4751]: I1203 14:14:52.912768 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:52Z","lastTransitionTime":"2025-12-03T14:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.014916 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.014953 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.014963 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.014977 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.014987 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:53Z","lastTransitionTime":"2025-12-03T14:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.117872 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.118172 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.118360 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.118520 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.118652 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:53Z","lastTransitionTime":"2025-12-03T14:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.221192 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.221231 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.221241 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.221256 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.221266 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:53Z","lastTransitionTime":"2025-12-03T14:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.313460 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.313561 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:14:53 crc kubenswrapper[4751]: E1203 14:14:53.313600 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.313630 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.313641 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:14:53 crc kubenswrapper[4751]: E1203 14:14:53.313751 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:14:53 crc kubenswrapper[4751]: E1203 14:14:53.313810 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:14:53 crc kubenswrapper[4751]: E1203 14:14:53.313870 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.324923 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.324950 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.324958 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.324970 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.324980 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:53Z","lastTransitionTime":"2025-12-03T14:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.325728 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea448b5e-d751-42cd-ba62-94bfe104c8dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86c32c58671beb01e05f0fc2e0bb0dd3852c731d547e7e00f68d996e4b1c82b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f517a25f354bfdf8d4ca6a11b1fc6
89e25630c8f82d80f869ee30be87335091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c995df1ca20af2a26b10cc901a61d16a1530c1fd574fb463cc26eb8907ce0d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd289137933c564b4650302ce3a8be46d6d08aa5fedbd5c4d6b54e6b2cef712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd289137933c564b4650302ce3a8be46d6d08aa5fedbd5c4d6b54e6b2cef712\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.337356 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"264eec0e-c57b-443b-820c-eb8867f56888\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ddb57854b5a0876cf9bc759141a15ba714e2f49537dc73217894752d32efadc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57c0b328894c58557c46b3e0cbb10076d40afd2cc48c2fa19af3f564b91c479\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94cee54129bac775d4a1d435aa1f1664b2262ea52785a04c2b88a1f886508a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.366550 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bb502d2-f15a-403f-b73a-743416ed30b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034e057ee5a5e5a3c8b3db8f17c7da46887d9c4a13e44df9b71c5130178a5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12eecd34072106bcbbfa814186730241345f230063d4ba2270fc74c07a53d0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec16aec68d9f8966d18cd10a1daaecd2a4a547c9afaacd69b729a814baa3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b48869469e879f1cc2846361636c9f39f3a58ecd5878b6fcfc0dd8107d6fd7a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1130a91812f231697a9cc8c87f72ba94f638d2ccb01781bc44da671a500c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://038786f50987a6909d058d19cde67d7197796fe014fcd1932bf492352e8b0ba3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d664fd5562a25c88b822c66edc186b6af2b4f94c708ca98671463286b2bac7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44ffcc066cdf426ec3766e493c38e56fc490701831bab6673ff354219eb867b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.377513 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g8hvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5059484e-1748-4e33-89c0-3d5e57b97489\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b309bf7c4575c1b9f8810f38f456cc99292cc126b7880be8f6fc773552fa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ljv7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g8hvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.387977 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmjzc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba1da9df-c8bd-41d0-bc00-09688e27984e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3270918b66db602efde3caa3a7f6a063b172e8fc26d271bb5db3684eed7b1a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hcd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmjzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.399771 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98mjq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a216adb-632d-4134-8c61-61fe6b8c5f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f26c96913955bb014b1ac71389acea8daeb976964dea908c649f382d5e688801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:14:22Z\\\",\\\"message\\\":\\\"2025-12-03T14:13:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dd7656b5-d26d-4b39-ae35-3c093468466e\\\\n2025-12-03T14:13:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dd7656b5-d26d-4b39-ae35-3c093468466e to /host/opt/cni/bin/\\\\n2025-12-03T14:13:37Z [verbose] multus-daemon started\\\\n2025-12-03T14:13:37Z [verbose] Readiness Indicator file check\\\\n2025-12-03T14:14:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w565t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98mjq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.413585 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.427275 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t8q27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7487dd7-cc8c-4b6b-8399-eb0e5cbdddc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0746b9b3267a0ad85440e9e392d48037c8522981bb736acffabca9661e0a28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb3270d0a480639c080d66355199c973df596ac24e4d1630a28e942d9208064\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3317353db24ea56cc1341e61a11dd230ae269aa4f50654f9183b9430a8d7d74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a4e57d7d6e29d691975a419dfa830bba7eb106deef63895428551cda6b12873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916dc
5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916dc5e95f77e5b0a12a54dd39cf8a7ff3dd01ea52017a9f580393e0db7e2a2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392625646d09a9df0b978e9b966004c56097dfdf8dcff619e5a05c7dc3ff77e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c73e8fc51b3340666c6ef4cdb2e863414b9fa2fe18b3991196be4f75637f63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gzr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t8q27\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.427926 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.427968 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.427982 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.428001 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.428015 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:53Z","lastTransitionTime":"2025-12-03T14:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.439288 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0557f3d-915e-4c1c-b9a0-de7b3b8dabea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://243f7a2a090c20b5f43fd09961193c68689cbdf90a21f23683146f39a61a6fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf4cfa428c0e9f95347500b09c84e541ed66924885a56a15ed20612fbf2117bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrsqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dp4lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.450501 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zgqdp" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fb8744-4cb9-4138-8310-c02f7c6a2941\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sgzgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zgqdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:53 crc 
kubenswrapper[4751]: I1203 14:14:53.462349 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d4e051c-5737-4df8-99ff-0a2356b66154\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd748cc363ad0792250139554344aae2d5ec6765a783614b02301d23a050afaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://5f2337f6883ef24084bfd3e6fac8e9bf3ff282d60aceb0ac325aca587cc4953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f2337f6883ef24084bfd3e6fac8e9bf3ff282d60aceb0ac325aca587cc4953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.476149 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.489909 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d98bce915a090319c9062a43902d1c3d6aeb5563e9ec68ac45d4cf4dd7041ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c352fa71ab8777d833803aa2b75a03ac52831c7c2ecc1397c264b74225aeb22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.504601 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"088623b3-b683-467a-88db-985af50fb182\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:13:31Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1203 14:13:30.964220 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:13:30.964453 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:13:30.965583 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1313986765/tls.crt::/tmp/serving-cert-1313986765/tls.key\\\\\\\"\\\\nI1203 14:13:31.584130 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:13:31.588548 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:13:31.588571 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:13:31.588600 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:13:31.588608 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:13:31.593947 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:13:31.594403 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594455 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:13:31.594501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:13:31.594548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:13:31.594586 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:13:31.594618 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:13:31.593961 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1203 14:13:31.595347 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c167c5065810c80da4a102711c6e6a96
c33a06558c2e7d3477b67676c250c2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.520541 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.532056 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.532115 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.532132 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 
14:14:53.532154 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.532175 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:53Z","lastTransitionTime":"2025-12-03T14:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.535168 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8312fd7afa7e74c1fc37fd8b6395a3a678f91d7f5c894a1811e67ea439fd0916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.548000 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace88020aab8c89faa2c45743f10c8261c52eaf5ad54bedce7fd91bebfb96d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:14:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.562874 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"385620eb-744d-423e-b02b-1274f3075689\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba5cc4980efb159800133ba37759499733440efa34259d17c201a985e11bf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm85t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djf67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.592922 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5526cae-f2a4-4094-a08a-fbf69cb11593\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:13:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8336533a450796683f0bdfccc947c18c05e100d13e7e5a076e4dde52ddbb524a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b693d6eeadc9bf4af3f115fd8646138f7e25675fb84ae92b97a456e3b5a14cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42ff7aaf5e6a948fcb50e4c896807b977d2c651eef92011b86b5df32f19fa3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://104d3bfa9a37dce20a737bf9d416395e432196051e96def7ae49d341d2912e3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77fd3bfbc463e6afbd45d112bf46ad8a39f25cd39ebbc0e17dfcd160ca29d7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26dadf0f116a114d6755804ab91736019d0d3e047d14515f513eda2148d706d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa18a539c068f5e82a4f3159bd0857ba914bb6213cd1c585b6eebf22aeb7be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa18a539c068f5e82a4f3159bd0857ba914bb6213cd1c585b6eebf22aeb7be3f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:14:29Z\\\",\\\"message\\\":\\\"to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-12-03T14:14:29Z is after 2025-08-24T17:21:41Z]\\\\nI1203 14:14:29.102796 6794 services_controller.go:434] Service openshift-kube-scheduler-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-kube-scheduler-operator 760c7338-f39e-4136-9d29-d6fccbd607c1 4364 0 2025-02-23 05:12:18 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:openshift-kube-scheduler-operator] map[exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:kube-scheduler-operator-serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00756b94b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:14:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kwchh_openshift-ovn-kubernetes(a5526cae-f2a4-4094-a08a-fbf69cb11593)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac293ad67560998b43406e7acd93896d81398996e43a2b7e5df6a4d5794eeb29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83c87f4efec1b57dd
6a8832e17f49549104b319b73a17f068de1cedaa3c18c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:13:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:13:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbbql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:13:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kwchh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.635836 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.635891 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.635908 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.635933 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.635954 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:53Z","lastTransitionTime":"2025-12-03T14:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.738629 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.738663 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.738672 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.738685 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.738696 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:53Z","lastTransitionTime":"2025-12-03T14:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.841155 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.841493 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.841612 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.841717 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.841813 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:53Z","lastTransitionTime":"2025-12-03T14:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.944630 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.944669 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.944681 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.944698 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:53 crc kubenswrapper[4751]: I1203 14:14:53.944709 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:53Z","lastTransitionTime":"2025-12-03T14:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.045995 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45fb8744-4cb9-4138-8310-c02f7c6a2941-metrics-certs\") pod \"network-metrics-daemon-zgqdp\" (UID: \"45fb8744-4cb9-4138-8310-c02f7c6a2941\") " pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:14:54 crc kubenswrapper[4751]: E1203 14:14:54.046166 4751 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 14:14:54 crc kubenswrapper[4751]: E1203 14:14:54.046226 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45fb8744-4cb9-4138-8310-c02f7c6a2941-metrics-certs podName:45fb8744-4cb9-4138-8310-c02f7c6a2941 nodeName:}" failed. No retries permitted until 2025-12-03 14:15:58.046211909 +0000 UTC m=+165.034567126 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/45fb8744-4cb9-4138-8310-c02f7c6a2941-metrics-certs") pod "network-metrics-daemon-zgqdp" (UID: "45fb8744-4cb9-4138-8310-c02f7c6a2941") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.047835 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.047869 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.047877 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.047891 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.047899 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:54Z","lastTransitionTime":"2025-12-03T14:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.150050 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.150089 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.150100 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.150112 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.150121 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:54Z","lastTransitionTime":"2025-12-03T14:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.251932 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.252237 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.252366 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.252450 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.252530 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:54Z","lastTransitionTime":"2025-12-03T14:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.354841 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.354899 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.354917 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.354939 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.354957 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:54Z","lastTransitionTime":"2025-12-03T14:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.457514 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.457571 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.457588 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.457610 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.457628 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:54Z","lastTransitionTime":"2025-12-03T14:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.559852 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.559886 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.559916 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.559959 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.559968 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:54Z","lastTransitionTime":"2025-12-03T14:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.662530 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.662585 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.662595 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.662610 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.662620 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:54Z","lastTransitionTime":"2025-12-03T14:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.765219 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.765259 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.765268 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.765283 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.765293 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:54Z","lastTransitionTime":"2025-12-03T14:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.867682 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.867727 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.867760 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.867777 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.867788 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:54Z","lastTransitionTime":"2025-12-03T14:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.970029 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.970062 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.970070 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.970084 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:54 crc kubenswrapper[4751]: I1203 14:14:54.970093 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:54Z","lastTransitionTime":"2025-12-03T14:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.071868 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.071915 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.071929 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.071951 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.071966 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:55Z","lastTransitionTime":"2025-12-03T14:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.174269 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.174315 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.174336 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.174351 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.174361 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:55Z","lastTransitionTime":"2025-12-03T14:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.276959 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.277019 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.277035 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.277077 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.277091 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:55Z","lastTransitionTime":"2025-12-03T14:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.313963 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.314059 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.314005 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.313966 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:14:55 crc kubenswrapper[4751]: E1203 14:14:55.314155 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:14:55 crc kubenswrapper[4751]: E1203 14:14:55.314284 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:14:55 crc kubenswrapper[4751]: E1203 14:14:55.314440 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:14:55 crc kubenswrapper[4751]: E1203 14:14:55.314586 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.379454 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.379487 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.379515 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.379527 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.379535 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:55Z","lastTransitionTime":"2025-12-03T14:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.481427 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.481491 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.481500 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.481513 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.481523 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:55Z","lastTransitionTime":"2025-12-03T14:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.583404 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.583441 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.583455 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.583471 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.583483 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:55Z","lastTransitionTime":"2025-12-03T14:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.686233 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.686274 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.686283 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.686304 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.686315 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:55Z","lastTransitionTime":"2025-12-03T14:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.789343 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.789383 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.789395 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.789427 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.789438 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:55Z","lastTransitionTime":"2025-12-03T14:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.891418 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.891661 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.891785 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.891887 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.891971 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:55Z","lastTransitionTime":"2025-12-03T14:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.993970 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.994019 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.994032 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.994050 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:55 crc kubenswrapper[4751]: I1203 14:14:55.994063 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:55Z","lastTransitionTime":"2025-12-03T14:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.096595 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.096631 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.096640 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.096654 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.096666 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:56Z","lastTransitionTime":"2025-12-03T14:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.199431 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.199684 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.199847 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.199974 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.200077 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:56Z","lastTransitionTime":"2025-12-03T14:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.302812 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.303057 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.303142 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.303211 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.303275 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:56Z","lastTransitionTime":"2025-12-03T14:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.405843 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.405902 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.405912 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.405925 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.405954 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:56Z","lastTransitionTime":"2025-12-03T14:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.508410 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.508445 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.508456 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.508471 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.508483 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:56Z","lastTransitionTime":"2025-12-03T14:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.605883 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.605933 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.605942 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.605959 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.605969 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:56Z","lastTransitionTime":"2025-12-03T14:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:56 crc kubenswrapper[4751]: E1203 14:14:56.620056 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:56Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.623744 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.623803 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.623815 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.623833 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.623844 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:56Z","lastTransitionTime":"2025-12-03T14:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:56 crc kubenswrapper[4751]: E1203 14:14:56.635447 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:56Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.638573 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.638624 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.638637 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.638656 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.638666 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:56Z","lastTransitionTime":"2025-12-03T14:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:56 crc kubenswrapper[4751]: E1203 14:14:56.653046 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:56Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.656432 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.656580 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.656646 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.656716 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.656780 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:56Z","lastTransitionTime":"2025-12-03T14:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:56 crc kubenswrapper[4751]: E1203 14:14:56.670780 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:56Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.675344 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.675394 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.675405 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.675425 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.675438 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:56Z","lastTransitionTime":"2025-12-03T14:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:56 crc kubenswrapper[4751]: E1203 14:14:56.687667 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:14:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bd661df5-edc0-463f-8631-fab82404a306\\\",\\\"systemUUID\\\":\\\"b448da91-3150-4278-a353-292ec92ffaef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:14:56Z is after 2025-08-24T17:21:41Z" Dec 03 14:14:56 crc kubenswrapper[4751]: E1203 14:14:56.687821 4751 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.689686 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.689747 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.689757 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.689773 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.689783 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:56Z","lastTransitionTime":"2025-12-03T14:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.792314 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.792398 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.792415 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.792438 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.792453 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:56Z","lastTransitionTime":"2025-12-03T14:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.894433 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.894479 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.894489 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.894503 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.894514 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:56Z","lastTransitionTime":"2025-12-03T14:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.996558 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.996598 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.996632 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.996649 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:56 crc kubenswrapper[4751]: I1203 14:14:56.996662 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:56Z","lastTransitionTime":"2025-12-03T14:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.099282 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.099321 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.099347 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.099364 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.099373 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:57Z","lastTransitionTime":"2025-12-03T14:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.201794 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.201858 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.201874 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.201891 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.201905 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:57Z","lastTransitionTime":"2025-12-03T14:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.305218 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.305258 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.305272 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.305288 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.305300 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:57Z","lastTransitionTime":"2025-12-03T14:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.313123 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:14:57 crc kubenswrapper[4751]: E1203 14:14:57.313273 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.313627 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.313778 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.313858 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:14:57 crc kubenswrapper[4751]: E1203 14:14:57.314100 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:14:57 crc kubenswrapper[4751]: E1203 14:14:57.314183 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:14:57 crc kubenswrapper[4751]: E1203 14:14:57.314278 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.407566 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.407638 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.407663 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.407691 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.407715 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:57Z","lastTransitionTime":"2025-12-03T14:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.509905 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.509942 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.509950 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.509964 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.509973 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:57Z","lastTransitionTime":"2025-12-03T14:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.612458 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.612997 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.613099 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.613215 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.613287 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:57Z","lastTransitionTime":"2025-12-03T14:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.716143 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.716183 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.716197 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.716219 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.716234 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:57Z","lastTransitionTime":"2025-12-03T14:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.818269 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.818300 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.818308 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.818321 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.818344 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:57Z","lastTransitionTime":"2025-12-03T14:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.921228 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.921279 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.921294 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.921317 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:57 crc kubenswrapper[4751]: I1203 14:14:57.921363 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:57Z","lastTransitionTime":"2025-12-03T14:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.024215 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.024250 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.024257 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.024270 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.024279 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:58Z","lastTransitionTime":"2025-12-03T14:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.126553 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.126588 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.126596 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.126609 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.126619 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:58Z","lastTransitionTime":"2025-12-03T14:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.229547 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.229955 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.230054 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.230184 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.230283 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:58Z","lastTransitionTime":"2025-12-03T14:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.336293 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.336364 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.336377 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.336396 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.336410 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:58Z","lastTransitionTime":"2025-12-03T14:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.438729 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.438764 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.438776 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.438790 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.438802 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:58Z","lastTransitionTime":"2025-12-03T14:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.540921 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.540966 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.540975 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.540987 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.540997 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:58Z","lastTransitionTime":"2025-12-03T14:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.643930 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.644576 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.644678 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.644771 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.644868 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:58Z","lastTransitionTime":"2025-12-03T14:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.747866 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.748140 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.748247 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.748316 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.748416 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:58Z","lastTransitionTime":"2025-12-03T14:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.850749 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.850794 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.850803 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.850819 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.850829 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:58Z","lastTransitionTime":"2025-12-03T14:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.953731 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.954040 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.954139 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.954240 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:58 crc kubenswrapper[4751]: I1203 14:14:58.954354 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:58Z","lastTransitionTime":"2025-12-03T14:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.056940 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.057018 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.057027 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.057087 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.057096 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:59Z","lastTransitionTime":"2025-12-03T14:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.159546 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.159600 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.159617 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.159640 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.159655 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:59Z","lastTransitionTime":"2025-12-03T14:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.262246 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.262341 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.262358 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.262379 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.262388 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:59Z","lastTransitionTime":"2025-12-03T14:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.312999 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.313011 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:14:59 crc kubenswrapper[4751]: E1203 14:14:59.313231 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:14:59 crc kubenswrapper[4751]: E1203 14:14:59.313319 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.313643 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.313757 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:14:59 crc kubenswrapper[4751]: E1203 14:14:59.313926 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:14:59 crc kubenswrapper[4751]: E1203 14:14:59.314118 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.365623 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.365691 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.365703 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.365724 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.365736 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:59Z","lastTransitionTime":"2025-12-03T14:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.468362 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.468669 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.468766 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.468850 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.468942 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:59Z","lastTransitionTime":"2025-12-03T14:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.571458 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.571497 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.571528 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.571558 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.571570 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:59Z","lastTransitionTime":"2025-12-03T14:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.674012 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.674056 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.674069 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.674085 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.674123 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:59Z","lastTransitionTime":"2025-12-03T14:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.776169 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.776210 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.776219 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.776236 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.776247 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:59Z","lastTransitionTime":"2025-12-03T14:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.878955 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.878987 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.878996 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.879009 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.879019 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:59Z","lastTransitionTime":"2025-12-03T14:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.980799 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.980842 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.980855 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.980872 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:14:59 crc kubenswrapper[4751]: I1203 14:14:59.980886 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:14:59Z","lastTransitionTime":"2025-12-03T14:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.083121 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.083160 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.083171 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.083186 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.083197 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:00Z","lastTransitionTime":"2025-12-03T14:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.185282 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.185350 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.185360 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.185375 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.185388 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:00Z","lastTransitionTime":"2025-12-03T14:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.287869 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.287908 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.287931 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.287945 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.287956 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:00Z","lastTransitionTime":"2025-12-03T14:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.390408 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.390456 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.390490 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.390508 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.390517 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:00Z","lastTransitionTime":"2025-12-03T14:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.493170 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.493220 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.493231 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.493249 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.493260 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:00Z","lastTransitionTime":"2025-12-03T14:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.596102 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.596471 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.597081 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.597201 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.597293 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:00Z","lastTransitionTime":"2025-12-03T14:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.699881 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.699921 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.699930 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.699977 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.700012 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:00Z","lastTransitionTime":"2025-12-03T14:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.802614 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.802659 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.802674 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.802692 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.802702 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:00Z","lastTransitionTime":"2025-12-03T14:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.905352 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.905396 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.905408 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.905423 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:00 crc kubenswrapper[4751]: I1203 14:15:00.905437 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:00Z","lastTransitionTime":"2025-12-03T14:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.007470 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.007508 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.007516 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.007530 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.007539 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:01Z","lastTransitionTime":"2025-12-03T14:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.109964 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.110015 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.110030 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.110049 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.110062 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:01Z","lastTransitionTime":"2025-12-03T14:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.212848 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.212908 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.212926 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.212949 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.212964 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:01Z","lastTransitionTime":"2025-12-03T14:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.313829 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.313829 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.313988 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:15:01 crc kubenswrapper[4751]: E1203 14:15:01.314067 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:15:01 crc kubenswrapper[4751]: E1203 14:15:01.313991 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.314132 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:15:01 crc kubenswrapper[4751]: E1203 14:15:01.314246 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:15:01 crc kubenswrapper[4751]: E1203 14:15:01.314576 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.315417 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.315452 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.315464 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.315480 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.315492 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:01Z","lastTransitionTime":"2025-12-03T14:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.418568 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.418615 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.418626 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.418644 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.418656 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:01Z","lastTransitionTime":"2025-12-03T14:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.520716 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.521006 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.521139 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.521299 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.521451 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:01Z","lastTransitionTime":"2025-12-03T14:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.625037 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.625083 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.625095 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.625112 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.625130 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:01Z","lastTransitionTime":"2025-12-03T14:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.728763 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.728819 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.728834 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.728854 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.728869 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:01Z","lastTransitionTime":"2025-12-03T14:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.831174 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.831220 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.831234 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.831253 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.831267 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:01Z","lastTransitionTime":"2025-12-03T14:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.935157 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.935236 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.935247 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.935284 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:01 crc kubenswrapper[4751]: I1203 14:15:01.935297 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:01Z","lastTransitionTime":"2025-12-03T14:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.038280 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.038318 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.038345 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.038361 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.038372 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:02Z","lastTransitionTime":"2025-12-03T14:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.140896 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.140933 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.140944 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.140960 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.140972 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:02Z","lastTransitionTime":"2025-12-03T14:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.244062 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.244101 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.244112 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.244128 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.244140 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:02Z","lastTransitionTime":"2025-12-03T14:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.346645 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.346691 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.346700 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.346719 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.346730 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:02Z","lastTransitionTime":"2025-12-03T14:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.450554 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.450642 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.450671 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.450716 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.450758 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:02Z","lastTransitionTime":"2025-12-03T14:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.553927 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.553970 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.553980 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.553996 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.554007 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:02Z","lastTransitionTime":"2025-12-03T14:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.657344 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.657382 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.657393 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.657408 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.657421 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:02Z","lastTransitionTime":"2025-12-03T14:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.761075 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.761157 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.761178 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.761231 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.761259 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:02Z","lastTransitionTime":"2025-12-03T14:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.864025 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.864086 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.864104 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.864128 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.864149 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:02Z","lastTransitionTime":"2025-12-03T14:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.967251 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.967644 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.967797 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.967934 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:02 crc kubenswrapper[4751]: I1203 14:15:02.968072 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:02Z","lastTransitionTime":"2025-12-03T14:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.070571 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.070847 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.070943 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.071076 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.071154 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:03Z","lastTransitionTime":"2025-12-03T14:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.174094 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.174160 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.174178 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.174721 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.174764 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:03Z","lastTransitionTime":"2025-12-03T14:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.277604 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.277641 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.277650 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.277671 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.277679 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:03Z","lastTransitionTime":"2025-12-03T14:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.313749 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.313829 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.314144 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.314162 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:15:03 crc kubenswrapper[4751]: E1203 14:15:03.314500 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:15:03 crc kubenswrapper[4751]: E1203 14:15:03.314633 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:15:03 crc kubenswrapper[4751]: E1203 14:15:03.314778 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:15:03 crc kubenswrapper[4751]: E1203 14:15:03.314926 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.362384 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-t8q27" podStartSLOduration=88.362290382 podStartE2EDuration="1m28.362290382s" podCreationTimestamp="2025-12-03 14:13:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:03.362259541 +0000 UTC m=+110.350614768" watchObservedRunningTime="2025-12-03 14:15:03.362290382 +0000 UTC m=+110.350645639" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.380064 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.380369 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.380451 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.380553 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.380649 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:03Z","lastTransitionTime":"2025-12-03T14:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.383171 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dp4lz" podStartSLOduration=87.383152489 podStartE2EDuration="1m27.383152489s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:03.383079867 +0000 UTC m=+110.371435084" watchObservedRunningTime="2025-12-03 14:15:03.383152489 +0000 UTC m=+110.371507706" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.406939 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=26.406921213 podStartE2EDuration="26.406921213s" podCreationTimestamp="2025-12-03 14:14:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:03.406480692 +0000 UTC m=+110.394835929" watchObservedRunningTime="2025-12-03 14:15:03.406921213 +0000 UTC m=+110.395276430" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.483928 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.483963 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.483974 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.483990 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.484001 4751 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:03Z","lastTransitionTime":"2025-12-03T14:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.513091 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podStartSLOduration=88.51307055 podStartE2EDuration="1m28.51307055s" podCreationTimestamp="2025-12-03 14:13:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:03.459586416 +0000 UTC m=+110.447941643" watchObservedRunningTime="2025-12-03 14:15:03.51307055 +0000 UTC m=+110.501425767" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.550214 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=91.550192384 podStartE2EDuration="1m31.550192384s" podCreationTimestamp="2025-12-03 14:13:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:03.538002214 +0000 UTC m=+110.526357431" watchObservedRunningTime="2025-12-03 14:15:03.550192384 +0000 UTC m=+110.538547601" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.583770 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hmjzc" podStartSLOduration=88.583753365 podStartE2EDuration="1m28.583753365s" podCreationTimestamp="2025-12-03 14:13:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-03 14:15:03.583480838 +0000 UTC m=+110.571836055" watchObservedRunningTime="2025-12-03 14:15:03.583753365 +0000 UTC m=+110.572108572" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.585659 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.585786 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.585947 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.586114 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.586262 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:03Z","lastTransitionTime":"2025-12-03T14:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.615624 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-98mjq" podStartSLOduration=88.615601681 podStartE2EDuration="1m28.615601681s" podCreationTimestamp="2025-12-03 14:13:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:03.597686161 +0000 UTC m=+110.586041398" watchObservedRunningTime="2025-12-03 14:15:03.615601681 +0000 UTC m=+110.603956908" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.615767 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=60.615762116 podStartE2EDuration="1m0.615762116s" podCreationTimestamp="2025-12-03 14:14:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:03.615006296 +0000 UTC m=+110.603361513" watchObservedRunningTime="2025-12-03 14:15:03.615762116 +0000 UTC m=+110.604117343" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.635496 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=85.635479383 podStartE2EDuration="1m25.635479383s" podCreationTimestamp="2025-12-03 14:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:03.633419289 +0000 UTC m=+110.621774506" watchObservedRunningTime="2025-12-03 14:15:03.635479383 +0000 UTC m=+110.623834600" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.658206 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=88.658185809 
podStartE2EDuration="1m28.658185809s" podCreationTimestamp="2025-12-03 14:13:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:03.65708787 +0000 UTC m=+110.645443087" watchObservedRunningTime="2025-12-03 14:15:03.658185809 +0000 UTC m=+110.646541026" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.667709 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-g8hvv" podStartSLOduration=88.667690249 podStartE2EDuration="1m28.667690249s" podCreationTimestamp="2025-12-03 14:13:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:03.667085993 +0000 UTC m=+110.655441220" watchObservedRunningTime="2025-12-03 14:15:03.667690249 +0000 UTC m=+110.656045466" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.689304 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.689369 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.689383 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.689401 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.689413 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:03Z","lastTransitionTime":"2025-12-03T14:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.792357 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.792850 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.792918 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.792989 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.793060 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:03Z","lastTransitionTime":"2025-12-03T14:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.895371 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.895410 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.895419 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.895432 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.895442 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:03Z","lastTransitionTime":"2025-12-03T14:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.998242 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.998285 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.998294 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.998308 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:03 crc kubenswrapper[4751]: I1203 14:15:03.998319 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:03Z","lastTransitionTime":"2025-12-03T14:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.100492 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.100528 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.100536 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.100549 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.100559 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:04Z","lastTransitionTime":"2025-12-03T14:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.202922 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.202951 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.202959 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.202971 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.202980 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:04Z","lastTransitionTime":"2025-12-03T14:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.305709 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.305757 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.305768 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.305784 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.305796 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:04Z","lastTransitionTime":"2025-12-03T14:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.313729 4751 scope.go:117] "RemoveContainer" containerID="fa18a539c068f5e82a4f3159bd0857ba914bb6213cd1c585b6eebf22aeb7be3f" Dec 03 14:15:04 crc kubenswrapper[4751]: E1203 14:15:04.313878 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kwchh_openshift-ovn-kubernetes(a5526cae-f2a4-4094-a08a-fbf69cb11593)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.409116 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.409168 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.409182 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.409199 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.409213 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:04Z","lastTransitionTime":"2025-12-03T14:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.511158 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.511235 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.511260 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.511288 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.511311 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:04Z","lastTransitionTime":"2025-12-03T14:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.613943 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.614287 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.614471 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.614616 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.614707 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:04Z","lastTransitionTime":"2025-12-03T14:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.718023 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.718086 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.718098 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.718114 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.718125 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:04Z","lastTransitionTime":"2025-12-03T14:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.820184 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.820708 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.820776 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.820848 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.820905 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:04Z","lastTransitionTime":"2025-12-03T14:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.923289 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.923406 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.923468 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.923499 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:04 crc kubenswrapper[4751]: I1203 14:15:04.923522 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:04Z","lastTransitionTime":"2025-12-03T14:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.026069 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.026101 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.026111 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.026125 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.026135 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:05Z","lastTransitionTime":"2025-12-03T14:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.129129 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.129176 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.129190 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.129207 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.129221 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:05Z","lastTransitionTime":"2025-12-03T14:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.232014 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.232102 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.232118 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.232139 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.232159 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:05Z","lastTransitionTime":"2025-12-03T14:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.313188 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.313240 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.313265 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.313219 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:15:05 crc kubenswrapper[4751]: E1203 14:15:05.313378 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:15:05 crc kubenswrapper[4751]: E1203 14:15:05.313465 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:15:05 crc kubenswrapper[4751]: E1203 14:15:05.313529 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:15:05 crc kubenswrapper[4751]: E1203 14:15:05.313585 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.334106 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.334168 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.334179 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.334195 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.334208 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:05Z","lastTransitionTime":"2025-12-03T14:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.436283 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.436312 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.436320 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.436356 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.436372 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:05Z","lastTransitionTime":"2025-12-03T14:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.539106 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.539154 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.539169 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.539188 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.539204 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:05Z","lastTransitionTime":"2025-12-03T14:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.642216 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.642253 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.642263 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.642279 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.642289 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:05Z","lastTransitionTime":"2025-12-03T14:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.744695 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.744753 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.744763 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.744776 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.744785 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:05Z","lastTransitionTime":"2025-12-03T14:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.847290 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.847356 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.847369 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.847384 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.847395 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:05Z","lastTransitionTime":"2025-12-03T14:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.949787 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.949830 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.949838 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.949853 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:05 crc kubenswrapper[4751]: I1203 14:15:05.949864 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:05Z","lastTransitionTime":"2025-12-03T14:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.052268 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.052318 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.052359 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.052380 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.052397 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:06Z","lastTransitionTime":"2025-12-03T14:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.154318 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.154381 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.154392 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.154409 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.154420 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:06Z","lastTransitionTime":"2025-12-03T14:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.256955 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.257040 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.257054 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.257071 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.257082 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:06Z","lastTransitionTime":"2025-12-03T14:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.359032 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.359088 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.359104 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.359128 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.359144 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:06Z","lastTransitionTime":"2025-12-03T14:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.461674 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.461710 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.461720 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.461733 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.461743 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:06Z","lastTransitionTime":"2025-12-03T14:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.564763 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.564814 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.564823 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.564845 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.564857 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:06Z","lastTransitionTime":"2025-12-03T14:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.667436 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.667799 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.667810 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.667824 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.667852 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:06Z","lastTransitionTime":"2025-12-03T14:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.770157 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.770225 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.770243 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.770271 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.770290 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:06Z","lastTransitionTime":"2025-12-03T14:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.872618 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.872673 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.872689 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.872709 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.872724 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:06Z","lastTransitionTime":"2025-12-03T14:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.883248 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.883285 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.883296 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.883310 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.883338 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:15:06Z","lastTransitionTime":"2025-12-03T14:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.927518 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-9xxgd"] Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.927897 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9xxgd" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.930382 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.930469 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.930549 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 03 14:15:06 crc kubenswrapper[4751]: I1203 14:15:06.931255 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 03 14:15:07 crc kubenswrapper[4751]: I1203 14:15:07.077974 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/98473df1-27ea-4ea4-a253-d279b742e355-service-ca\") pod \"cluster-version-operator-5c965bbfc6-9xxgd\" (UID: \"98473df1-27ea-4ea4-a253-d279b742e355\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9xxgd" Dec 03 14:15:07 crc kubenswrapper[4751]: I1203 14:15:07.078029 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/98473df1-27ea-4ea4-a253-d279b742e355-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-9xxgd\" (UID: \"98473df1-27ea-4ea4-a253-d279b742e355\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9xxgd" Dec 03 14:15:07 crc kubenswrapper[4751]: I1203 14:15:07.078051 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/98473df1-27ea-4ea4-a253-d279b742e355-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-9xxgd\" (UID: \"98473df1-27ea-4ea4-a253-d279b742e355\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9xxgd" Dec 03 14:15:07 crc kubenswrapper[4751]: I1203 14:15:07.078083 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98473df1-27ea-4ea4-a253-d279b742e355-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-9xxgd\" (UID: \"98473df1-27ea-4ea4-a253-d279b742e355\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9xxgd" Dec 03 14:15:07 crc kubenswrapper[4751]: I1203 14:15:07.078102 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/98473df1-27ea-4ea4-a253-d279b742e355-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-9xxgd\" (UID: \"98473df1-27ea-4ea4-a253-d279b742e355\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9xxgd" Dec 03 14:15:07 crc kubenswrapper[4751]: I1203 14:15:07.179541 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/98473df1-27ea-4ea4-a253-d279b742e355-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-9xxgd\" (UID: \"98473df1-27ea-4ea4-a253-d279b742e355\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9xxgd" Dec 03 14:15:07 crc kubenswrapper[4751]: I1203 14:15:07.179589 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/98473df1-27ea-4ea4-a253-d279b742e355-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-9xxgd\" (UID: \"98473df1-27ea-4ea4-a253-d279b742e355\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9xxgd" Dec 03 14:15:07 crc kubenswrapper[4751]: I1203 14:15:07.179631 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98473df1-27ea-4ea4-a253-d279b742e355-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-9xxgd\" (UID: \"98473df1-27ea-4ea4-a253-d279b742e355\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9xxgd" Dec 03 14:15:07 crc kubenswrapper[4751]: I1203 14:15:07.179654 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/98473df1-27ea-4ea4-a253-d279b742e355-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-9xxgd\" (UID: \"98473df1-27ea-4ea4-a253-d279b742e355\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9xxgd" Dec 03 14:15:07 crc kubenswrapper[4751]: I1203 14:15:07.179691 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/98473df1-27ea-4ea4-a253-d279b742e355-service-ca\") pod \"cluster-version-operator-5c965bbfc6-9xxgd\" (UID: \"98473df1-27ea-4ea4-a253-d279b742e355\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9xxgd" Dec 03 14:15:07 crc kubenswrapper[4751]: I1203 14:15:07.179833 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/98473df1-27ea-4ea4-a253-d279b742e355-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-9xxgd\" (UID: \"98473df1-27ea-4ea4-a253-d279b742e355\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9xxgd" Dec 03 14:15:07 crc kubenswrapper[4751]: I1203 14:15:07.179837 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/98473df1-27ea-4ea4-a253-d279b742e355-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-9xxgd\" (UID: \"98473df1-27ea-4ea4-a253-d279b742e355\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9xxgd" Dec 03 14:15:07 crc kubenswrapper[4751]: I1203 14:15:07.180654 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/98473df1-27ea-4ea4-a253-d279b742e355-service-ca\") pod \"cluster-version-operator-5c965bbfc6-9xxgd\" (UID: \"98473df1-27ea-4ea4-a253-d279b742e355\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9xxgd" Dec 03 14:15:07 crc kubenswrapper[4751]: I1203 14:15:07.184937 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98473df1-27ea-4ea4-a253-d279b742e355-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-9xxgd\" (UID: \"98473df1-27ea-4ea4-a253-d279b742e355\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9xxgd" Dec 03 14:15:07 crc kubenswrapper[4751]: I1203 14:15:07.198994 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/98473df1-27ea-4ea4-a253-d279b742e355-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-9xxgd\" (UID: \"98473df1-27ea-4ea4-a253-d279b742e355\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9xxgd" Dec 03 14:15:07 crc kubenswrapper[4751]: I1203 14:15:07.241946 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9xxgd" Dec 03 14:15:07 crc kubenswrapper[4751]: I1203 14:15:07.312997 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:15:07 crc kubenswrapper[4751]: I1203 14:15:07.313018 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:15:07 crc kubenswrapper[4751]: I1203 14:15:07.313056 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:15:07 crc kubenswrapper[4751]: E1203 14:15:07.313128 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:15:07 crc kubenswrapper[4751]: I1203 14:15:07.313265 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:15:07 crc kubenswrapper[4751]: E1203 14:15:07.313391 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:15:07 crc kubenswrapper[4751]: E1203 14:15:07.313502 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:15:07 crc kubenswrapper[4751]: E1203 14:15:07.313605 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:15:07 crc kubenswrapper[4751]: I1203 14:15:07.846427 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9xxgd" event={"ID":"98473df1-27ea-4ea4-a253-d279b742e355","Type":"ContainerStarted","Data":"a375ede6e833b0ce4ac1b73c6565a0919e0ec7f3d21b39015ba4d4fafd8ed549"} Dec 03 14:15:07 crc kubenswrapper[4751]: I1203 14:15:07.846482 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9xxgd" event={"ID":"98473df1-27ea-4ea4-a253-d279b742e355","Type":"ContainerStarted","Data":"34f3bf6f666fbaa0f1b38174007a9519baf2e8a9bef47e7b8eb5fc5266ba6db8"} Dec 03 14:15:07 crc kubenswrapper[4751]: I1203 14:15:07.859786 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9xxgd" podStartSLOduration=91.859765504 podStartE2EDuration="1m31.859765504s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:07.858575742 +0000 UTC m=+114.846930969" watchObservedRunningTime="2025-12-03 14:15:07.859765504 +0000 UTC m=+114.848120721" Dec 03 14:15:09 crc kubenswrapper[4751]: I1203 14:15:09.313208 4751 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:15:09 crc kubenswrapper[4751]: I1203 14:15:09.313652 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:15:09 crc kubenswrapper[4751]: I1203 14:15:09.313664 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:15:09 crc kubenswrapper[4751]: E1203 14:15:09.313755 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:15:09 crc kubenswrapper[4751]: I1203 14:15:09.314028 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:15:09 crc kubenswrapper[4751]: E1203 14:15:09.314124 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:15:09 crc kubenswrapper[4751]: E1203 14:15:09.314205 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:15:09 crc kubenswrapper[4751]: E1203 14:15:09.314305 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:15:09 crc kubenswrapper[4751]: I1203 14:15:09.854176 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-98mjq_6a216adb-632d-4134-8c61-61fe6b8c5f71/kube-multus/1.log" Dec 03 14:15:09 crc kubenswrapper[4751]: I1203 14:15:09.854880 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-98mjq_6a216adb-632d-4134-8c61-61fe6b8c5f71/kube-multus/0.log" Dec 03 14:15:09 crc kubenswrapper[4751]: I1203 14:15:09.854977 4751 generic.go:334] "Generic (PLEG): container finished" podID="6a216adb-632d-4134-8c61-61fe6b8c5f71" containerID="f26c96913955bb014b1ac71389acea8daeb976964dea908c649f382d5e688801" exitCode=1 Dec 03 14:15:09 crc kubenswrapper[4751]: I1203 14:15:09.855025 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-98mjq" 
event={"ID":"6a216adb-632d-4134-8c61-61fe6b8c5f71","Type":"ContainerDied","Data":"f26c96913955bb014b1ac71389acea8daeb976964dea908c649f382d5e688801"} Dec 03 14:15:09 crc kubenswrapper[4751]: I1203 14:15:09.855075 4751 scope.go:117] "RemoveContainer" containerID="32f1ac76cc7a6f8dfad24073aa9b0337bff1cfdd2e290f585409ba2a69fb9815" Dec 03 14:15:09 crc kubenswrapper[4751]: I1203 14:15:09.855769 4751 scope.go:117] "RemoveContainer" containerID="f26c96913955bb014b1ac71389acea8daeb976964dea908c649f382d5e688801" Dec 03 14:15:09 crc kubenswrapper[4751]: E1203 14:15:09.856100 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-98mjq_openshift-multus(6a216adb-632d-4134-8c61-61fe6b8c5f71)\"" pod="openshift-multus/multus-98mjq" podUID="6a216adb-632d-4134-8c61-61fe6b8c5f71" Dec 03 14:15:10 crc kubenswrapper[4751]: I1203 14:15:10.859741 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-98mjq_6a216adb-632d-4134-8c61-61fe6b8c5f71/kube-multus/1.log" Dec 03 14:15:11 crc kubenswrapper[4751]: I1203 14:15:11.313565 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:15:11 crc kubenswrapper[4751]: I1203 14:15:11.313590 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:15:11 crc kubenswrapper[4751]: I1203 14:15:11.313631 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:15:11 crc kubenswrapper[4751]: E1203 14:15:11.313681 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:15:11 crc kubenswrapper[4751]: E1203 14:15:11.313772 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:15:11 crc kubenswrapper[4751]: E1203 14:15:11.313940 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:15:11 crc kubenswrapper[4751]: I1203 14:15:11.313996 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:15:11 crc kubenswrapper[4751]: E1203 14:15:11.314047 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:15:13 crc kubenswrapper[4751]: I1203 14:15:13.313856 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:15:13 crc kubenswrapper[4751]: I1203 14:15:13.313860 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:15:13 crc kubenswrapper[4751]: I1203 14:15:13.313909 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:15:13 crc kubenswrapper[4751]: I1203 14:15:13.313961 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:15:13 crc kubenswrapper[4751]: E1203 14:15:13.314771 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:15:13 crc kubenswrapper[4751]: E1203 14:15:13.314899 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:15:13 crc kubenswrapper[4751]: E1203 14:15:13.314966 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:15:13 crc kubenswrapper[4751]: E1203 14:15:13.315059 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:15:13 crc kubenswrapper[4751]: E1203 14:15:13.332736 4751 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 03 14:15:13 crc kubenswrapper[4751]: E1203 14:15:13.405013 4751 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 03 14:15:15 crc kubenswrapper[4751]: I1203 14:15:15.313863 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:15:15 crc kubenswrapper[4751]: I1203 14:15:15.314033 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:15:15 crc kubenswrapper[4751]: I1203 14:15:15.314107 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:15:15 crc kubenswrapper[4751]: E1203 14:15:15.314101 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:15:15 crc kubenswrapper[4751]: I1203 14:15:15.314195 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:15:15 crc kubenswrapper[4751]: E1203 14:15:15.314195 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:15:15 crc kubenswrapper[4751]: E1203 14:15:15.314425 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:15:15 crc kubenswrapper[4751]: E1203 14:15:15.314608 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:15:17 crc kubenswrapper[4751]: I1203 14:15:17.313436 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:15:17 crc kubenswrapper[4751]: E1203 14:15:17.313557 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:15:17 crc kubenswrapper[4751]: I1203 14:15:17.313708 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:15:17 crc kubenswrapper[4751]: E1203 14:15:17.313751 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:15:17 crc kubenswrapper[4751]: I1203 14:15:17.313841 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:15:17 crc kubenswrapper[4751]: E1203 14:15:17.313882 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:15:17 crc kubenswrapper[4751]: I1203 14:15:17.313966 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:15:17 crc kubenswrapper[4751]: E1203 14:15:17.314013 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:15:18 crc kubenswrapper[4751]: I1203 14:15:18.314269 4751 scope.go:117] "RemoveContainer" containerID="fa18a539c068f5e82a4f3159bd0857ba914bb6213cd1c585b6eebf22aeb7be3f" Dec 03 14:15:18 crc kubenswrapper[4751]: E1203 14:15:18.406026 4751 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 14:15:18 crc kubenswrapper[4751]: I1203 14:15:18.883723 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwchh_a5526cae-f2a4-4094-a08a-fbf69cb11593/ovnkube-controller/3.log" Dec 03 14:15:18 crc kubenswrapper[4751]: I1203 14:15:18.885892 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" event={"ID":"a5526cae-f2a4-4094-a08a-fbf69cb11593","Type":"ContainerStarted","Data":"21804d9d39fcedb6c0cbabe7c8de6d85feba3d2047448ebce8dbf7b1af5efde0"} Dec 03 14:15:18 crc kubenswrapper[4751]: I1203 14:15:18.886257 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:15:18 crc kubenswrapper[4751]: I1203 14:15:18.911804 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" podStartSLOduration=102.911787296 podStartE2EDuration="1m42.911787296s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:18.910386909 +0000 UTC m=+125.898742136" watchObservedRunningTime="2025-12-03 14:15:18.911787296 +0000 UTC m=+125.900142513" Dec 03 14:15:19 crc kubenswrapper[4751]: I1203 14:15:19.095076 4751 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zgqdp"] Dec 03 14:15:19 crc kubenswrapper[4751]: I1203 14:15:19.095205 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:15:19 crc kubenswrapper[4751]: E1203 14:15:19.095285 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:15:19 crc kubenswrapper[4751]: I1203 14:15:19.313808 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:15:19 crc kubenswrapper[4751]: E1203 14:15:19.313953 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:15:19 crc kubenswrapper[4751]: I1203 14:15:19.314178 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:15:19 crc kubenswrapper[4751]: E1203 14:15:19.314239 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:15:19 crc kubenswrapper[4751]: I1203 14:15:19.314413 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:15:19 crc kubenswrapper[4751]: E1203 14:15:19.314476 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:15:21 crc kubenswrapper[4751]: I1203 14:15:21.313463 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:15:21 crc kubenswrapper[4751]: I1203 14:15:21.313575 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:15:21 crc kubenswrapper[4751]: I1203 14:15:21.313642 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:15:21 crc kubenswrapper[4751]: E1203 14:15:21.313692 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:15:21 crc kubenswrapper[4751]: I1203 14:15:21.313582 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:15:21 crc kubenswrapper[4751]: E1203 14:15:21.313828 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:15:21 crc kubenswrapper[4751]: E1203 14:15:21.313960 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:15:21 crc kubenswrapper[4751]: E1203 14:15:21.314107 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:15:22 crc kubenswrapper[4751]: I1203 14:15:22.314458 4751 scope.go:117] "RemoveContainer" containerID="f26c96913955bb014b1ac71389acea8daeb976964dea908c649f382d5e688801" Dec 03 14:15:22 crc kubenswrapper[4751]: I1203 14:15:22.899245 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-98mjq_6a216adb-632d-4134-8c61-61fe6b8c5f71/kube-multus/1.log" Dec 03 14:15:22 crc kubenswrapper[4751]: I1203 14:15:22.899626 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-98mjq" event={"ID":"6a216adb-632d-4134-8c61-61fe6b8c5f71","Type":"ContainerStarted","Data":"f3ddb4e9890a3a646a504a20509bca72cba34c27e51e9ed333c969148058d81a"} Dec 03 14:15:23 crc kubenswrapper[4751]: I1203 14:15:23.313452 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:15:23 crc kubenswrapper[4751]: I1203 14:15:23.313470 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:15:23 crc kubenswrapper[4751]: I1203 14:15:23.313453 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:15:23 crc kubenswrapper[4751]: I1203 14:15:23.314874 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:15:23 crc kubenswrapper[4751]: E1203 14:15:23.314904 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:15:23 crc kubenswrapper[4751]: E1203 14:15:23.314984 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:15:23 crc kubenswrapper[4751]: E1203 14:15:23.315052 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:15:23 crc kubenswrapper[4751]: E1203 14:15:23.315639 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:15:23 crc kubenswrapper[4751]: E1203 14:15:23.407460 4751 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 14:15:25 crc kubenswrapper[4751]: I1203 14:15:25.313205 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:15:25 crc kubenswrapper[4751]: I1203 14:15:25.313257 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:15:25 crc kubenswrapper[4751]: I1203 14:15:25.313311 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:15:25 crc kubenswrapper[4751]: I1203 14:15:25.313368 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:15:25 crc kubenswrapper[4751]: E1203 14:15:25.313389 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:15:25 crc kubenswrapper[4751]: E1203 14:15:25.313503 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:15:25 crc kubenswrapper[4751]: E1203 14:15:25.313635 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:15:25 crc kubenswrapper[4751]: E1203 14:15:25.313696 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:15:27 crc kubenswrapper[4751]: I1203 14:15:27.313062 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:15:27 crc kubenswrapper[4751]: E1203 14:15:27.313227 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zgqdp" podUID="45fb8744-4cb9-4138-8310-c02f7c6a2941" Dec 03 14:15:27 crc kubenswrapper[4751]: I1203 14:15:27.313312 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:15:27 crc kubenswrapper[4751]: I1203 14:15:27.313395 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:15:27 crc kubenswrapper[4751]: E1203 14:15:27.313462 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:15:27 crc kubenswrapper[4751]: I1203 14:15:27.313346 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:15:27 crc kubenswrapper[4751]: E1203 14:15:27.313679 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:15:27 crc kubenswrapper[4751]: E1203 14:15:27.313805 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:15:29 crc kubenswrapper[4751]: I1203 14:15:29.313828 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:15:29 crc kubenswrapper[4751]: I1203 14:15:29.313878 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:15:29 crc kubenswrapper[4751]: I1203 14:15:29.313976 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:15:29 crc kubenswrapper[4751]: I1203 14:15:29.314106 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:15:29 crc kubenswrapper[4751]: I1203 14:15:29.317695 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 03 14:15:29 crc kubenswrapper[4751]: I1203 14:15:29.319010 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 03 14:15:29 crc kubenswrapper[4751]: I1203 14:15:29.319377 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 03 14:15:29 crc kubenswrapper[4751]: I1203 14:15:29.319426 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 03 14:15:29 crc kubenswrapper[4751]: I1203 14:15:29.319545 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 03 14:15:29 crc kubenswrapper[4751]: I1203 14:15:29.323368 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 03 14:15:33 crc kubenswrapper[4751]: I1203 14:15:33.806771 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.611589 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.662400 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fwp52"] Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 
14:15:37.662962 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fwp52" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.663024 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zcq45"] Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.663401 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.664782 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-2n4v9"] Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.665149 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-2n4v9" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.669411 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 14:15:37 crc kubenswrapper[4751]: W1203 14:15:37.670677 4751 reflector.go:561] object-"openshift-authentication"/"v4-0-config-user-template-login": failed to list *v1.Secret: secrets "v4-0-config-user-template-login" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Dec 03 14:15:37 crc kubenswrapper[4751]: E1203 14:15:37.670805 4751 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-user-template-login\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-user-template-login\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" 
logger="UnhandledError" Dec 03 14:15:37 crc kubenswrapper[4751]: W1203 14:15:37.670945 4751 reflector.go:561] object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data": failed to list *v1.Secret: secrets "v4-0-config-user-idp-0-file-data" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Dec 03 14:15:37 crc kubenswrapper[4751]: E1203 14:15:37.671025 4751 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-user-idp-0-file-data\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-user-idp-0-file-data\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 14:15:37 crc kubenswrapper[4751]: W1203 14:15:37.671182 4751 reflector.go:561] object-"openshift-authentication"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Dec 03 14:15:37 crc kubenswrapper[4751]: E1203 14:15:37.671282 4751 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.671420 4751 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"client-ca" Dec 03 14:15:37 crc kubenswrapper[4751]: W1203 14:15:37.671635 4751 reflector.go:561] object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc": failed to list *v1.Secret: secrets "oauth-openshift-dockercfg-znhcc" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Dec 03 14:15:37 crc kubenswrapper[4751]: E1203 14:15:37.671724 4751 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"oauth-openshift-dockercfg-znhcc\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"oauth-openshift-dockercfg-znhcc\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 14:15:37 crc kubenswrapper[4751]: W1203 14:15:37.671832 4751 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-session": failed to list *v1.Secret: secrets "v4-0-config-system-session" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Dec 03 14:15:37 crc kubenswrapper[4751]: E1203 14:15:37.671905 4751 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-session\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-system-session\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 14:15:37 crc kubenswrapper[4751]: W1203 14:15:37.672048 4751 reflector.go:561] object-"openshift-authentication"/"kube-root-ca.crt": failed to 
list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Dec 03 14:15:37 crc kubenswrapper[4751]: E1203 14:15:37.672139 4751 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 14:15:37 crc kubenswrapper[4751]: W1203 14:15:37.672262 4751 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle": failed to list *v1.ConfigMap: configmaps "v4-0-config-system-trusted-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Dec 03 14:15:37 crc kubenswrapper[4751]: E1203 14:15:37.672357 4751 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"v4-0-config-system-trusted-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.672695 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 14:15:37 crc kubenswrapper[4751]: W1203 14:15:37.672685 4751 reflector.go:561] 
object-"openshift-authentication"/"v4-0-config-user-template-error": failed to list *v1.Secret: secrets "v4-0-config-user-template-error" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Dec 03 14:15:37 crc kubenswrapper[4751]: W1203 14:15:37.672973 4751 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-service-ca": failed to list *v1.ConfigMap: configmaps "v4-0-config-system-service-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Dec 03 14:15:37 crc kubenswrapper[4751]: W1203 14:15:37.672986 4751 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-tls": failed to list *v1.Secret: secrets "machine-api-operator-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Dec 03 14:15:37 crc kubenswrapper[4751]: E1203 14:15:37.673010 4751 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-user-template-error\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-user-template-error\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 14:15:37 crc kubenswrapper[4751]: E1203 14:15:37.673033 4751 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-service-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"v4-0-config-system-service-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API 
group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.672790 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 14:15:37 crc kubenswrapper[4751]: E1203 14:15:37.673090 4751 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-api-operator-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 14:15:37 crc kubenswrapper[4751]: W1203 14:15:37.672814 4751 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-serving-cert": failed to list *v1.Secret: secrets "v4-0-config-system-serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Dec 03 14:15:37 crc kubenswrapper[4751]: E1203 14:15:37.673142 4751 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-system-serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 14:15:37 crc kubenswrapper[4751]: W1203 14:15:37.672862 4751 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-cliconfig": failed to list *v1.ConfigMap: configmaps "v4-0-config-system-cliconfig" is forbidden: User "system:node:crc" cannot list resource 
"configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Dec 03 14:15:37 crc kubenswrapper[4751]: E1203 14:15:37.673208 4751 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-cliconfig\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"v4-0-config-system-cliconfig\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 14:15:37 crc kubenswrapper[4751]: W1203 14:15:37.672913 4751 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template": failed to list *v1.Secret: secrets "v4-0-config-system-ocp-branding-template" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Dec 03 14:15:37 crc kubenswrapper[4751]: E1203 14:15:37.673260 4751 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-ocp-branding-template\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-system-ocp-branding-template\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 14:15:37 crc kubenswrapper[4751]: W1203 14:15:37.672928 4751 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7": failed to list *v1.Secret: secrets "machine-api-operator-dockercfg-mfbb7" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object 
Dec 03 14:15:37 crc kubenswrapper[4751]: E1203 14:15:37.673305 4751 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-dockercfg-mfbb7\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-api-operator-dockercfg-mfbb7\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 14:15:37 crc kubenswrapper[4751]: W1203 14:15:37.672944 4751 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-images": failed to list *v1.ConfigMap: configmaps "machine-api-operator-images" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Dec 03 14:15:37 crc kubenswrapper[4751]: E1203 14:15:37.673367 4751 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-images\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"machine-api-operator-images\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.673797 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 14:15:37 crc kubenswrapper[4751]: W1203 14:15:37.673798 4751 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-router-certs": failed to list *v1.Secret: secrets "v4-0-config-system-router-certs" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and 
this object Dec 03 14:15:37 crc kubenswrapper[4751]: W1203 14:15:37.673905 4751 reflector.go:561] object-"openshift-machine-api"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Dec 03 14:15:37 crc kubenswrapper[4751]: E1203 14:15:37.673953 4751 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-router-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-system-router-certs\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 14:15:37 crc kubenswrapper[4751]: E1203 14:15:37.673971 4751 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 14:15:37 crc kubenswrapper[4751]: W1203 14:15:37.673812 4751 reflector.go:561] object-"openshift-machine-api"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Dec 03 14:15:37 crc kubenswrapper[4751]: W1203 14:15:37.673997 4751 reflector.go:561] object-"openshift-authentication"/"audit": failed to list *v1.ConfigMap: configmaps "audit" is forbidden: User "system:node:crc" cannot list 
resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Dec 03 14:15:37 crc kubenswrapper[4751]: E1203 14:15:37.674008 4751 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 14:15:37 crc kubenswrapper[4751]: W1203 14:15:37.673848 4751 reflector.go:561] object-"openshift-machine-api"/"kube-rbac-proxy": failed to list *v1.ConfigMap: configmaps "kube-rbac-proxy" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Dec 03 14:15:37 crc kubenswrapper[4751]: E1203 14:15:37.674036 4751 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"kube-rbac-proxy\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-rbac-proxy\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 14:15:37 crc kubenswrapper[4751]: E1203 14:15:37.674018 4751 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"audit\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"audit\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 
14:15:37.676615 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pqqnv"] Dec 03 14:15:37 crc kubenswrapper[4751]: W1203 14:15:37.674024 4751 reflector.go:561] object-"openshift-authentication"/"v4-0-config-user-template-provider-selection": failed to list *v1.Secret: secrets "v4-0-config-user-template-provider-selection" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Dec 03 14:15:37 crc kubenswrapper[4751]: E1203 14:15:37.676785 4751 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-user-template-provider-selection\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-user-template-provider-selection\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.674086 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.677003 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b96kv"] Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.677207 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pqqnv" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.677268 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-6zks7"] Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.677404 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b96kv" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.677831 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.677840 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-6zks7" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.686906 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx"] Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.687817 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.689572 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.689672 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.689874 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.689966 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.690028 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.689941 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.690178 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.690228 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.690345 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.690899 4751 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.690907 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.691056 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.692393 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.694216 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.694227 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.694444 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-djkdm"] Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.703708 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.703946 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.704105 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.706760 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2b78"] Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.707682 4751 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-djkdm" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.724782 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zzcqh"] Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.725420 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zzcqh" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.725429 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2b78" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.725508 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.726246 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.726697 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.726929 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.727513 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-pnmnf"] Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.727542 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.728394 4751 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-czhpn"] Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.728527 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-pnmnf" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.728651 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.728724 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-mc2fd"] Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.729031 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.729084 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-czhpn" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.729144 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-mc2fd" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.729586 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-wshkr"] Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.730375 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wshkr" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.731203 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.731963 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.732617 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.732997 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.733223 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.733307 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.733465 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.733538 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.733692 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.733516 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 03 14:15:37 crc 
kubenswrapper[4751]: I1203 14:15:37.734360 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.733740 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.733903 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.734867 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.734943 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.735062 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.735163 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.735278 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.735456 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.735691 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.735645 4751 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.737715 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.738171 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.738391 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.738664 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.740545 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.741075 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.741229 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.741377 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.741523 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.741807 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 03 14:15:37 
crc kubenswrapper[4751]: I1203 14:15:37.741977 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.742101 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.744567 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pwknx"] Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.744927 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.746655 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-g4jcg"] Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.747113 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-c428g"] Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.747213 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.747542 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-c428g" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.747823 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g4jcg" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.748104 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fwp52"] Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.750351 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zcq45"] Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.750378 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b96kv"] Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.751412 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx"] Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.753615 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.753821 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.753893 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.753841 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.754064 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.754098 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 03 14:15:37 crc kubenswrapper[4751]: 
I1203 14:15:37.753863 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.754213 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.754386 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.754428 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.754488 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.754554 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.754742 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.754782 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.754887 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.759144 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pqqnv"] Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.765224 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-djkdm"] Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.771296 4751 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.772621 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-pnmnf"] Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.773893 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zzcqh"] Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.785838 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-c428g"] Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.786476 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.788287 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-6zks7"] Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.790429 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-mc2fd"] Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.792740 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-2n4v9"] Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.794394 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2b78"] Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.795683 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-g4jcg"] Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.796932 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-czhpn"] Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.798584 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pwknx"] Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.805033 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.824901 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ca258dc-b8d3-417d-9a07-9d538a778e41-config\") pod \"openshift-apiserver-operator-796bbdcf4f-q2b78\" (UID: \"9ca258dc-b8d3-417d-9a07-9d538a778e41\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2b78" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.824946 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7821f968-1e72-4450-9a62-9f60d11121c3-serving-cert\") pod \"authentication-operator-69f744f599-6zks7\" (UID: \"7821f968-1e72-4450-9a62-9f60d11121c3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6zks7" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.824976 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb5bca62-c4b9-4e79-a752-96185f22b757-serving-cert\") pod \"controller-manager-879f6c89f-fwp52\" (UID: \"eb5bca62-c4b9-4e79-a752-96185f22b757\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fwp52" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.824997 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr7rd\" (UniqueName: 
\"kubernetes.io/projected/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-kube-api-access-kr7rd\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.825024 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a112208b-e069-48e0-8bc4-d6c4e79052fc-console-config\") pod \"console-f9d7485db-djkdm\" (UID: \"a112208b-e069-48e0-8bc4-d6c4e79052fc\") " pod="openshift-console/console-f9d7485db-djkdm" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.825046 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5lsn\" (UniqueName: \"kubernetes.io/projected/3e9f6fd3-3c79-48a8-a551-128f73f63dd7-kube-api-access-c5lsn\") pod \"route-controller-manager-6576b87f9c-pqqnv\" (UID: \"3e9f6fd3-3c79-48a8-a551-128f73f63dd7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pqqnv" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.825071 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0ced64e5-348e-4211-bbeb-9853697b75e3-audit-dir\") pod \"apiserver-7bbb656c7d-vq9xx\" (UID: \"0ced64e5-348e-4211-bbeb-9853697b75e3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.825090 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hcbf\" (UniqueName: \"kubernetes.io/projected/eb5bca62-c4b9-4e79-a752-96185f22b757-kube-api-access-9hcbf\") pod \"controller-manager-879f6c89f-fwp52\" (UID: \"eb5bca62-c4b9-4e79-a752-96185f22b757\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fwp52" Dec 03 14:15:37 
crc kubenswrapper[4751]: I1203 14:15:37.825110 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0ced64e5-348e-4211-bbeb-9853697b75e3-etcd-client\") pod \"apiserver-7bbb656c7d-vq9xx\" (UID: \"0ced64e5-348e-4211-bbeb-9853697b75e3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.825136 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-audit-policies\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.825167 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.825196 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e9f6fd3-3c79-48a8-a551-128f73f63dd7-config\") pod \"route-controller-manager-6576b87f9c-pqqnv\" (UID: \"3e9f6fd3-3c79-48a8-a551-128f73f63dd7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pqqnv" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.825282 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/7821f968-1e72-4450-9a62-9f60d11121c3-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-6zks7\" (UID: \"7821f968-1e72-4450-9a62-9f60d11121c3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6zks7" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.825394 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.825450 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh45m\" (UniqueName: \"kubernetes.io/projected/a112208b-e069-48e0-8bc4-d6c4e79052fc-kube-api-access-nh45m\") pod \"console-f9d7485db-djkdm\" (UID: \"a112208b-e069-48e0-8bc4-d6c4e79052fc\") " pod="openshift-console/console-f9d7485db-djkdm" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.825542 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.825586 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0ced64e5-348e-4211-bbeb-9853697b75e3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-vq9xx\" (UID: \"0ced64e5-348e-4211-bbeb-9853697b75e3\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.825607 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e9f6fd3-3c79-48a8-a551-128f73f63dd7-serving-cert\") pod \"route-controller-manager-6576b87f9c-pqqnv\" (UID: \"3e9f6fd3-3c79-48a8-a551-128f73f63dd7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pqqnv" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.825637 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb5bca62-c4b9-4e79-a752-96185f22b757-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fwp52\" (UID: \"eb5bca62-c4b9-4e79-a752-96185f22b757\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fwp52" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.825662 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a112208b-e069-48e0-8bc4-d6c4e79052fc-service-ca\") pod \"console-f9d7485db-djkdm\" (UID: \"a112208b-e069-48e0-8bc4-d6c4e79052fc\") " pod="openshift-console/console-f9d7485db-djkdm" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.825701 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnl74\" (UniqueName: \"kubernetes.io/projected/ab66f440-24ed-4244-a972-63eee27b67b1-kube-api-access-fnl74\") pod \"machine-api-operator-5694c8668f-2n4v9\" (UID: \"ab66f440-24ed-4244-a972-63eee27b67b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2n4v9" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.825740 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-audit-dir\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.825762 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ca258dc-b8d3-417d-9a07-9d538a778e41-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-q2b78\" (UID: \"9ca258dc-b8d3-417d-9a07-9d538a778e41\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2b78" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.825833 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ab66f440-24ed-4244-a972-63eee27b67b1-images\") pod \"machine-api-operator-5694c8668f-2n4v9\" (UID: \"ab66f440-24ed-4244-a972-63eee27b67b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2n4v9" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.825877 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0ced64e5-348e-4211-bbeb-9853697b75e3-encryption-config\") pod \"apiserver-7bbb656c7d-vq9xx\" (UID: \"0ced64e5-348e-4211-bbeb-9853697b75e3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.825929 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a112208b-e069-48e0-8bc4-d6c4e79052fc-oauth-serving-cert\") pod \"console-f9d7485db-djkdm\" (UID: \"a112208b-e069-48e0-8bc4-d6c4e79052fc\") " pod="openshift-console/console-f9d7485db-djkdm" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 
14:15:37.825960 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.825985 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ced64e5-348e-4211-bbeb-9853697b75e3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vq9xx\" (UID: \"0ced64e5-348e-4211-bbeb-9853697b75e3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.825994 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.826018 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.826079 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab66f440-24ed-4244-a972-63eee27b67b1-config\") pod \"machine-api-operator-5694c8668f-2n4v9\" (UID: \"ab66f440-24ed-4244-a972-63eee27b67b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2n4v9" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.826101 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ab66f440-24ed-4244-a972-63eee27b67b1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-2n4v9\" (UID: \"ab66f440-24ed-4244-a972-63eee27b67b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2n4v9" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.826124 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84bqs\" (UniqueName: \"kubernetes.io/projected/2e3bbe41-d57d-447e-8fe2-b7dab3af07f5-kube-api-access-84bqs\") pod \"cluster-image-registry-operator-dc59b4c8b-b96kv\" (UID: \"2e3bbe41-d57d-447e-8fe2-b7dab3af07f5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b96kv" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.826160 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0ced64e5-348e-4211-bbeb-9853697b75e3-audit-policies\") pod \"apiserver-7bbb656c7d-vq9xx\" (UID: \"0ced64e5-348e-4211-bbeb-9853697b75e3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.826177 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3e9f6fd3-3c79-48a8-a551-128f73f63dd7-client-ca\") pod \"route-controller-manager-6576b87f9c-pqqnv\" (UID: \"3e9f6fd3-3c79-48a8-a551-128f73f63dd7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pqqnv" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.826199 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/2e3bbe41-d57d-447e-8fe2-b7dab3af07f5-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-b96kv\" (UID: \"2e3bbe41-d57d-447e-8fe2-b7dab3af07f5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b96kv" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.826226 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.826263 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.826319 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r9vx\" (UniqueName: \"kubernetes.io/projected/0ced64e5-348e-4211-bbeb-9853697b75e3-kube-api-access-6r9vx\") pod \"apiserver-7bbb656c7d-vq9xx\" (UID: \"0ced64e5-348e-4211-bbeb-9853697b75e3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.826403 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a112208b-e069-48e0-8bc4-d6c4e79052fc-trusted-ca-bundle\") pod \"console-f9d7485db-djkdm\" (UID: 
\"a112208b-e069-48e0-8bc4-d6c4e79052fc\") " pod="openshift-console/console-f9d7485db-djkdm" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.826470 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2qk4\" (UniqueName: \"kubernetes.io/projected/9ca258dc-b8d3-417d-9a07-9d538a778e41-kube-api-access-n2qk4\") pod \"openshift-apiserver-operator-796bbdcf4f-q2b78\" (UID: \"9ca258dc-b8d3-417d-9a07-9d538a778e41\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2b78" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.826503 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2e3bbe41-d57d-447e-8fe2-b7dab3af07f5-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-b96kv\" (UID: \"2e3bbe41-d57d-447e-8fe2-b7dab3af07f5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b96kv" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.826524 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a112208b-e069-48e0-8bc4-d6c4e79052fc-console-serving-cert\") pod \"console-f9d7485db-djkdm\" (UID: \"a112208b-e069-48e0-8bc4-d6c4e79052fc\") " pod="openshift-console/console-f9d7485db-djkdm" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.826544 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.826568 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7821f968-1e72-4450-9a62-9f60d11121c3-config\") pod \"authentication-operator-69f744f599-6zks7\" (UID: \"7821f968-1e72-4450-9a62-9f60d11121c3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6zks7" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.826629 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb5bca62-c4b9-4e79-a752-96185f22b757-config\") pod \"controller-manager-879f6c89f-fwp52\" (UID: \"eb5bca62-c4b9-4e79-a752-96185f22b757\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fwp52" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.826664 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.826709 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.826735 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.826777 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ced64e5-348e-4211-bbeb-9853697b75e3-serving-cert\") pod \"apiserver-7bbb656c7d-vq9xx\" (UID: \"0ced64e5-348e-4211-bbeb-9853697b75e3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.826802 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3af90155-5ff4-4079-896c-c6a03fcaa809-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zzcqh\" (UID: \"3af90155-5ff4-4079-896c-c6a03fcaa809\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zzcqh" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.826852 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb5bca62-c4b9-4e79-a752-96185f22b757-client-ca\") pod \"controller-manager-879f6c89f-fwp52\" (UID: \"eb5bca62-c4b9-4e79-a752-96185f22b757\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fwp52" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.826876 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a112208b-e069-48e0-8bc4-d6c4e79052fc-console-oauth-config\") pod \"console-f9d7485db-djkdm\" (UID: \"a112208b-e069-48e0-8bc4-d6c4e79052fc\") " pod="openshift-console/console-f9d7485db-djkdm" Dec 03 14:15:37 crc 
kubenswrapper[4751]: I1203 14:15:37.826909 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlqbj\" (UniqueName: \"kubernetes.io/projected/7821f968-1e72-4450-9a62-9f60d11121c3-kube-api-access-rlqbj\") pod \"authentication-operator-69f744f599-6zks7\" (UID: \"7821f968-1e72-4450-9a62-9f60d11121c3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6zks7" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.826935 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zchkh\" (UniqueName: \"kubernetes.io/projected/3af90155-5ff4-4079-896c-c6a03fcaa809-kube-api-access-zchkh\") pod \"cluster-samples-operator-665b6dd947-zzcqh\" (UID: \"3af90155-5ff4-4079-896c-c6a03fcaa809\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zzcqh" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.826969 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7821f968-1e72-4450-9a62-9f60d11121c3-service-ca-bundle\") pod \"authentication-operator-69f744f599-6zks7\" (UID: \"7821f968-1e72-4450-9a62-9f60d11121c3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6zks7" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.826995 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2e3bbe41-d57d-447e-8fe2-b7dab3af07f5-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-b96kv\" (UID: \"2e3bbe41-d57d-447e-8fe2-b7dab3af07f5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b96kv" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.851078 4751 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"trusted-ca" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.864886 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.928471 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0ced64e5-348e-4211-bbeb-9853697b75e3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-vq9xx\" (UID: \"0ced64e5-348e-4211-bbeb-9853697b75e3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.928510 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e9f6fd3-3c79-48a8-a551-128f73f63dd7-serving-cert\") pod \"route-controller-manager-6576b87f9c-pqqnv\" (UID: \"3e9f6fd3-3c79-48a8-a551-128f73f63dd7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pqqnv" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.928533 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1ba52da9-74a9-4dbc-aad3-649ecf8f67be-machine-approver-tls\") pod \"machine-approver-56656f9798-wshkr\" (UID: \"1ba52da9-74a9-4dbc-aad3-649ecf8f67be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wshkr" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.928552 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clsv7\" (UniqueName: \"kubernetes.io/projected/bea4ae1a-c496-47de-9d12-1d6d42793bd2-kube-api-access-clsv7\") pod \"openshift-config-operator-7777fb866f-g4jcg\" (UID: \"bea4ae1a-c496-47de-9d12-1d6d42793bd2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g4jcg" Dec 03 
14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.928571 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb5bca62-c4b9-4e79-a752-96185f22b757-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fwp52\" (UID: \"eb5bca62-c4b9-4e79-a752-96185f22b757\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fwp52" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.928590 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a112208b-e069-48e0-8bc4-d6c4e79052fc-service-ca\") pod \"console-f9d7485db-djkdm\" (UID: \"a112208b-e069-48e0-8bc4-d6c4e79052fc\") " pod="openshift-console/console-f9d7485db-djkdm" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.928610 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnl74\" (UniqueName: \"kubernetes.io/projected/ab66f440-24ed-4244-a972-63eee27b67b1-kube-api-access-fnl74\") pod \"machine-api-operator-5694c8668f-2n4v9\" (UID: \"ab66f440-24ed-4244-a972-63eee27b67b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2n4v9" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.928626 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-audit-dir\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.928644 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ca258dc-b8d3-417d-9a07-9d538a778e41-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-q2b78\" (UID: \"9ca258dc-b8d3-417d-9a07-9d538a778e41\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2b78" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.928668 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0ced64e5-348e-4211-bbeb-9853697b75e3-encryption-config\") pod \"apiserver-7bbb656c7d-vq9xx\" (UID: \"0ced64e5-348e-4211-bbeb-9853697b75e3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.928683 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bea4ae1a-c496-47de-9d12-1d6d42793bd2-serving-cert\") pod \"openshift-config-operator-7777fb866f-g4jcg\" (UID: \"bea4ae1a-c496-47de-9d12-1d6d42793bd2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g4jcg" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.928701 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ab66f440-24ed-4244-a972-63eee27b67b1-images\") pod \"machine-api-operator-5694c8668f-2n4v9\" (UID: \"ab66f440-24ed-4244-a972-63eee27b67b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2n4v9" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.928726 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4784p\" (UniqueName: \"kubernetes.io/projected/90549328-bfce-4bd2-b1cc-651a2c9cd2e1-kube-api-access-4784p\") pod \"apiserver-76f77b778f-pwknx\" (UID: \"90549328-bfce-4bd2-b1cc-651a2c9cd2e1\") " pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.928743 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/18505c2b-ef26-4f6e-8471-21a77aab056b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-czhpn\" (UID: \"18505c2b-ef26-4f6e-8471-21a77aab056b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-czhpn" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.928761 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90549328-bfce-4bd2-b1cc-651a2c9cd2e1-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pwknx\" (UID: \"90549328-bfce-4bd2-b1cc-651a2c9cd2e1\") " pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.928781 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.928809 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-audit-dir\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.928974 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ced64e5-348e-4211-bbeb-9853697b75e3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vq9xx\" (UID: \"0ced64e5-348e-4211-bbeb-9853697b75e3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 
14:15:37.929105 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a112208b-e069-48e0-8bc4-d6c4e79052fc-oauth-serving-cert\") pod \"console-f9d7485db-djkdm\" (UID: \"a112208b-e069-48e0-8bc4-d6c4e79052fc\") " pod="openshift-console/console-f9d7485db-djkdm" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.929398 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03d2bc09-2a61-4820-a697-7d2c5242acdb-serving-cert\") pod \"console-operator-58897d9998-c428g\" (UID: \"03d2bc09-2a61-4820-a697-7d2c5242acdb\") " pod="openshift-console-operator/console-operator-58897d9998-c428g" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.929449 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.929573 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90549328-bfce-4bd2-b1cc-651a2c9cd2e1-serving-cert\") pod \"apiserver-76f77b778f-pwknx\" (UID: \"90549328-bfce-4bd2-b1cc-651a2c9cd2e1\") " pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.929622 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ab66f440-24ed-4244-a972-63eee27b67b1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-2n4v9\" (UID: \"ab66f440-24ed-4244-a972-63eee27b67b1\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-2n4v9" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.929642 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84bqs\" (UniqueName: \"kubernetes.io/projected/2e3bbe41-d57d-447e-8fe2-b7dab3af07f5-kube-api-access-84bqs\") pod \"cluster-image-registry-operator-dc59b4c8b-b96kv\" (UID: \"2e3bbe41-d57d-447e-8fe2-b7dab3af07f5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b96kv" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.929879 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab66f440-24ed-4244-a972-63eee27b67b1-config\") pod \"machine-api-operator-5694c8668f-2n4v9\" (UID: \"ab66f440-24ed-4244-a972-63eee27b67b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2n4v9" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.929907 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3e9f6fd3-3c79-48a8-a551-128f73f63dd7-client-ca\") pod \"route-controller-manager-6576b87f9c-pqqnv\" (UID: \"3e9f6fd3-3c79-48a8-a551-128f73f63dd7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pqqnv" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.930034 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e3bbe41-d57d-447e-8fe2-b7dab3af07f5-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-b96kv\" (UID: \"2e3bbe41-d57d-447e-8fe2-b7dab3af07f5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b96kv" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.930118 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bea4ae1a-c496-47de-9d12-1d6d42793bd2-available-featuregates\") pod \"openshift-config-operator-7777fb866f-g4jcg\" (UID: \"bea4ae1a-c496-47de-9d12-1d6d42793bd2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g4jcg" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.930405 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ced64e5-348e-4211-bbeb-9853697b75e3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vq9xx\" (UID: \"0ced64e5-348e-4211-bbeb-9853697b75e3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.930440 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0ced64e5-348e-4211-bbeb-9853697b75e3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-vq9xx\" (UID: \"0ced64e5-348e-4211-bbeb-9853697b75e3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.930510 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0ced64e5-348e-4211-bbeb-9853697b75e3-audit-policies\") pod \"apiserver-7bbb656c7d-vq9xx\" (UID: \"0ced64e5-348e-4211-bbeb-9853697b75e3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.930646 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/90549328-bfce-4bd2-b1cc-651a2c9cd2e1-node-pullsecrets\") pod \"apiserver-76f77b778f-pwknx\" (UID: \"90549328-bfce-4bd2-b1cc-651a2c9cd2e1\") " pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.930758 4751 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.930789 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.930722 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3e9f6fd3-3c79-48a8-a551-128f73f63dd7-client-ca\") pod \"route-controller-manager-6576b87f9c-pqqnv\" (UID: \"3e9f6fd3-3c79-48a8-a551-128f73f63dd7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pqqnv" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.930825 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r9vx\" (UniqueName: \"kubernetes.io/projected/0ced64e5-348e-4211-bbeb-9853697b75e3-kube-api-access-6r9vx\") pod \"apiserver-7bbb656c7d-vq9xx\" (UID: \"0ced64e5-348e-4211-bbeb-9853697b75e3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.930849 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a112208b-e069-48e0-8bc4-d6c4e79052fc-trusted-ca-bundle\") pod \"console-f9d7485db-djkdm\" (UID: 
\"a112208b-e069-48e0-8bc4-d6c4e79052fc\") " pod="openshift-console/console-f9d7485db-djkdm" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.930872 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2e3bbe41-d57d-447e-8fe2-b7dab3af07f5-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-b96kv\" (UID: \"2e3bbe41-d57d-447e-8fe2-b7dab3af07f5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b96kv" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.930890 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a112208b-e069-48e0-8bc4-d6c4e79052fc-console-serving-cert\") pod \"console-f9d7485db-djkdm\" (UID: \"a112208b-e069-48e0-8bc4-d6c4e79052fc\") " pod="openshift-console/console-f9d7485db-djkdm" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.930909 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2qk4\" (UniqueName: \"kubernetes.io/projected/9ca258dc-b8d3-417d-9a07-9d538a778e41-kube-api-access-n2qk4\") pod \"openshift-apiserver-operator-796bbdcf4f-q2b78\" (UID: \"9ca258dc-b8d3-417d-9a07-9d538a778e41\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2b78" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.930926 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7821f968-1e72-4450-9a62-9f60d11121c3-config\") pod \"authentication-operator-69f744f599-6zks7\" (UID: \"7821f968-1e72-4450-9a62-9f60d11121c3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6zks7" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.930942 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/eb5bca62-c4b9-4e79-a752-96185f22b757-config\") pod \"controller-manager-879f6c89f-fwp52\" (UID: \"eb5bca62-c4b9-4e79-a752-96185f22b757\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fwp52" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.930960 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.930981 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.930999 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ba52da9-74a9-4dbc-aad3-649ecf8f67be-config\") pod \"machine-approver-56656f9798-wshkr\" (UID: \"1ba52da9-74a9-4dbc-aad3-649ecf8f67be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wshkr" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931025 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931052 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931074 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/73222926-1acd-41d2-8b69-79ed24aaf6d5-metrics-tls\") pod \"dns-operator-744455d44c-pnmnf\" (UID: \"73222926-1acd-41d2-8b69-79ed24aaf6d5\") " pod="openshift-dns-operator/dns-operator-744455d44c-pnmnf" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931092 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/90549328-bfce-4bd2-b1cc-651a2c9cd2e1-image-import-ca\") pod \"apiserver-76f77b778f-pwknx\" (UID: \"90549328-bfce-4bd2-b1cc-651a2c9cd2e1\") " pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931114 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ced64e5-348e-4211-bbeb-9853697b75e3-serving-cert\") pod \"apiserver-7bbb656c7d-vq9xx\" (UID: \"0ced64e5-348e-4211-bbeb-9853697b75e3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931139 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3af90155-5ff4-4079-896c-c6a03fcaa809-samples-operator-tls\") pod 
\"cluster-samples-operator-665b6dd947-zzcqh\" (UID: \"3af90155-5ff4-4079-896c-c6a03fcaa809\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zzcqh" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931158 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03d2bc09-2a61-4820-a697-7d2c5242acdb-config\") pod \"console-operator-58897d9998-c428g\" (UID: \"03d2bc09-2a61-4820-a697-7d2c5242acdb\") " pod="openshift-console-operator/console-operator-58897d9998-c428g" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931188 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb5bca62-c4b9-4e79-a752-96185f22b757-client-ca\") pod \"controller-manager-879f6c89f-fwp52\" (UID: \"eb5bca62-c4b9-4e79-a752-96185f22b757\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fwp52" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931209 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a112208b-e069-48e0-8bc4-d6c4e79052fc-console-oauth-config\") pod \"console-f9d7485db-djkdm\" (UID: \"a112208b-e069-48e0-8bc4-d6c4e79052fc\") " pod="openshift-console/console-f9d7485db-djkdm" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931230 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlqbj\" (UniqueName: \"kubernetes.io/projected/7821f968-1e72-4450-9a62-9f60d11121c3-kube-api-access-rlqbj\") pod \"authentication-operator-69f744f599-6zks7\" (UID: \"7821f968-1e72-4450-9a62-9f60d11121c3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6zks7" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931251 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zchkh\" (UniqueName: \"kubernetes.io/projected/3af90155-5ff4-4079-896c-c6a03fcaa809-kube-api-access-zchkh\") pod \"cluster-samples-operator-665b6dd947-zzcqh\" (UID: \"3af90155-5ff4-4079-896c-c6a03fcaa809\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zzcqh" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931273 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/90549328-bfce-4bd2-b1cc-651a2c9cd2e1-encryption-config\") pod \"apiserver-76f77b778f-pwknx\" (UID: \"90549328-bfce-4bd2-b1cc-651a2c9cd2e1\") " pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931293 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2e3bbe41-d57d-447e-8fe2-b7dab3af07f5-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-b96kv\" (UID: \"2e3bbe41-d57d-447e-8fe2-b7dab3af07f5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b96kv" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931308 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/03d2bc09-2a61-4820-a697-7d2c5242acdb-trusted-ca\") pod \"console-operator-58897d9998-c428g\" (UID: \"03d2bc09-2a61-4820-a697-7d2c5242acdb\") " pod="openshift-console-operator/console-operator-58897d9998-c428g" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931311 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0ced64e5-348e-4211-bbeb-9853697b75e3-audit-policies\") pod \"apiserver-7bbb656c7d-vq9xx\" (UID: \"0ced64e5-348e-4211-bbeb-9853697b75e3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" Dec 03 
14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931339 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc4kb\" (UniqueName: \"kubernetes.io/projected/18505c2b-ef26-4f6e-8471-21a77aab056b-kube-api-access-dc4kb\") pod \"openshift-controller-manager-operator-756b6f6bc6-czhpn\" (UID: \"18505c2b-ef26-4f6e-8471-21a77aab056b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-czhpn" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931399 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7821f968-1e72-4450-9a62-9f60d11121c3-service-ca-bundle\") pod \"authentication-operator-69f744f599-6zks7\" (UID: \"7821f968-1e72-4450-9a62-9f60d11121c3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6zks7" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931427 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ca258dc-b8d3-417d-9a07-9d538a778e41-config\") pod \"openshift-apiserver-operator-796bbdcf4f-q2b78\" (UID: \"9ca258dc-b8d3-417d-9a07-9d538a778e41\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2b78" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931455 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1ba52da9-74a9-4dbc-aad3-649ecf8f67be-auth-proxy-config\") pod \"machine-approver-56656f9798-wshkr\" (UID: \"1ba52da9-74a9-4dbc-aad3-649ecf8f67be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wshkr" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931484 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7821f968-1e72-4450-9a62-9f60d11121c3-serving-cert\") pod \"authentication-operator-69f744f599-6zks7\" (UID: \"7821f968-1e72-4450-9a62-9f60d11121c3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6zks7" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931470 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb5bca62-c4b9-4e79-a752-96185f22b757-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fwp52\" (UID: \"eb5bca62-c4b9-4e79-a752-96185f22b757\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fwp52" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931506 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/90549328-bfce-4bd2-b1cc-651a2c9cd2e1-audit\") pod \"apiserver-76f77b778f-pwknx\" (UID: \"90549328-bfce-4bd2-b1cc-651a2c9cd2e1\") " pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931530 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cznx\" (UniqueName: \"kubernetes.io/projected/73222926-1acd-41d2-8b69-79ed24aaf6d5-kube-api-access-6cznx\") pod \"dns-operator-744455d44c-pnmnf\" (UID: \"73222926-1acd-41d2-8b69-79ed24aaf6d5\") " pod="openshift-dns-operator/dns-operator-744455d44c-pnmnf" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931580 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/90549328-bfce-4bd2-b1cc-651a2c9cd2e1-etcd-client\") pod \"apiserver-76f77b778f-pwknx\" (UID: \"90549328-bfce-4bd2-b1cc-651a2c9cd2e1\") " pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931605 4751 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18505c2b-ef26-4f6e-8471-21a77aab056b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-czhpn\" (UID: \"18505c2b-ef26-4f6e-8471-21a77aab056b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-czhpn" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931631 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb5bca62-c4b9-4e79-a752-96185f22b757-serving-cert\") pod \"controller-manager-879f6c89f-fwp52\" (UID: \"eb5bca62-c4b9-4e79-a752-96185f22b757\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fwp52" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931665 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a112208b-e069-48e0-8bc4-d6c4e79052fc-oauth-serving-cert\") pod \"console-f9d7485db-djkdm\" (UID: \"a112208b-e069-48e0-8bc4-d6c4e79052fc\") " pod="openshift-console/console-f9d7485db-djkdm" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931663 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr7rd\" (UniqueName: \"kubernetes.io/projected/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-kube-api-access-kr7rd\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931744 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a112208b-e069-48e0-8bc4-d6c4e79052fc-console-config\") pod \"console-f9d7485db-djkdm\" (UID: \"a112208b-e069-48e0-8bc4-d6c4e79052fc\") " 
pod="openshift-console/console-f9d7485db-djkdm" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931783 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0ced64e5-348e-4211-bbeb-9853697b75e3-audit-dir\") pod \"apiserver-7bbb656c7d-vq9xx\" (UID: \"0ced64e5-348e-4211-bbeb-9853697b75e3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931820 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5lsn\" (UniqueName: \"kubernetes.io/projected/3e9f6fd3-3c79-48a8-a551-128f73f63dd7-kube-api-access-c5lsn\") pod \"route-controller-manager-6576b87f9c-pqqnv\" (UID: \"3e9f6fd3-3c79-48a8-a551-128f73f63dd7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pqqnv" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931861 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hcbf\" (UniqueName: \"kubernetes.io/projected/eb5bca62-c4b9-4e79-a752-96185f22b757-kube-api-access-9hcbf\") pod \"controller-manager-879f6c89f-fwp52\" (UID: \"eb5bca62-c4b9-4e79-a752-96185f22b757\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fwp52" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931893 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0ced64e5-348e-4211-bbeb-9853697b75e3-etcd-client\") pod \"apiserver-7bbb656c7d-vq9xx\" (UID: \"0ced64e5-348e-4211-bbeb-9853697b75e3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931932 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk9ht\" (UniqueName: 
\"kubernetes.io/projected/03d2bc09-2a61-4820-a697-7d2c5242acdb-kube-api-access-wk9ht\") pod \"console-operator-58897d9998-c428g\" (UID: \"03d2bc09-2a61-4820-a697-7d2c5242acdb\") " pod="openshift-console-operator/console-operator-58897d9998-c428g" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931966 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm24g\" (UniqueName: \"kubernetes.io/projected/a9f280b6-c725-4857-a658-6f3073b30fdf-kube-api-access-nm24g\") pod \"downloads-7954f5f757-mc2fd\" (UID: \"a9f280b6-c725-4857-a658-6f3073b30fdf\") " pod="openshift-console/downloads-7954f5f757-mc2fd" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.931999 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90549328-bfce-4bd2-b1cc-651a2c9cd2e1-config\") pod \"apiserver-76f77b778f-pwknx\" (UID: \"90549328-bfce-4bd2-b1cc-651a2c9cd2e1\") " pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.932034 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-audit-policies\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.932067 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.932101 4751 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e9f6fd3-3c79-48a8-a551-128f73f63dd7-config\") pod \"route-controller-manager-6576b87f9c-pqqnv\" (UID: \"3e9f6fd3-3c79-48a8-a551-128f73f63dd7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pqqnv" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.932133 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6wql\" (UniqueName: \"kubernetes.io/projected/1ba52da9-74a9-4dbc-aad3-649ecf8f67be-kube-api-access-v6wql\") pod \"machine-approver-56656f9798-wshkr\" (UID: \"1ba52da9-74a9-4dbc-aad3-649ecf8f67be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wshkr" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.932168 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7821f968-1e72-4450-9a62-9f60d11121c3-config\") pod \"authentication-operator-69f744f599-6zks7\" (UID: \"7821f968-1e72-4450-9a62-9f60d11121c3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6zks7" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.932203 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7821f968-1e72-4450-9a62-9f60d11121c3-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-6zks7\" (UID: \"7821f968-1e72-4450-9a62-9f60d11121c3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6zks7" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.932239 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-trusted-ca-bundle\") pod 
\"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.932274 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh45m\" (UniqueName: \"kubernetes.io/projected/a112208b-e069-48e0-8bc4-d6c4e79052fc-kube-api-access-nh45m\") pod \"console-f9d7485db-djkdm\" (UID: \"a112208b-e069-48e0-8bc4-d6c4e79052fc\") " pod="openshift-console/console-f9d7485db-djkdm" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.932308 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/90549328-bfce-4bd2-b1cc-651a2c9cd2e1-etcd-serving-ca\") pod \"apiserver-76f77b778f-pwknx\" (UID: \"90549328-bfce-4bd2-b1cc-651a2c9cd2e1\") " pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.932318 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7821f968-1e72-4450-9a62-9f60d11121c3-service-ca-bundle\") pod \"authentication-operator-69f744f599-6zks7\" (UID: \"7821f968-1e72-4450-9a62-9f60d11121c3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6zks7" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.932376 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/90549328-bfce-4bd2-b1cc-651a2c9cd2e1-audit-dir\") pod \"apiserver-76f77b778f-pwknx\" (UID: \"90549328-bfce-4bd2-b1cc-651a2c9cd2e1\") " pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.932420 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.933012 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ca258dc-b8d3-417d-9a07-9d538a778e41-config\") pod \"openshift-apiserver-operator-796bbdcf4f-q2b78\" (UID: \"9ca258dc-b8d3-417d-9a07-9d538a778e41\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2b78" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.933199 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0ced64e5-348e-4211-bbeb-9853697b75e3-audit-dir\") pod \"apiserver-7bbb656c7d-vq9xx\" (UID: \"0ced64e5-348e-4211-bbeb-9853697b75e3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.934225 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a112208b-e069-48e0-8bc4-d6c4e79052fc-console-config\") pod \"console-f9d7485db-djkdm\" (UID: \"a112208b-e069-48e0-8bc4-d6c4e79052fc\") " pod="openshift-console/console-f9d7485db-djkdm" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.934656 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e9f6fd3-3c79-48a8-a551-128f73f63dd7-serving-cert\") pod \"route-controller-manager-6576b87f9c-pqqnv\" (UID: \"3e9f6fd3-3c79-48a8-a551-128f73f63dd7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pqqnv" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.932380 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/a112208b-e069-48e0-8bc4-d6c4e79052fc-service-ca\") pod \"console-f9d7485db-djkdm\" (UID: \"a112208b-e069-48e0-8bc4-d6c4e79052fc\") " pod="openshift-console/console-f9d7485db-djkdm" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.936421 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7821f968-1e72-4450-9a62-9f60d11121c3-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-6zks7\" (UID: \"7821f968-1e72-4450-9a62-9f60d11121c3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6zks7" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.936429 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2e3bbe41-d57d-447e-8fe2-b7dab3af07f5-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-b96kv\" (UID: \"2e3bbe41-d57d-447e-8fe2-b7dab3af07f5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b96kv" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.936501 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3af90155-5ff4-4079-896c-c6a03fcaa809-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zzcqh\" (UID: \"3af90155-5ff4-4079-896c-c6a03fcaa809\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zzcqh" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.936510 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb5bca62-c4b9-4e79-a752-96185f22b757-client-ca\") pod \"controller-manager-879f6c89f-fwp52\" (UID: \"eb5bca62-c4b9-4e79-a752-96185f22b757\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fwp52" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.936906 4751 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a112208b-e069-48e0-8bc4-d6c4e79052fc-trusted-ca-bundle\") pod \"console-f9d7485db-djkdm\" (UID: \"a112208b-e069-48e0-8bc4-d6c4e79052fc\") " pod="openshift-console/console-f9d7485db-djkdm" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.936997 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e9f6fd3-3c79-48a8-a551-128f73f63dd7-config\") pod \"route-controller-manager-6576b87f9c-pqqnv\" (UID: \"3e9f6fd3-3c79-48a8-a551-128f73f63dd7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pqqnv" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.938035 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7821f968-1e72-4450-9a62-9f60d11121c3-serving-cert\") pod \"authentication-operator-69f744f599-6zks7\" (UID: \"7821f968-1e72-4450-9a62-9f60d11121c3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6zks7" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.938287 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ca258dc-b8d3-417d-9a07-9d538a778e41-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-q2b78\" (UID: \"9ca258dc-b8d3-417d-9a07-9d538a778e41\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2b78" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.938538 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a112208b-e069-48e0-8bc4-d6c4e79052fc-console-oauth-config\") pod \"console-f9d7485db-djkdm\" (UID: \"a112208b-e069-48e0-8bc4-d6c4e79052fc\") " pod="openshift-console/console-f9d7485db-djkdm" Dec 03 14:15:37 crc 
kubenswrapper[4751]: I1203 14:15:37.938823 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb5bca62-c4b9-4e79-a752-96185f22b757-serving-cert\") pod \"controller-manager-879f6c89f-fwp52\" (UID: \"eb5bca62-c4b9-4e79-a752-96185f22b757\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fwp52" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.939185 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb5bca62-c4b9-4e79-a752-96185f22b757-config\") pod \"controller-manager-879f6c89f-fwp52\" (UID: \"eb5bca62-c4b9-4e79-a752-96185f22b757\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fwp52" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.940779 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0ced64e5-348e-4211-bbeb-9853697b75e3-etcd-client\") pod \"apiserver-7bbb656c7d-vq9xx\" (UID: \"0ced64e5-348e-4211-bbeb-9853697b75e3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.941466 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0ced64e5-348e-4211-bbeb-9853697b75e3-encryption-config\") pod \"apiserver-7bbb656c7d-vq9xx\" (UID: \"0ced64e5-348e-4211-bbeb-9853697b75e3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.941971 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a112208b-e069-48e0-8bc4-d6c4e79052fc-console-serving-cert\") pod \"console-f9d7485db-djkdm\" (UID: \"a112208b-e069-48e0-8bc4-d6c4e79052fc\") " pod="openshift-console/console-f9d7485db-djkdm" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 
14:15:37.942241 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e3bbe41-d57d-447e-8fe2-b7dab3af07f5-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-b96kv\" (UID: \"2e3bbe41-d57d-447e-8fe2-b7dab3af07f5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b96kv" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.943796 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ced64e5-348e-4211-bbeb-9853697b75e3-serving-cert\") pod \"apiserver-7bbb656c7d-vq9xx\" (UID: \"0ced64e5-348e-4211-bbeb-9853697b75e3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.947209 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r9vx\" (UniqueName: \"kubernetes.io/projected/0ced64e5-348e-4211-bbeb-9853697b75e3-kube-api-access-6r9vx\") pod \"apiserver-7bbb656c7d-vq9xx\" (UID: \"0ced64e5-348e-4211-bbeb-9853697b75e3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.952690 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84bqs\" (UniqueName: \"kubernetes.io/projected/2e3bbe41-d57d-447e-8fe2-b7dab3af07f5-kube-api-access-84bqs\") pod \"cluster-image-registry-operator-dc59b4c8b-b96kv\" (UID: \"2e3bbe41-d57d-447e-8fe2-b7dab3af07f5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b96kv" Dec 03 14:15:37 crc kubenswrapper[4751]: I1203 14:15:37.960723 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2qk4\" (UniqueName: \"kubernetes.io/projected/9ca258dc-b8d3-417d-9a07-9d538a778e41-kube-api-access-n2qk4\") pod \"openshift-apiserver-operator-796bbdcf4f-q2b78\" (UID: 
\"9ca258dc-b8d3-417d-9a07-9d538a778e41\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2b78" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.004537 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlqbj\" (UniqueName: \"kubernetes.io/projected/7821f968-1e72-4450-9a62-9f60d11121c3-kube-api-access-rlqbj\") pod \"authentication-operator-69f744f599-6zks7\" (UID: \"7821f968-1e72-4450-9a62-9f60d11121c3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6zks7" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.020772 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5lsn\" (UniqueName: \"kubernetes.io/projected/3e9f6fd3-3c79-48a8-a551-128f73f63dd7-kube-api-access-c5lsn\") pod \"route-controller-manager-6576b87f9c-pqqnv\" (UID: \"3e9f6fd3-3c79-48a8-a551-128f73f63dd7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pqqnv" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.034387 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ba52da9-74a9-4dbc-aad3-649ecf8f67be-config\") pod \"machine-approver-56656f9798-wshkr\" (UID: \"1ba52da9-74a9-4dbc-aad3-649ecf8f67be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wshkr" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.034459 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/73222926-1acd-41d2-8b69-79ed24aaf6d5-metrics-tls\") pod \"dns-operator-744455d44c-pnmnf\" (UID: \"73222926-1acd-41d2-8b69-79ed24aaf6d5\") " pod="openshift-dns-operator/dns-operator-744455d44c-pnmnf" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.034486 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/90549328-bfce-4bd2-b1cc-651a2c9cd2e1-image-import-ca\") pod \"apiserver-76f77b778f-pwknx\" (UID: \"90549328-bfce-4bd2-b1cc-651a2c9cd2e1\") " pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.034514 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03d2bc09-2a61-4820-a697-7d2c5242acdb-config\") pod \"console-operator-58897d9998-c428g\" (UID: \"03d2bc09-2a61-4820-a697-7d2c5242acdb\") " pod="openshift-console-operator/console-operator-58897d9998-c428g" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.034561 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/90549328-bfce-4bd2-b1cc-651a2c9cd2e1-encryption-config\") pod \"apiserver-76f77b778f-pwknx\" (UID: \"90549328-bfce-4bd2-b1cc-651a2c9cd2e1\") " pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.034593 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/03d2bc09-2a61-4820-a697-7d2c5242acdb-trusted-ca\") pod \"console-operator-58897d9998-c428g\" (UID: \"03d2bc09-2a61-4820-a697-7d2c5242acdb\") " pod="openshift-console-operator/console-operator-58897d9998-c428g" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.034616 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc4kb\" (UniqueName: \"kubernetes.io/projected/18505c2b-ef26-4f6e-8471-21a77aab056b-kube-api-access-dc4kb\") pod \"openshift-controller-manager-operator-756b6f6bc6-czhpn\" (UID: \"18505c2b-ef26-4f6e-8471-21a77aab056b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-czhpn" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.034640 4751 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1ba52da9-74a9-4dbc-aad3-649ecf8f67be-auth-proxy-config\") pod \"machine-approver-56656f9798-wshkr\" (UID: \"1ba52da9-74a9-4dbc-aad3-649ecf8f67be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wshkr" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.034665 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/90549328-bfce-4bd2-b1cc-651a2c9cd2e1-audit\") pod \"apiserver-76f77b778f-pwknx\" (UID: \"90549328-bfce-4bd2-b1cc-651a2c9cd2e1\") " pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.034694 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cznx\" (UniqueName: \"kubernetes.io/projected/73222926-1acd-41d2-8b69-79ed24aaf6d5-kube-api-access-6cznx\") pod \"dns-operator-744455d44c-pnmnf\" (UID: \"73222926-1acd-41d2-8b69-79ed24aaf6d5\") " pod="openshift-dns-operator/dns-operator-744455d44c-pnmnf" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.034715 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/90549328-bfce-4bd2-b1cc-651a2c9cd2e1-etcd-client\") pod \"apiserver-76f77b778f-pwknx\" (UID: \"90549328-bfce-4bd2-b1cc-651a2c9cd2e1\") " pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.034737 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18505c2b-ef26-4f6e-8471-21a77aab056b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-czhpn\" (UID: \"18505c2b-ef26-4f6e-8471-21a77aab056b\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-czhpn" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.034797 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk9ht\" (UniqueName: \"kubernetes.io/projected/03d2bc09-2a61-4820-a697-7d2c5242acdb-kube-api-access-wk9ht\") pod \"console-operator-58897d9998-c428g\" (UID: \"03d2bc09-2a61-4820-a697-7d2c5242acdb\") " pod="openshift-console-operator/console-operator-58897d9998-c428g" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.034828 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm24g\" (UniqueName: \"kubernetes.io/projected/a9f280b6-c725-4857-a658-6f3073b30fdf-kube-api-access-nm24g\") pod \"downloads-7954f5f757-mc2fd\" (UID: \"a9f280b6-c725-4857-a658-6f3073b30fdf\") " pod="openshift-console/downloads-7954f5f757-mc2fd" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.034853 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90549328-bfce-4bd2-b1cc-651a2c9cd2e1-config\") pod \"apiserver-76f77b778f-pwknx\" (UID: \"90549328-bfce-4bd2-b1cc-651a2c9cd2e1\") " pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.034907 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6wql\" (UniqueName: \"kubernetes.io/projected/1ba52da9-74a9-4dbc-aad3-649ecf8f67be-kube-api-access-v6wql\") pod \"machine-approver-56656f9798-wshkr\" (UID: \"1ba52da9-74a9-4dbc-aad3-649ecf8f67be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wshkr" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.034952 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/90549328-bfce-4bd2-b1cc-651a2c9cd2e1-etcd-serving-ca\") pod \"apiserver-76f77b778f-pwknx\" (UID: \"90549328-bfce-4bd2-b1cc-651a2c9cd2e1\") " pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.034978 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ba52da9-74a9-4dbc-aad3-649ecf8f67be-config\") pod \"machine-approver-56656f9798-wshkr\" (UID: \"1ba52da9-74a9-4dbc-aad3-649ecf8f67be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wshkr" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.034983 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/90549328-bfce-4bd2-b1cc-651a2c9cd2e1-audit-dir\") pod \"apiserver-76f77b778f-pwknx\" (UID: \"90549328-bfce-4bd2-b1cc-651a2c9cd2e1\") " pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.035049 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/90549328-bfce-4bd2-b1cc-651a2c9cd2e1-audit-dir\") pod \"apiserver-76f77b778f-pwknx\" (UID: \"90549328-bfce-4bd2-b1cc-651a2c9cd2e1\") " pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.035051 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1ba52da9-74a9-4dbc-aad3-649ecf8f67be-machine-approver-tls\") pod \"machine-approver-56656f9798-wshkr\" (UID: \"1ba52da9-74a9-4dbc-aad3-649ecf8f67be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wshkr" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.035125 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clsv7\" (UniqueName: 
\"kubernetes.io/projected/bea4ae1a-c496-47de-9d12-1d6d42793bd2-kube-api-access-clsv7\") pod \"openshift-config-operator-7777fb866f-g4jcg\" (UID: \"bea4ae1a-c496-47de-9d12-1d6d42793bd2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g4jcg" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.035218 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bea4ae1a-c496-47de-9d12-1d6d42793bd2-serving-cert\") pod \"openshift-config-operator-7777fb866f-g4jcg\" (UID: \"bea4ae1a-c496-47de-9d12-1d6d42793bd2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g4jcg" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.035266 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4784p\" (UniqueName: \"kubernetes.io/projected/90549328-bfce-4bd2-b1cc-651a2c9cd2e1-kube-api-access-4784p\") pod \"apiserver-76f77b778f-pwknx\" (UID: \"90549328-bfce-4bd2-b1cc-651a2c9cd2e1\") " pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.035298 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90549328-bfce-4bd2-b1cc-651a2c9cd2e1-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pwknx\" (UID: \"90549328-bfce-4bd2-b1cc-651a2c9cd2e1\") " pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.035388 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18505c2b-ef26-4f6e-8471-21a77aab056b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-czhpn\" (UID: \"18505c2b-ef26-4f6e-8471-21a77aab056b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-czhpn" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 
14:15:38.035439 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03d2bc09-2a61-4820-a697-7d2c5242acdb-serving-cert\") pod \"console-operator-58897d9998-c428g\" (UID: \"03d2bc09-2a61-4820-a697-7d2c5242acdb\") " pod="openshift-console-operator/console-operator-58897d9998-c428g" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.035490 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90549328-bfce-4bd2-b1cc-651a2c9cd2e1-serving-cert\") pod \"apiserver-76f77b778f-pwknx\" (UID: \"90549328-bfce-4bd2-b1cc-651a2c9cd2e1\") " pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.035564 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bea4ae1a-c496-47de-9d12-1d6d42793bd2-available-featuregates\") pod \"openshift-config-operator-7777fb866f-g4jcg\" (UID: \"bea4ae1a-c496-47de-9d12-1d6d42793bd2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g4jcg" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.035614 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/90549328-bfce-4bd2-b1cc-651a2c9cd2e1-node-pullsecrets\") pod \"apiserver-76f77b778f-pwknx\" (UID: \"90549328-bfce-4bd2-b1cc-651a2c9cd2e1\") " pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.035727 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/90549328-bfce-4bd2-b1cc-651a2c9cd2e1-node-pullsecrets\") pod \"apiserver-76f77b778f-pwknx\" (UID: \"90549328-bfce-4bd2-b1cc-651a2c9cd2e1\") " pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 
14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.037839 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1ba52da9-74a9-4dbc-aad3-649ecf8f67be-machine-approver-tls\") pod \"machine-approver-56656f9798-wshkr\" (UID: \"1ba52da9-74a9-4dbc-aad3-649ecf8f67be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wshkr" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.037846 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1ba52da9-74a9-4dbc-aad3-649ecf8f67be-auth-proxy-config\") pod \"machine-approver-56656f9798-wshkr\" (UID: \"1ba52da9-74a9-4dbc-aad3-649ecf8f67be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wshkr" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.037952 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/90549328-bfce-4bd2-b1cc-651a2c9cd2e1-etcd-serving-ca\") pod \"apiserver-76f77b778f-pwknx\" (UID: \"90549328-bfce-4bd2-b1cc-651a2c9cd2e1\") " pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.038499 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90549328-bfce-4bd2-b1cc-651a2c9cd2e1-config\") pod \"apiserver-76f77b778f-pwknx\" (UID: \"90549328-bfce-4bd2-b1cc-651a2c9cd2e1\") " pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.038678 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03d2bc09-2a61-4820-a697-7d2c5242acdb-config\") pod \"console-operator-58897d9998-c428g\" (UID: \"03d2bc09-2a61-4820-a697-7d2c5242acdb\") " pod="openshift-console-operator/console-operator-58897d9998-c428g" Dec 03 
14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.038894 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bea4ae1a-c496-47de-9d12-1d6d42793bd2-available-featuregates\") pod \"openshift-config-operator-7777fb866f-g4jcg\" (UID: \"bea4ae1a-c496-47de-9d12-1d6d42793bd2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g4jcg" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.039372 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/03d2bc09-2a61-4820-a697-7d2c5242acdb-trusted-ca\") pod \"console-operator-58897d9998-c428g\" (UID: \"03d2bc09-2a61-4820-a697-7d2c5242acdb\") " pod="openshift-console-operator/console-operator-58897d9998-c428g" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.039308 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/90549328-bfce-4bd2-b1cc-651a2c9cd2e1-image-import-ca\") pod \"apiserver-76f77b778f-pwknx\" (UID: \"90549328-bfce-4bd2-b1cc-651a2c9cd2e1\") " pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.039660 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18505c2b-ef26-4f6e-8471-21a77aab056b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-czhpn\" (UID: \"18505c2b-ef26-4f6e-8471-21a77aab056b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-czhpn" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.040163 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18505c2b-ef26-4f6e-8471-21a77aab056b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-czhpn\" (UID: 
\"18505c2b-ef26-4f6e-8471-21a77aab056b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-czhpn" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.040515 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/90549328-bfce-4bd2-b1cc-651a2c9cd2e1-audit\") pod \"apiserver-76f77b778f-pwknx\" (UID: \"90549328-bfce-4bd2-b1cc-651a2c9cd2e1\") " pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.040631 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03d2bc09-2a61-4820-a697-7d2c5242acdb-serving-cert\") pod \"console-operator-58897d9998-c428g\" (UID: \"03d2bc09-2a61-4820-a697-7d2c5242acdb\") " pod="openshift-console-operator/console-operator-58897d9998-c428g" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.041376 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/90549328-bfce-4bd2-b1cc-651a2c9cd2e1-etcd-client\") pod \"apiserver-76f77b778f-pwknx\" (UID: \"90549328-bfce-4bd2-b1cc-651a2c9cd2e1\") " pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.042192 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/73222926-1acd-41d2-8b69-79ed24aaf6d5-metrics-tls\") pod \"dns-operator-744455d44c-pnmnf\" (UID: \"73222926-1acd-41d2-8b69-79ed24aaf6d5\") " pod="openshift-dns-operator/dns-operator-744455d44c-pnmnf" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.043156 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bea4ae1a-c496-47de-9d12-1d6d42793bd2-serving-cert\") pod \"openshift-config-operator-7777fb866f-g4jcg\" (UID: 
\"bea4ae1a-c496-47de-9d12-1d6d42793bd2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g4jcg" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.043639 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pqqnv" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.045314 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/90549328-bfce-4bd2-b1cc-651a2c9cd2e1-encryption-config\") pod \"apiserver-76f77b778f-pwknx\" (UID: \"90549328-bfce-4bd2-b1cc-651a2c9cd2e1\") " pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.045378 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90549328-bfce-4bd2-b1cc-651a2c9cd2e1-serving-cert\") pod \"apiserver-76f77b778f-pwknx\" (UID: \"90549328-bfce-4bd2-b1cc-651a2c9cd2e1\") " pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.049252 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hcbf\" (UniqueName: \"kubernetes.io/projected/eb5bca62-c4b9-4e79-a752-96185f22b757-kube-api-access-9hcbf\") pod \"controller-manager-879f6c89f-fwp52\" (UID: \"eb5bca62-c4b9-4e79-a752-96185f22b757\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fwp52" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.054751 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-6zks7" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.067121 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zchkh\" (UniqueName: \"kubernetes.io/projected/3af90155-5ff4-4079-896c-c6a03fcaa809-kube-api-access-zchkh\") pod \"cluster-samples-operator-665b6dd947-zzcqh\" (UID: \"3af90155-5ff4-4079-896c-c6a03fcaa809\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zzcqh" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.074592 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.084889 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh45m\" (UniqueName: \"kubernetes.io/projected/a112208b-e069-48e0-8bc4-d6c4e79052fc-kube-api-access-nh45m\") pod \"console-f9d7485db-djkdm\" (UID: \"a112208b-e069-48e0-8bc4-d6c4e79052fc\") " pod="openshift-console/console-f9d7485db-djkdm" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.091988 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zzcqh" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.101690 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2b78" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.101911 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2e3bbe41-d57d-447e-8fe2-b7dab3af07f5-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-b96kv\" (UID: \"2e3bbe41-d57d-447e-8fe2-b7dab3af07f5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b96kv" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.123023 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm24g\" (UniqueName: \"kubernetes.io/projected/a9f280b6-c725-4857-a658-6f3073b30fdf-kube-api-access-nm24g\") pod \"downloads-7954f5f757-mc2fd\" (UID: \"a9f280b6-c725-4857-a658-6f3073b30fdf\") " pod="openshift-console/downloads-7954f5f757-mc2fd" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.124132 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-mc2fd" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.132086 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90549328-bfce-4bd2-b1cc-651a2c9cd2e1-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pwknx\" (UID: \"90549328-bfce-4bd2-b1cc-651a2c9cd2e1\") " pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.148647 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4784p\" (UniqueName: \"kubernetes.io/projected/90549328-bfce-4bd2-b1cc-651a2c9cd2e1-kube-api-access-4784p\") pod \"apiserver-76f77b778f-pwknx\" (UID: \"90549328-bfce-4bd2-b1cc-651a2c9cd2e1\") " pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.149551 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.154353 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-trvft"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.155081 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-trvft" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.167265 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-5kgjd"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.168273 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-5kgjd" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.172131 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-hvd8w"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.172708 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hvd8w" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.193241 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5dmt6"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.193602 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2hknc"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.193823 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4qthd"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.194179 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4qthd" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.194923 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6wql\" (UniqueName: \"kubernetes.io/projected/1ba52da9-74a9-4dbc-aad3-649ecf8f67be-kube-api-access-v6wql\") pod \"machine-approver-56656f9798-wshkr\" (UID: \"1ba52da9-74a9-4dbc-aad3-649ecf8f67be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wshkr" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.195460 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5dmt6" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.195728 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.197850 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-mvlmx"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.198891 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mvlmx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.207280 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pppn7"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.213393 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clsv7\" (UniqueName: \"kubernetes.io/projected/bea4ae1a-c496-47de-9d12-1d6d42793bd2-kube-api-access-clsv7\") pod \"openshift-config-operator-7777fb866f-g4jcg\" (UID: \"bea4ae1a-c496-47de-9d12-1d6d42793bd2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g4jcg" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.215771 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qh4"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.215807 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk9ht\" (UniqueName: \"kubernetes.io/projected/03d2bc09-2a61-4820-a697-7d2c5242acdb-kube-api-access-wk9ht\") pod \"console-operator-58897d9998-c428g\" (UID: \"03d2bc09-2a61-4820-a697-7d2c5242acdb\") " pod="openshift-console-operator/console-operator-58897d9998-c428g" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.216222 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pppn7" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.218402 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qh4" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.220867 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ch7z"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.225150 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ch7z" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.228050 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-75nhw"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.230077 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-75nhw" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.236268 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-qjtlv"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.237663 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-qjtlv" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.248151 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cznx\" (UniqueName: \"kubernetes.io/projected/73222926-1acd-41d2-8b69-79ed24aaf6d5-kube-api-access-6cznx\") pod \"dns-operator-744455d44c-pnmnf\" (UID: \"73222926-1acd-41d2-8b69-79ed24aaf6d5\") " pod="openshift-dns-operator/dns-operator-744455d44c-pnmnf" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.251609 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc4kb\" (UniqueName: \"kubernetes.io/projected/18505c2b-ef26-4f6e-8471-21a77aab056b-kube-api-access-dc4kb\") pod \"openshift-controller-manager-operator-756b6f6bc6-czhpn\" (UID: \"18505c2b-ef26-4f6e-8471-21a77aab056b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-czhpn" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.261149 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6jx7f"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.261633 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-js4nc"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.261899 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-56xrz"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.262234 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-56xrz" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.262826 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6jx7f" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.262968 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-js4nc" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.264542 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hdhpp"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.265357 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hdhpp" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.265575 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.265790 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-xz6ms"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.266349 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xz6ms" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.269509 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4c725"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.269879 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-5kgjd"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.269935 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4c725" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.269961 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-7nphl"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.270540 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7nphl" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.272914 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-z7j5x"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.273433 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412855-vqk8g"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.273764 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-vqk8g" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.273927 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7j5x" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.277078 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5dmt6"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.277121 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2wgh"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.277750 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4ggm5"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.277881 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2wgh" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.278429 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-4ggm5" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.279133 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-trvft"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.281683 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2jgqz"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.282188 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-7clk4"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.282452 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2jgqz" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.282970 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fwp52" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.283008 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-7clk4" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.282971 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2hknc"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.283573 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-js4nc"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.284944 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.286529 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hvd8w"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.288464 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pppn7"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.292697 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qh4"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.293934 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-75nhw"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.294978 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2jgqz"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.296031 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-mvlmx"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.297074 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4ggm5"] Dec 03 14:15:38 crc 
kubenswrapper[4751]: I1203 14:15:38.298165 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7nphl"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.299427 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4c725"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.300867 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6jx7f"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.302044 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412855-vqk8g"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.303399 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ch7z"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.305189 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-xz6ms"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.305948 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.306787 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hdhpp"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.308025 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4qthd"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.309535 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-z7j5x"] Dec 03 14:15:38 crc 
kubenswrapper[4751]: I1203 14:15:38.311257 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2wgh"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.313839 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-7clk4"] Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.326405 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.346754 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.353504 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjqsl\" (UniqueName: \"kubernetes.io/projected/09111b55-9bde-41bf-8f13-51baf359a20c-kube-api-access-zjqsl\") pod \"kube-storage-version-migrator-operator-b67b599dd-6jx7f\" (UID: \"09111b55-9bde-41bf-8f13-51baf359a20c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6jx7f" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.353550 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad26e28c-197b-4671-bd99-109e1c3527d7-config\") pod \"kube-apiserver-operator-766d6c64bb-5dmt6\" (UID: \"ad26e28c-197b-4671-bd99-109e1c3527d7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5dmt6" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.353584 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9xcz\" (UniqueName: \"kubernetes.io/projected/a35c0587-d301-49f5-b7a2-3d7d32efed87-kube-api-access-f9xcz\") pod 
\"machine-config-operator-74547568cd-mvlmx\" (UID: \"a35c0587-d301-49f5-b7a2-3d7d32efed87\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mvlmx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.353615 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad26e28c-197b-4671-bd99-109e1c3527d7-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-5dmt6\" (UID: \"ad26e28c-197b-4671-bd99-109e1c3527d7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5dmt6" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.353648 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a35c0587-d301-49f5-b7a2-3d7d32efed87-proxy-tls\") pod \"machine-config-operator-74547568cd-mvlmx\" (UID: \"a35c0587-d301-49f5-b7a2-3d7d32efed87\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mvlmx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.353683 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a35c0587-d301-49f5-b7a2-3d7d32efed87-auth-proxy-config\") pod \"machine-config-operator-74547568cd-mvlmx\" (UID: \"a35c0587-d301-49f5-b7a2-3d7d32efed87\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mvlmx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.353772 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad26e28c-197b-4671-bd99-109e1c3527d7-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-5dmt6\" (UID: \"ad26e28c-197b-4671-bd99-109e1c3527d7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5dmt6" Dec 03 
14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.353841 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09111b55-9bde-41bf-8f13-51baf359a20c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6jx7f\" (UID: \"09111b55-9bde-41bf-8f13-51baf359a20c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6jx7f" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.353875 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a35c0587-d301-49f5-b7a2-3d7d32efed87-images\") pod \"machine-config-operator-74547568cd-mvlmx\" (UID: \"a35c0587-d301-49f5-b7a2-3d7d32efed87\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mvlmx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.353938 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09111b55-9bde-41bf-8f13-51baf359a20c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6jx7f\" (UID: \"09111b55-9bde-41bf-8f13-51baf359a20c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6jx7f" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.360600 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b96kv" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.366790 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.385827 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-djkdm" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.390896 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.405632 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.410694 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-pnmnf" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.416749 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-czhpn" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.426000 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.431439 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wshkr" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.444498 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-c428g" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.447622 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.450407 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g4jcg" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.455181 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09111b55-9bde-41bf-8f13-51baf359a20c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6jx7f\" (UID: \"09111b55-9bde-41bf-8f13-51baf359a20c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6jx7f" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.455365 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a35c0587-d301-49f5-b7a2-3d7d32efed87-images\") pod \"machine-config-operator-74547568cd-mvlmx\" (UID: \"a35c0587-d301-49f5-b7a2-3d7d32efed87\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mvlmx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.455393 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09111b55-9bde-41bf-8f13-51baf359a20c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6jx7f\" (UID: \"09111b55-9bde-41bf-8f13-51baf359a20c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6jx7f" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.455437 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjqsl\" (UniqueName: \"kubernetes.io/projected/09111b55-9bde-41bf-8f13-51baf359a20c-kube-api-access-zjqsl\") pod \"kube-storage-version-migrator-operator-b67b599dd-6jx7f\" (UID: \"09111b55-9bde-41bf-8f13-51baf359a20c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6jx7f" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 
14:15:38.455592 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad26e28c-197b-4671-bd99-109e1c3527d7-config\") pod \"kube-apiserver-operator-766d6c64bb-5dmt6\" (UID: \"ad26e28c-197b-4671-bd99-109e1c3527d7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5dmt6" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.455618 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9xcz\" (UniqueName: \"kubernetes.io/projected/a35c0587-d301-49f5-b7a2-3d7d32efed87-kube-api-access-f9xcz\") pod \"machine-config-operator-74547568cd-mvlmx\" (UID: \"a35c0587-d301-49f5-b7a2-3d7d32efed87\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mvlmx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.455702 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad26e28c-197b-4671-bd99-109e1c3527d7-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-5dmt6\" (UID: \"ad26e28c-197b-4671-bd99-109e1c3527d7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5dmt6" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.455760 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a35c0587-d301-49f5-b7a2-3d7d32efed87-proxy-tls\") pod \"machine-config-operator-74547568cd-mvlmx\" (UID: \"a35c0587-d301-49f5-b7a2-3d7d32efed87\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mvlmx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.455825 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a35c0587-d301-49f5-b7a2-3d7d32efed87-auth-proxy-config\") pod \"machine-config-operator-74547568cd-mvlmx\" 
(UID: \"a35c0587-d301-49f5-b7a2-3d7d32efed87\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mvlmx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.455866 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad26e28c-197b-4671-bd99-109e1c3527d7-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-5dmt6\" (UID: \"ad26e28c-197b-4671-bd99-109e1c3527d7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5dmt6" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.465792 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.478196 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a35c0587-d301-49f5-b7a2-3d7d32efed87-auth-proxy-config\") pod \"machine-config-operator-74547568cd-mvlmx\" (UID: \"a35c0587-d301-49f5-b7a2-3d7d32efed87\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mvlmx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.490991 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.505854 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.527394 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.548996 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.572696 4751 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.587045 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 03 14:15:38 crc kubenswrapper[4751]: W1203 14:15:38.598717 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ba52da9_74a9_4dbc_aad3_649ecf8f67be.slice/crio-40d718098746765e18270e546e83ef896133936ea75857bc6bad93f778699562 WatchSource:0}: Error finding container 40d718098746765e18270e546e83ef896133936ea75857bc6bad93f778699562: Status 404 returned error can't find the container with id 40d718098746765e18270e546e83ef896133936ea75857bc6bad93f778699562 Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.605997 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.626235 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.646411 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.685200 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.685590 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.707215 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.725906 4751 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.726742 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad26e28c-197b-4671-bd99-109e1c3527d7-config\") pod \"kube-apiserver-operator-766d6c64bb-5dmt6\" (UID: \"ad26e28c-197b-4671-bd99-109e1c3527d7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5dmt6" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.739880 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad26e28c-197b-4671-bd99-109e1c3527d7-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-5dmt6\" (UID: \"ad26e28c-197b-4671-bd99-109e1c3527d7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5dmt6" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.746474 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.765445 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.786476 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.806699 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.828649 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.845888 4751 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a35c0587-d301-49f5-b7a2-3d7d32efed87-proxy-tls\") pod \"machine-config-operator-74547568cd-mvlmx\" (UID: \"a35c0587-d301-49f5-b7a2-3d7d32efed87\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mvlmx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.846071 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.857903 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a35c0587-d301-49f5-b7a2-3d7d32efed87-images\") pod \"machine-config-operator-74547568cd-mvlmx\" (UID: \"a35c0587-d301-49f5-b7a2-3d7d32efed87\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mvlmx" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.868012 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.885363 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.905758 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.925996 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 03 14:15:38 crc kubenswrapper[4751]: E1203 14:15:38.929216 4751 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-user-template-login: failed to sync secret cache: timed out waiting for the 
condition Dec 03 14:15:38 crc kubenswrapper[4751]: E1203 14:15:38.929283 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-template-login podName:dd4311c3-0b8c-4ad2-8b36-1bc543c188d3 nodeName:}" failed. No retries permitted until 2025-12-03 14:15:39.429263577 +0000 UTC m=+146.417618794 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-user-template-login" (UniqueName: "kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-template-login") pod "oauth-openshift-558db77b4-zcq45" (UID: "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3") : failed to sync secret cache: timed out waiting for the condition Dec 03 14:15:38 crc kubenswrapper[4751]: E1203 14:15:38.929525 4751 configmap.go:193] Couldn't get configMap openshift-machine-api/machine-api-operator-images: failed to sync configmap cache: timed out waiting for the condition Dec 03 14:15:38 crc kubenswrapper[4751]: E1203 14:15:38.929567 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ab66f440-24ed-4244-a972-63eee27b67b1-images podName:ab66f440-24ed-4244-a972-63eee27b67b1 nodeName:}" failed. No retries permitted until 2025-12-03 14:15:39.429558265 +0000 UTC m=+146.417913482 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/ab66f440-24ed-4244-a972-63eee27b67b1-images") pod "machine-api-operator-5694c8668f-2n4v9" (UID: "ab66f440-24ed-4244-a972-63eee27b67b1") : failed to sync configmap cache: timed out waiting for the condition Dec 03 14:15:38 crc kubenswrapper[4751]: E1203 14:15:38.929605 4751 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-system-router-certs: failed to sync secret cache: timed out waiting for the condition Dec 03 14:15:38 crc kubenswrapper[4751]: E1203 14:15:38.929630 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-router-certs podName:dd4311c3-0b8c-4ad2-8b36-1bc543c188d3 nodeName:}" failed. No retries permitted until 2025-12-03 14:15:39.429623236 +0000 UTC m=+146.417978453 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-router-certs" (UniqueName: "kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-router-certs") pod "oauth-openshift-558db77b4-zcq45" (UID: "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3") : failed to sync secret cache: timed out waiting for the condition Dec 03 14:15:38 crc kubenswrapper[4751]: E1203 14:15:38.930095 4751 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Dec 03 14:15:38 crc kubenswrapper[4751]: E1203 14:15:38.930206 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ab66f440-24ed-4244-a972-63eee27b67b1-config podName:ab66f440-24ed-4244-a972-63eee27b67b1 nodeName:}" failed. No retries permitted until 2025-12-03 14:15:39.430182331 +0000 UTC m=+146.418537548 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/ab66f440-24ed-4244-a972-63eee27b67b1-config") pod "machine-api-operator-5694c8668f-2n4v9" (UID: "ab66f440-24ed-4244-a972-63eee27b67b1") : failed to sync configmap cache: timed out waiting for the condition Dec 03 14:15:38 crc kubenswrapper[4751]: E1203 14:15:38.930251 4751 secret.go:188] Couldn't get secret openshift-machine-api/machine-api-operator-tls: failed to sync secret cache: timed out waiting for the condition Dec 03 14:15:38 crc kubenswrapper[4751]: E1203 14:15:38.930287 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab66f440-24ed-4244-a972-63eee27b67b1-machine-api-operator-tls podName:ab66f440-24ed-4244-a972-63eee27b67b1 nodeName:}" failed. No retries permitted until 2025-12-03 14:15:39.430276024 +0000 UTC m=+146.418631241 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/ab66f440-24ed-4244-a972-63eee27b67b1-machine-api-operator-tls") pod "machine-api-operator-5694c8668f-2n4v9" (UID: "ab66f440-24ed-4244-a972-63eee27b67b1") : failed to sync secret cache: timed out waiting for the condition Dec 03 14:15:38 crc kubenswrapper[4751]: E1203 14:15:38.931050 4751 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-user-idp-0-file-data: failed to sync secret cache: timed out waiting for the condition Dec 03 14:15:38 crc kubenswrapper[4751]: E1203 14:15:38.931112 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-idp-0-file-data podName:dd4311c3-0b8c-4ad2-8b36-1bc543c188d3 nodeName:}" failed. No retries permitted until 2025-12-03 14:15:39.431099486 +0000 UTC m=+146.419454873 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "v4-0-config-user-idp-0-file-data" (UniqueName: "kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-idp-0-file-data") pod "oauth-openshift-558db77b4-zcq45" (UID: "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3") : failed to sync secret cache: timed out waiting for the condition Dec 03 14:15:38 crc kubenswrapper[4751]: E1203 14:15:38.931147 4751 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-user-template-provider-selection: failed to sync secret cache: timed out waiting for the condition Dec 03 14:15:38 crc kubenswrapper[4751]: E1203 14:15:38.931179 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-template-provider-selection podName:dd4311c3-0b8c-4ad2-8b36-1bc543c188d3 nodeName:}" failed. No retries permitted until 2025-12-03 14:15:39.431168117 +0000 UTC m=+146.419523544 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-user-template-provider-selection" (UniqueName: "kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-template-provider-selection") pod "oauth-openshift-558db77b4-zcq45" (UID: "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3") : failed to sync secret cache: timed out waiting for the condition Dec 03 14:15:38 crc kubenswrapper[4751]: E1203 14:15:38.931236 4751 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: failed to sync configmap cache: timed out waiting for the condition Dec 03 14:15:38 crc kubenswrapper[4751]: E1203 14:15:38.931279 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-cliconfig podName:dd4311c3-0b8c-4ad2-8b36-1bc543c188d3 nodeName:}" failed. No retries permitted until 2025-12-03 14:15:39.4312709 +0000 UTC m=+146.419626117 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-cliconfig") pod "oauth-openshift-558db77b4-zcq45" (UID: "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3") : failed to sync configmap cache: timed out waiting for the condition Dec 03 14:15:38 crc kubenswrapper[4751]: E1203 14:15:38.932034 4751 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-user-template-error: failed to sync secret cache: timed out waiting for the condition Dec 03 14:15:38 crc kubenswrapper[4751]: E1203 14:15:38.932075 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-template-error podName:dd4311c3-0b8c-4ad2-8b36-1bc543c188d3 nodeName:}" failed. No retries permitted until 2025-12-03 14:15:39.432066061 +0000 UTC m=+146.420421278 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-user-template-error" (UniqueName: "kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-template-error") pod "oauth-openshift-558db77b4-zcq45" (UID: "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3") : failed to sync secret cache: timed out waiting for the condition Dec 03 14:15:38 crc kubenswrapper[4751]: E1203 14:15:38.932093 4751 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-system-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 03 14:15:38 crc kubenswrapper[4751]: E1203 14:15:38.932112 4751 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-system-ocp-branding-template: failed to sync secret cache: timed out waiting for the condition Dec 03 14:15:38 crc kubenswrapper[4751]: E1203 14:15:38.932115 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-serving-cert 
podName:dd4311c3-0b8c-4ad2-8b36-1bc543c188d3 nodeName:}" failed. No retries permitted until 2025-12-03 14:15:39.432109232 +0000 UTC m=+146.420464449 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-serving-cert" (UniqueName: "kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-serving-cert") pod "oauth-openshift-558db77b4-zcq45" (UID: "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3") : failed to sync secret cache: timed out waiting for the condition Dec 03 14:15:38 crc kubenswrapper[4751]: E1203 14:15:38.932170 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-ocp-branding-template podName:dd4311c3-0b8c-4ad2-8b36-1bc543c188d3 nodeName:}" failed. No retries permitted until 2025-12-03 14:15:39.432161494 +0000 UTC m=+146.420516901 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-ocp-branding-template" (UniqueName: "kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-ocp-branding-template") pod "oauth-openshift-558db77b4-zcq45" (UID: "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3") : failed to sync secret cache: timed out waiting for the condition Dec 03 14:15:38 crc kubenswrapper[4751]: E1203 14:15:38.933455 4751 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-service-ca: failed to sync configmap cache: timed out waiting for the condition Dec 03 14:15:38 crc kubenswrapper[4751]: E1203 14:15:38.933461 4751 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-system-session: failed to sync secret cache: timed out waiting for the condition Dec 03 14:15:38 crc kubenswrapper[4751]: E1203 14:15:38.933497 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-service-ca podName:dd4311c3-0b8c-4ad2-8b36-1bc543c188d3 nodeName:}" 
failed. No retries permitted until 2025-12-03 14:15:39.433487479 +0000 UTC m=+146.421842696 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-service-ca" (UniqueName: "kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-service-ca") pod "oauth-openshift-558db77b4-zcq45" (UID: "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3") : failed to sync configmap cache: timed out waiting for the condition Dec 03 14:15:38 crc kubenswrapper[4751]: E1203 14:15:38.933522 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-session podName:dd4311c3-0b8c-4ad2-8b36-1bc543c188d3 nodeName:}" failed. No retries permitted until 2025-12-03 14:15:39.433506919 +0000 UTC m=+146.421862326 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-session" (UniqueName: "kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-session") pod "oauth-openshift-558db77b4-zcq45" (UID: "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3") : failed to sync secret cache: timed out waiting for the condition Dec 03 14:15:38 crc kubenswrapper[4751]: E1203 14:15:38.933552 4751 configmap.go:193] Couldn't get configMap openshift-authentication/audit: failed to sync configmap cache: timed out waiting for the condition Dec 03 14:15:38 crc kubenswrapper[4751]: E1203 14:15:38.933647 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-audit-policies podName:dd4311c3-0b8c-4ad2-8b36-1bc543c188d3 nodeName:}" failed. No retries permitted until 2025-12-03 14:15:39.433625123 +0000 UTC m=+146.421980340 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "audit-policies" (UniqueName: "kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-audit-policies") pod "oauth-openshift-558db77b4-zcq45" (UID: "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3") : failed to sync configmap cache: timed out waiting for the condition Dec 03 14:15:38 crc kubenswrapper[4751]: E1203 14:15:38.936391 4751 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Dec 03 14:15:38 crc kubenswrapper[4751]: E1203 14:15:38.936442 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-trusted-ca-bundle podName:dd4311c3-0b8c-4ad2-8b36-1bc543c188d3 nodeName:}" failed. No retries permitted until 2025-12-03 14:15:39.436430137 +0000 UTC m=+146.424785354 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-trusted-ca-bundle") pod "oauth-openshift-558db77b4-zcq45" (UID: "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3") : failed to sync configmap cache: timed out waiting for the condition Dec 03 14:15:38 crc kubenswrapper[4751]: E1203 14:15:38.948181 4751 projected.go:288] Couldn't get configMap openshift-machine-api/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.948292 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.966797 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.973943 4751 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wshkr" event={"ID":"1ba52da9-74a9-4dbc-aad3-649ecf8f67be","Type":"ContainerStarted","Data":"7918dc070a98927af87c293270df5a5648dd0472d440404f2eaa01a62c60016b"} Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.973990 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wshkr" event={"ID":"1ba52da9-74a9-4dbc-aad3-649ecf8f67be","Type":"ContainerStarted","Data":"40d718098746765e18270e546e83ef896133936ea75857bc6bad93f778699562"} Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.976213 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2b78"] Dec 03 14:15:38 crc kubenswrapper[4751]: E1203 14:15:38.977774 4751 projected.go:288] Couldn't get configMap openshift-authentication/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Dec 03 14:15:38 crc kubenswrapper[4751]: I1203 14:15:38.987575 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 03 14:15:39 crc kubenswrapper[4751]: W1203 14:15:39.008129 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ca258dc_b8d3_417d_9a07_9d538a778e41.slice/crio-d7b60aa51fd973c77d310eadb2c4becdb5c39c7fc06ea20de4cbf492d78abf8a WatchSource:0}: Error finding container d7b60aa51fd973c77d310eadb2c4becdb5c39c7fc06ea20de4cbf492d78abf8a: Status 404 returned error can't find the container with id d7b60aa51fd973c77d310eadb2c4becdb5c39c7fc06ea20de4cbf492d78abf8a Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.009467 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.022227 
4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-6zks7"] Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.026400 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.028110 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fwp52"] Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.031183 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-mc2fd"] Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.031833 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zzcqh"] Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.040392 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx"] Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.041661 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pqqnv"] Dec 03 14:15:39 crc kubenswrapper[4751]: W1203 14:15:39.042854 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb5bca62_c4b9_4e79_a752_96185f22b757.slice/crio-34f1745ad1eba8e9f3cc55ce490ef1dd5671919d0e20f0341be439d40672dd22 WatchSource:0}: Error finding container 34f1745ad1eba8e9f3cc55ce490ef1dd5671919d0e20f0341be439d40672dd22: Status 404 returned error can't find the container with id 34f1745ad1eba8e9f3cc55ce490ef1dd5671919d0e20f0341be439d40672dd22 Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.044924 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b96kv"] Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.046602 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 03 14:15:39 crc kubenswrapper[4751]: W1203 14:15:39.057510 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ced64e5_348e_4211_bbeb_9853697b75e3.slice/crio-f33b0e6c66dc8005099b551b876c37a120307c0623f9dd32ed6442d23c71a4a1 WatchSource:0}: Error finding container f33b0e6c66dc8005099b551b876c37a120307c0623f9dd32ed6442d23c71a4a1: Status 404 returned error can't find the container with id f33b0e6c66dc8005099b551b876c37a120307c0623f9dd32ed6442d23c71a4a1 Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.066136 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 03 14:15:39 crc kubenswrapper[4751]: W1203 14:15:39.076129 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9f280b6_c725_4857_a658_6f3073b30fdf.slice/crio-fbd160399b50100f203f7b65d8952c7662e403bf863233ebb99db1035812b88e WatchSource:0}: Error finding container fbd160399b50100f203f7b65d8952c7662e403bf863233ebb99db1035812b88e: Status 404 returned error can't find the container with id fbd160399b50100f203f7b65d8952c7662e403bf863233ebb99db1035812b88e Dec 03 14:15:39 crc kubenswrapper[4751]: W1203 14:15:39.086271 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e9f6fd3_3c79_48a8_a551_128f73f63dd7.slice/crio-6e8fb136d482a795e7e602bed6bc4a0dced61168b27a6f8fba3b021bae94b64d WatchSource:0}: Error finding container 6e8fb136d482a795e7e602bed6bc4a0dced61168b27a6f8fba3b021bae94b64d: Status 404 returned error 
can't find the container with id 6e8fb136d482a795e7e602bed6bc4a0dced61168b27a6f8fba3b021bae94b64d Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.086850 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.106766 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.124440 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-czhpn"] Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.125412 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-pnmnf"] Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.126315 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.132042 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-djkdm"] Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.138396 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pwknx"] Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.146431 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-c428g"] Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.148393 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.166448 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 03 14:15:39 crc kubenswrapper[4751]: W1203 14:15:39.171877 4751 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73222926_1acd_41d2_8b69_79ed24aaf6d5.slice/crio-464ca3f6e8e0e35b975fa6aa6116ebd2ed6339405ca9520b6f5925dac2270cc8 WatchSource:0}: Error finding container 464ca3f6e8e0e35b975fa6aa6116ebd2ed6339405ca9520b6f5925dac2270cc8: Status 404 returned error can't find the container with id 464ca3f6e8e0e35b975fa6aa6116ebd2ed6339405ca9520b6f5925dac2270cc8 Dec 03 14:15:39 crc kubenswrapper[4751]: W1203 14:15:39.175351 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90549328_bfce_4bd2_b1cc_651a2c9cd2e1.slice/crio-b259835677e3fa81b89ed07ec998455146d973f56a14caf6bb66b13273315bef WatchSource:0}: Error finding container b259835677e3fa81b89ed07ec998455146d973f56a14caf6bb66b13273315bef: Status 404 returned error can't find the container with id b259835677e3fa81b89ed07ec998455146d973f56a14caf6bb66b13273315bef Dec 03 14:15:39 crc kubenswrapper[4751]: W1203 14:15:39.176709 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda112208b_e069_48e0_8bc4_d6c4e79052fc.slice/crio-cb27c8c07a1adac88b4af2dbc548d9f692e228595028b294d10e56f342fe57f7 WatchSource:0}: Error finding container cb27c8c07a1adac88b4af2dbc548d9f692e228595028b294d10e56f342fe57f7: Status 404 returned error can't find the container with id cb27c8c07a1adac88b4af2dbc548d9f692e228595028b294d10e56f342fe57f7 Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.186269 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.205085 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.239523 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-config-operator/openshift-config-operator-7777fb866f-g4jcg"] Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.245783 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 03 14:15:39 crc kubenswrapper[4751]: W1203 14:15:39.255755 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbea4ae1a_c496_47de_9d12_1d6d42793bd2.slice/crio-cb15be09190f22d8b8241ed59d5f96e75d45c8ca7a60c1cf2fdc78015c5d0c61 WatchSource:0}: Error finding container cb15be09190f22d8b8241ed59d5f96e75d45c8ca7a60c1cf2fdc78015c5d0c61: Status 404 returned error can't find the container with id cb15be09190f22d8b8241ed59d5f96e75d45c8ca7a60c1cf2fdc78015c5d0c61 Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.263562 4751 request.go:700] Waited for 1.000521523s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.267616 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.271637 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.271995 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.272118 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.273259 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:15:39 crc kubenswrapper[4751]: E1203 14:15:39.273531 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:17:41.27349664 +0000 UTC m=+268.261851887 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.280855 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.299902 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.310223 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.322780 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09111b55-9bde-41bf-8f13-51baf359a20c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6jx7f\" (UID: \"09111b55-9bde-41bf-8f13-51baf359a20c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6jx7f" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.326453 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 
14:15:39.336344 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09111b55-9bde-41bf-8f13-51baf359a20c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6jx7f\" (UID: \"09111b55-9bde-41bf-8f13-51baf359a20c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6jx7f" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.347452 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.366024 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.373224 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.373368 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.377107 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.378462 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.386308 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.408096 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.426915 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.445280 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.467875 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.474771 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.474812 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.474841 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.474863 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.474902 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-audit-policies\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.474922 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.474947 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.474969 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.474999 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ab66f440-24ed-4244-a972-63eee27b67b1-images\") pod \"machine-api-operator-5694c8668f-2n4v9\" (UID: \"ab66f440-24ed-4244-a972-63eee27b67b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2n4v9" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.475040 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 
14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.475056 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.475086 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab66f440-24ed-4244-a972-63eee27b67b1-config\") pod \"machine-api-operator-5694c8668f-2n4v9\" (UID: \"ab66f440-24ed-4244-a972-63eee27b67b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2n4v9" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.475105 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ab66f440-24ed-4244-a972-63eee27b67b1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-2n4v9\" (UID: \"ab66f440-24ed-4244-a972-63eee27b67b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2n4v9" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.475130 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.475156 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.486797 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.505970 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.526984 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.531574 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.538991 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.545833 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.560599 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.567162 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.586080 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.626858 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.647151 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.665632 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.701884 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.707944 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.727900 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.755816 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.773537 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.787914 
4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.813051 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.827584 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.868293 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.870295 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.888787 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.905172 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.928653 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 03 14:15:39 crc kubenswrapper[4751]: E1203 14:15:39.948569 4751 projected.go:288] Couldn't get configMap openshift-machine-api/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Dec 03 14:15:39 crc kubenswrapper[4751]: E1203 14:15:39.948623 4751 projected.go:194] Error preparing data for projected volume kube-api-access-fnl74 for pod openshift-machine-api/machine-api-operator-5694c8668f-2n4v9: failed to sync configmap cache: timed out waiting for the condition Dec 03 
14:15:39 crc kubenswrapper[4751]: E1203 14:15:39.948698 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ab66f440-24ed-4244-a972-63eee27b67b1-kube-api-access-fnl74 podName:ab66f440-24ed-4244-a972-63eee27b67b1 nodeName:}" failed. No retries permitted until 2025-12-03 14:15:40.448678612 +0000 UTC m=+147.437033829 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-fnl74" (UniqueName: "kubernetes.io/projected/ab66f440-24ed-4244-a972-63eee27b67b1-kube-api-access-fnl74") pod "machine-api-operator-5694c8668f-2n4v9" (UID: "ab66f440-24ed-4244-a972-63eee27b67b1") : failed to sync configmap cache: timed out waiting for the condition Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.956114 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 03 14:15:39 crc kubenswrapper[4751]: E1203 14:15:39.978087 4751 projected.go:288] Couldn't get configMap openshift-authentication/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Dec 03 14:15:39 crc kubenswrapper[4751]: E1203 14:15:39.978143 4751 projected.go:194] Error preparing data for projected volume kube-api-access-kr7rd for pod openshift-authentication/oauth-openshift-558db77b4-zcq45: failed to sync configmap cache: timed out waiting for the condition Dec 03 14:15:39 crc kubenswrapper[4751]: E1203 14:15:39.978249 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-kube-api-access-kr7rd podName:dd4311c3-0b8c-4ad2-8b36-1bc543c188d3 nodeName:}" failed. No retries permitted until 2025-12-03 14:15:40.478217066 +0000 UTC m=+147.466572283 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kr7rd" (UniqueName: "kubernetes.io/projected/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-kube-api-access-kr7rd") pod "oauth-openshift-558db77b4-zcq45" (UID: "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3") : failed to sync configmap cache: timed out waiting for the condition Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.986992 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.992851 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.993677 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-6zks7" event={"ID":"7821f968-1e72-4450-9a62-9f60d11121c3","Type":"ContainerStarted","Data":"6ba25b274471984a3ea7aa7738cbaee31dd44bf670665c14011afa38be1daad4"} Dec 03 14:15:39 crc kubenswrapper[4751]: I1203 14:15:39.993726 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-6zks7" event={"ID":"7821f968-1e72-4450-9a62-9f60d11121c3","Type":"ContainerStarted","Data":"35228855dc63c7338c540aabd739cf2ef9f00e74ddcc78dbca8c3ba592b1eba5"} Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.011818 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b96kv" event={"ID":"2e3bbe41-d57d-447e-8fe2-b7dab3af07f5","Type":"ContainerStarted","Data":"24f6e1ce374e9b9f4bdd2167b9dc0cd36e2382dc7b9902bdce2b6e971dd95848"} Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.012488 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b96kv" 
event={"ID":"2e3bbe41-d57d-447e-8fe2-b7dab3af07f5","Type":"ContainerStarted","Data":"790d32a46380ab3e444d0ef705d4d7c59b087a24c3ca93e9330a623f74fced42"} Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.022792 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.025578 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-c428g" event={"ID":"03d2bc09-2a61-4820-a697-7d2c5242acdb","Type":"ContainerStarted","Data":"f103dd61c9bd3146cc95442bfa2b27ee4abe39345cd2367e89a024d697c2e557"} Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.025640 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-c428g" event={"ID":"03d2bc09-2a61-4820-a697-7d2c5242acdb","Type":"ContainerStarted","Data":"fc65ff655a91449b41eadd7e0d4338f565282ca26c968c64378d0ded6c2b9a9b"} Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.025661 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-c428g" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.034105 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.039071 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4470769bd677b4950d55b44bb5f0ab3a0c1f84877d43d84ab05df05c107de2e5"} Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.048705 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.066561 4751 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-pnmnf" event={"ID":"73222926-1acd-41d2-8b69-79ed24aaf6d5","Type":"ContainerStarted","Data":"5aece7bbf2cde18ea521340a38a94eaa13debe503585e0d1b3eff26a2aaa6ed8"} Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.066651 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-pnmnf" event={"ID":"73222926-1acd-41d2-8b69-79ed24aaf6d5","Type":"ContainerStarted","Data":"464ca3f6e8e0e35b975fa6aa6116ebd2ed6339405ca9520b6f5925dac2270cc8"} Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.069376 4751 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.073369 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"767decc7fb8b419213af1abc40b73bf27389063b08bafb1b52d94d4002a9f3e6"} Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.075827 4751 generic.go:334] "Generic (PLEG): container finished" podID="0ced64e5-348e-4211-bbeb-9853697b75e3" containerID="ff2d2ab532c4861fd0d21805ebe0ae5078af9efa80fe6def3b6d32c31c8b7d7e" exitCode=0 Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.075926 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" event={"ID":"0ced64e5-348e-4211-bbeb-9853697b75e3","Type":"ContainerDied","Data":"ff2d2ab532c4861fd0d21805ebe0ae5078af9efa80fe6def3b6d32c31c8b7d7e"} Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.075969 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" 
event={"ID":"0ced64e5-348e-4211-bbeb-9853697b75e3","Type":"ContainerStarted","Data":"f33b0e6c66dc8005099b551b876c37a120307c0623f9dd32ed6442d23c71a4a1"} Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.082847 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pqqnv" event={"ID":"3e9f6fd3-3c79-48a8-a551-128f73f63dd7","Type":"ContainerStarted","Data":"e0d52ed0c81222d9781a490c31e31b1cb55dd692764a3706c2eb8d1ad0fe1272"} Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.082925 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pqqnv" event={"ID":"3e9f6fd3-3c79-48a8-a551-128f73f63dd7","Type":"ContainerStarted","Data":"6e8fb136d482a795e7e602bed6bc4a0dced61168b27a6f8fba3b021bae94b64d"} Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.084462 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pqqnv" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.091827 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.092792 4751 generic.go:334] "Generic (PLEG): container finished" podID="90549328-bfce-4bd2-b1cc-651a2c9cd2e1" containerID="82d30ec2f51fa6e2d0eb171cce430951cc5b69ab54c2ca86274a34fd11b57811" exitCode=0 Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.093605 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pwknx" event={"ID":"90549328-bfce-4bd2-b1cc-651a2c9cd2e1","Type":"ContainerDied","Data":"82d30ec2f51fa6e2d0eb171cce430951cc5b69ab54c2ca86274a34fd11b57811"} Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.093634 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pwknx" 
event={"ID":"90549328-bfce-4bd2-b1cc-651a2c9cd2e1","Type":"ContainerStarted","Data":"b259835677e3fa81b89ed07ec998455146d973f56a14caf6bb66b13273315bef"} Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.100029 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pqqnv" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.109287 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-djkdm" event={"ID":"a112208b-e069-48e0-8bc4-d6c4e79052fc","Type":"ContainerStarted","Data":"0524f4c77b801ec459178348a0bff6e2c35fd49936c4815280647398d22f665c"} Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.109320 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-djkdm" event={"ID":"a112208b-e069-48e0-8bc4-d6c4e79052fc","Type":"ContainerStarted","Data":"cb27c8c07a1adac88b4af2dbc548d9f692e228595028b294d10e56f342fe57f7"} Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.145345 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjqsl\" (UniqueName: \"kubernetes.io/projected/09111b55-9bde-41bf-8f13-51baf359a20c-kube-api-access-zjqsl\") pod \"kube-storage-version-migrator-operator-b67b599dd-6jx7f\" (UID: \"09111b55-9bde-41bf-8f13-51baf359a20c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6jx7f" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.164295 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fwp52" event={"ID":"eb5bca62-c4b9-4e79-a752-96185f22b757","Type":"ContainerStarted","Data":"544b8f3f81d4877cabd5ff3867da444a505bb6b92b6c579e573e978edcc8ab24"} Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.164375 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-fwp52" event={"ID":"eb5bca62-c4b9-4e79-a752-96185f22b757","Type":"ContainerStarted","Data":"34f1745ad1eba8e9f3cc55ce490ef1dd5671919d0e20f0341be439d40672dd22"} Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.166387 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-fwp52" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.173265 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.179940 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-audit-policies\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.180921 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad26e28c-197b-4671-bd99-109e1c3527d7-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-5dmt6\" (UID: \"ad26e28c-197b-4671-bd99-109e1c3527d7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5dmt6" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.181023 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9xcz\" (UniqueName: \"kubernetes.io/projected/a35c0587-d301-49f5-b7a2-3d7d32efed87-kube-api-access-f9xcz\") pod \"machine-config-operator-74547568cd-mvlmx\" (UID: \"a35c0587-d301-49f5-b7a2-3d7d32efed87\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mvlmx" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.201630 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-fwp52" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.201379 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.213552 4751 generic.go:334] "Generic (PLEG): container finished" podID="bea4ae1a-c496-47de-9d12-1d6d42793bd2" containerID="bb3f7e36004d96635eebcbd94790edfb92164fdd96f7a9bc01d458129a74c478" exitCode=0 Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.213614 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g4jcg" event={"ID":"bea4ae1a-c496-47de-9d12-1d6d42793bd2","Type":"ContainerDied","Data":"bb3f7e36004d96635eebcbd94790edfb92164fdd96f7a9bc01d458129a74c478"} Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.213639 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g4jcg" event={"ID":"bea4ae1a-c496-47de-9d12-1d6d42793bd2","Type":"ContainerStarted","Data":"cb15be09190f22d8b8241ed59d5f96e75d45c8ca7a60c1cf2fdc78015c5d0c61"} Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.228556 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.232166 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.232391 4751 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"kube-root-ca.crt" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.233191 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.237425 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-czhpn" event={"ID":"18505c2b-ef26-4f6e-8471-21a77aab056b","Type":"ContainerStarted","Data":"c1a8c7aff5d23b9eb9edc09985d675d4c48cd7611945b7e41e77195a4e27cc0e"} Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.237458 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-czhpn" event={"ID":"18505c2b-ef26-4f6e-8471-21a77aab056b","Type":"ContainerStarted","Data":"9fc7fc3ec3fb357b7c633259a6f022db7b5fd0252274c0b7cfaf5056b113b432"} Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.247639 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.251502 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-mc2fd" event={"ID":"a9f280b6-c725-4857-a658-6f3073b30fdf","Type":"ContainerStarted","Data":"697dd5a09bf227056f89c84618fd0e3ba0faf989347c456d9ddc020f5f7b279f"} Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.251551 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-mc2fd" 
event={"ID":"a9f280b6-c725-4857-a658-6f3073b30fdf","Type":"ContainerStarted","Data":"fbd160399b50100f203f7b65d8952c7662e403bf863233ebb99db1035812b88e"} Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.252289 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-mc2fd" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.256611 4751 patch_prober.go:28] interesting pod/downloads-7954f5f757-mc2fd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.256711 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mc2fd" podUID="a9f280b6-c725-4857-a658-6f3073b30fdf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.258159 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ab66f440-24ed-4244-a972-63eee27b67b1-images\") pod \"machine-api-operator-5694c8668f-2n4v9\" (UID: \"ab66f440-24ed-4244-a972-63eee27b67b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2n4v9" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.264163 4751 request.go:700] Waited for 1.4805632s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-api/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.266848 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2b78" 
event={"ID":"9ca258dc-b8d3-417d-9a07-9d538a778e41","Type":"ContainerStarted","Data":"e5004459eb264f5a2bfac78179c298e5b7d912b904ba5f81c1b592d05ad9ea43"} Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.267406 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2b78" event={"ID":"9ca258dc-b8d3-417d-9a07-9d538a778e41","Type":"ContainerStarted","Data":"d7b60aa51fd973c77d310eadb2c4becdb5c39c7fc06ea20de4cbf492d78abf8a"} Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.273578 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.285953 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zzcqh" event={"ID":"3af90155-5ff4-4079-896c-c6a03fcaa809","Type":"ContainerStarted","Data":"41480d7a95572822f6dcccac4d5d5e5e2a96378b9f47d4ade8529b0a5d19339a"} Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.285998 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zzcqh" event={"ID":"3af90155-5ff4-4079-896c-c6a03fcaa809","Type":"ContainerStarted","Data":"db727dceb31a41c905217bcd7af00907adda15410411ec4c43287c28675d1c07"} Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.286012 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zzcqh" event={"ID":"3af90155-5ff4-4079-896c-c6a03fcaa809","Type":"ContainerStarted","Data":"2a1da8318cf00737bcf45c91bcff2c8bae0f6a6b37e3d3de3ed94fee126f8bbd"} Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.291426 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wshkr" 
event={"ID":"1ba52da9-74a9-4dbc-aad3-649ecf8f67be","Type":"ContainerStarted","Data":"74dc1e390d4a00db63803b19f5b8ee9691817792c07c8e5877f3d15c4e29f68a"} Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.299275 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.305949 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.310730 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.338665 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.352545 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.359748 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.365615 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5dmt6" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.366687 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.382639 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mvlmx" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.392299 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ab66f440-24ed-4244-a972-63eee27b67b1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-2n4v9\" (UID: \"ab66f440-24ed-4244-a972-63eee27b67b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2n4v9" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.403095 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.406208 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.410253 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.414870 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.435855 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.444453 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6jx7f" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.445056 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.445953 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.446928 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-c428g" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.447750 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.471558 
4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 03 14:15:40 crc kubenswrapper[4751]: E1203 14:15:40.475463 4751 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: failed to sync configmap cache: timed out waiting for the condition Dec 03 14:15:40 crc kubenswrapper[4751]: E1203 14:15:40.475586 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-cliconfig podName:dd4311c3-0b8c-4ad2-8b36-1bc543c188d3 nodeName:}" failed. No retries permitted until 2025-12-03 14:15:41.475565591 +0000 UTC m=+148.463920808 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-cliconfig") pod "oauth-openshift-558db77b4-zcq45" (UID: "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3") : failed to sync configmap cache: timed out waiting for the condition Dec 03 14:15:40 crc kubenswrapper[4751]: E1203 14:15:40.475909 4751 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-user-template-provider-selection: failed to sync secret cache: timed out waiting for the condition Dec 03 14:15:40 crc kubenswrapper[4751]: E1203 14:15:40.475940 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-template-provider-selection podName:dd4311c3-0b8c-4ad2-8b36-1bc543c188d3 nodeName:}" failed. No retries permitted until 2025-12-03 14:15:41.475931671 +0000 UTC m=+148.464286888 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "v4-0-config-user-template-provider-selection" (UniqueName: "kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-template-provider-selection") pod "oauth-openshift-558db77b4-zcq45" (UID: "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3") : failed to sync secret cache: timed out waiting for the condition Dec 03 14:15:40 crc kubenswrapper[4751]: E1203 14:15:40.476287 4751 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Dec 03 14:15:40 crc kubenswrapper[4751]: E1203 14:15:40.476363 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ab66f440-24ed-4244-a972-63eee27b67b1-config podName:ab66f440-24ed-4244-a972-63eee27b67b1 nodeName:}" failed. No retries permitted until 2025-12-03 14:15:41.476345762 +0000 UTC m=+148.464701049 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/ab66f440-24ed-4244-a972-63eee27b67b1-config") pod "machine-api-operator-5694c8668f-2n4v9" (UID: "ab66f440-24ed-4244-a972-63eee27b67b1") : failed to sync configmap cache: timed out waiting for the condition Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.482643 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.492971 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr7rd\" (UniqueName: \"kubernetes.io/projected/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-kube-api-access-kr7rd\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: 
\"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.493059 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnl74\" (UniqueName: \"kubernetes.io/projected/ab66f440-24ed-4244-a972-63eee27b67b1-kube-api-access-fnl74\") pod \"machine-api-operator-5694c8668f-2n4v9\" (UID: \"ab66f440-24ed-4244-a972-63eee27b67b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2n4v9" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.494654 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.503337 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnl74\" (UniqueName: \"kubernetes.io/projected/ab66f440-24ed-4244-a972-63eee27b67b1-kube-api-access-fnl74\") pod \"machine-api-operator-5694c8668f-2n4v9\" (UID: \"ab66f440-24ed-4244-a972-63eee27b67b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2n4v9" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.511888 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr7rd\" (UniqueName: \"kubernetes.io/projected/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-kube-api-access-kr7rd\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.535763 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.550656 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 
14:15:40.576605 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.602251 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.704357 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b7q8\" (UniqueName: \"kubernetes.io/projected/f55e95a4-8b8f-4875-9908-33b7a94f58ea-kube-api-access-9b7q8\") pod \"machine-config-server-56xrz\" (UID: \"f55e95a4-8b8f-4875-9908-33b7a94f58ea\") " pod="openshift-machine-config-operator/machine-config-server-56xrz" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.704413 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea973fa0-4ecd-4b81-b6be-4c3d5e3b0043-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-trvft\" (UID: \"ea973fa0-4ecd-4b81-b6be-4c3d5e3b0043\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-trvft" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.704438 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea973fa0-4ecd-4b81-b6be-4c3d5e3b0043-config\") pod \"kube-controller-manager-operator-78b949d7b-trvft\" (UID: \"ea973fa0-4ecd-4b81-b6be-4c3d5e3b0043\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-trvft" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.704458 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b901e1a9-281f-4928-b82b-4b7334b98f4d-apiservice-cert\") pod 
\"packageserver-d55dfcdfc-l7qh4\" (UID: \"b901e1a9-281f-4928-b82b-4b7334b98f4d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qh4" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.704497 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0fa4dfe3-601e-46c7-985c-af563252fd74-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4qthd\" (UID: \"0fa4dfe3-601e-46c7-985c-af563252fd74\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4qthd" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.704520 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2jzk\" (UniqueName: \"kubernetes.io/projected/0fa4dfe3-601e-46c7-985c-af563252fd74-kube-api-access-v2jzk\") pod \"ingress-operator-5b745b69d9-4qthd\" (UID: \"0fa4dfe3-601e-46c7-985c-af563252fd74\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4qthd" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.704554 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/f55e95a4-8b8f-4875-9908-33b7a94f58ea-node-bootstrap-token\") pod \"machine-config-server-56xrz\" (UID: \"f55e95a4-8b8f-4875-9908-33b7a94f58ea\") " pod="openshift-machine-config-operator/machine-config-server-56xrz" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.704578 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f4d710c-f625-4379-b5cf-cc5df715b6bd-config-volume\") pod \"dns-default-hvd8w\" (UID: \"6f4d710c-f625-4379-b5cf-cc5df715b6bd\") " pod="openshift-dns/dns-default-hvd8w" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.704611 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2086abe6-48e1-4593-9789-b098b9b3142d-trusted-ca\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.704634 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ea973fa0-4ecd-4b81-b6be-4c3d5e3b0043-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-trvft\" (UID: \"ea973fa0-4ecd-4b81-b6be-4c3d5e3b0043\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-trvft" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.704678 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0fa4dfe3-601e-46c7-985c-af563252fd74-metrics-tls\") pod \"ingress-operator-5b745b69d9-4qthd\" (UID: \"0fa4dfe3-601e-46c7-985c-af563252fd74\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4qthd" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.704753 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvtgv\" (UniqueName: \"kubernetes.io/projected/ec0d089c-a47a-4ef7-b422-756a4cf8487a-kube-api-access-zvtgv\") pod \"olm-operator-6b444d44fb-9ch7z\" (UID: \"ec0d089c-a47a-4ef7-b422-756a4cf8487a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ch7z" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.704807 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms9sg\" (UniqueName: \"kubernetes.io/projected/105f1cdb-fae2-4fcb-8548-ae90bdcbb75f-kube-api-access-ms9sg\") pod 
\"etcd-operator-b45778765-5kgjd\" (UID: \"105f1cdb-fae2-4fcb-8548-ae90bdcbb75f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5kgjd" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.704868 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4crv\" (UniqueName: \"kubernetes.io/projected/6f4d710c-f625-4379-b5cf-cc5df715b6bd-kube-api-access-k4crv\") pod \"dns-default-hvd8w\" (UID: \"6f4d710c-f625-4379-b5cf-cc5df715b6bd\") " pod="openshift-dns/dns-default-hvd8w" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.704892 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f4d710c-f625-4379-b5cf-cc5df715b6bd-metrics-tls\") pod \"dns-default-hvd8w\" (UID: \"6f4d710c-f625-4379-b5cf-cc5df715b6bd\") " pod="openshift-dns/dns-default-hvd8w" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.704916 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlg46\" (UniqueName: \"kubernetes.io/projected/c4d1b134-55b3-4b2e-92da-b8c5416c13a5-kube-api-access-wlg46\") pod \"router-default-5444994796-qjtlv\" (UID: \"c4d1b134-55b3-4b2e-92da-b8c5416c13a5\") " pod="openshift-ingress/router-default-5444994796-qjtlv" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.704978 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2086abe6-48e1-4593-9789-b098b9b3142d-bound-sa-token\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.705034 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/2086abe6-48e1-4593-9789-b098b9b3142d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.705059 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdxmv\" (UniqueName: \"kubernetes.io/projected/89c33472-8c62-4a71-9b17-697f9a0bbc65-kube-api-access-rdxmv\") pod \"control-plane-machine-set-operator-78cbb6b69f-pppn7\" (UID: \"89c33472-8c62-4a71-9b17-697f9a0bbc65\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pppn7" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.705094 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/105f1cdb-fae2-4fcb-8548-ae90bdcbb75f-etcd-ca\") pod \"etcd-operator-b45778765-5kgjd\" (UID: \"105f1cdb-fae2-4fcb-8548-ae90bdcbb75f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5kgjd" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.705144 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/89c33472-8c62-4a71-9b17-697f9a0bbc65-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-pppn7\" (UID: \"89c33472-8c62-4a71-9b17-697f9a0bbc65\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pppn7" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.705387 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4d1b134-55b3-4b2e-92da-b8c5416c13a5-service-ca-bundle\") pod \"router-default-5444994796-qjtlv\" (UID: 
\"c4d1b134-55b3-4b2e-92da-b8c5416c13a5\") " pod="openshift-ingress/router-default-5444994796-qjtlv" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.705411 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/105f1cdb-fae2-4fcb-8548-ae90bdcbb75f-etcd-service-ca\") pod \"etcd-operator-b45778765-5kgjd\" (UID: \"105f1cdb-fae2-4fcb-8548-ae90bdcbb75f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5kgjd" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.705431 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/f55e95a4-8b8f-4875-9908-33b7a94f58ea-certs\") pod \"machine-config-server-56xrz\" (UID: \"f55e95a4-8b8f-4875-9908-33b7a94f58ea\") " pod="openshift-machine-config-operator/machine-config-server-56xrz" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.705489 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2086abe6-48e1-4593-9789-b098b9b3142d-registry-tls\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.705550 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b901e1a9-281f-4928-b82b-4b7334b98f4d-tmpfs\") pod \"packageserver-d55dfcdfc-l7qh4\" (UID: \"b901e1a9-281f-4928-b82b-4b7334b98f4d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qh4" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.705569 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/105f1cdb-fae2-4fcb-8548-ae90bdcbb75f-serving-cert\") pod \"etcd-operator-b45778765-5kgjd\" (UID: \"105f1cdb-fae2-4fcb-8548-ae90bdcbb75f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5kgjd" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.705586 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/105f1cdb-fae2-4fcb-8548-ae90bdcbb75f-config\") pod \"etcd-operator-b45778765-5kgjd\" (UID: \"105f1cdb-fae2-4fcb-8548-ae90bdcbb75f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5kgjd" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.705657 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.705741 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b901e1a9-281f-4928-b82b-4b7334b98f4d-webhook-cert\") pod \"packageserver-d55dfcdfc-l7qh4\" (UID: \"b901e1a9-281f-4928-b82b-4b7334b98f4d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qh4" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.705771 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjbxp\" (UniqueName: \"kubernetes.io/projected/b901e1a9-281f-4928-b82b-4b7334b98f4d-kube-api-access-qjbxp\") pod \"packageserver-d55dfcdfc-l7qh4\" (UID: \"b901e1a9-281f-4928-b82b-4b7334b98f4d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qh4" Dec 03 14:15:40 crc 
kubenswrapper[4751]: I1203 14:15:40.705816 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5sjt\" (UniqueName: \"kubernetes.io/projected/f8fb24b9-e854-4b1a-8ee7-27528e3544d2-kube-api-access-n5sjt\") pod \"migrator-59844c95c7-75nhw\" (UID: \"f8fb24b9-e854-4b1a-8ee7-27528e3544d2\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-75nhw" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.705839 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2086abe6-48e1-4593-9789-b098b9b3142d-registry-certificates\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.705868 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/105f1cdb-fae2-4fcb-8548-ae90bdcbb75f-etcd-client\") pod \"etcd-operator-b45778765-5kgjd\" (UID: \"105f1cdb-fae2-4fcb-8548-ae90bdcbb75f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5kgjd" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.705918 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6w8z\" (UniqueName: \"kubernetes.io/projected/2086abe6-48e1-4593-9789-b098b9b3142d-kube-api-access-p6w8z\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.705939 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/2086abe6-48e1-4593-9789-b098b9b3142d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.705991 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e2ca3799-4ab3-40a6-8b00-2c6e97cc2c66-srv-cert\") pod \"catalog-operator-68c6474976-js4nc\" (UID: \"e2ca3799-4ab3-40a6-8b00-2c6e97cc2c66\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-js4nc" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.706015 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c4d1b134-55b3-4b2e-92da-b8c5416c13a5-default-certificate\") pod \"router-default-5444994796-qjtlv\" (UID: \"c4d1b134-55b3-4b2e-92da-b8c5416c13a5\") " pod="openshift-ingress/router-default-5444994796-qjtlv" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.706039 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e2ca3799-4ab3-40a6-8b00-2c6e97cc2c66-profile-collector-cert\") pod \"catalog-operator-68c6474976-js4nc\" (UID: \"e2ca3799-4ab3-40a6-8b00-2c6e97cc2c66\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-js4nc" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.706062 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ec0d089c-a47a-4ef7-b422-756a4cf8487a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-9ch7z\" (UID: \"ec0d089c-a47a-4ef7-b422-756a4cf8487a\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ch7z" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.706117 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c4d1b134-55b3-4b2e-92da-b8c5416c13a5-stats-auth\") pod \"router-default-5444994796-qjtlv\" (UID: \"c4d1b134-55b3-4b2e-92da-b8c5416c13a5\") " pod="openshift-ingress/router-default-5444994796-qjtlv" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.706159 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ec0d089c-a47a-4ef7-b422-756a4cf8487a-srv-cert\") pod \"olm-operator-6b444d44fb-9ch7z\" (UID: \"ec0d089c-a47a-4ef7-b422-756a4cf8487a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ch7z" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.706180 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj5bv\" (UniqueName: \"kubernetes.io/projected/e2ca3799-4ab3-40a6-8b00-2c6e97cc2c66-kube-api-access-nj5bv\") pod \"catalog-operator-68c6474976-js4nc\" (UID: \"e2ca3799-4ab3-40a6-8b00-2c6e97cc2c66\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-js4nc" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.706239 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c4d1b134-55b3-4b2e-92da-b8c5416c13a5-metrics-certs\") pod \"router-default-5444994796-qjtlv\" (UID: \"c4d1b134-55b3-4b2e-92da-b8c5416c13a5\") " pod="openshift-ingress/router-default-5444994796-qjtlv" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.706258 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/0fa4dfe3-601e-46c7-985c-af563252fd74-trusted-ca\") pod \"ingress-operator-5b745b69d9-4qthd\" (UID: \"0fa4dfe3-601e-46c7-985c-af563252fd74\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4qthd" Dec 03 14:15:40 crc kubenswrapper[4751]: E1203 14:15:40.734710 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:15:41.234693396 +0000 UTC m=+148.223048613 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2hknc" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.807886 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808043 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e907ddd6-b1b8-4026-a500-9e066868ead1-csi-data-dir\") pod \"csi-hostpathplugin-7clk4\" (UID: \"e907ddd6-b1b8-4026-a500-9e066868ead1\") " pod="hostpath-provisioner/csi-hostpathplugin-7clk4" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808070 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f4d710c-f625-4379-b5cf-cc5df715b6bd-config-volume\") pod \"dns-default-hvd8w\" (UID: \"6f4d710c-f625-4379-b5cf-cc5df715b6bd\") " pod="openshift-dns/dns-default-hvd8w" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808086 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2086abe6-48e1-4593-9789-b098b9b3142d-trusted-ca\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808103 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwj7q\" (UniqueName: \"kubernetes.io/projected/1955fd19-879b-4d2a-bf0a-f898d93835c5-kube-api-access-kwj7q\") pod \"service-ca-operator-777779d784-xz6ms\" (UID: \"1955fd19-879b-4d2a-bf0a-f898d93835c5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xz6ms" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808146 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ea973fa0-4ecd-4b81-b6be-4c3d5e3b0043-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-trvft\" (UID: \"ea973fa0-4ecd-4b81-b6be-4c3d5e3b0043\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-trvft" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808162 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66c310ae-0242-4438-9db3-f63ffd767976-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hdhpp\" (UID: \"66c310ae-0242-4438-9db3-f63ffd767976\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hdhpp" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808185 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0fa4dfe3-601e-46c7-985c-af563252fd74-metrics-tls\") pod \"ingress-operator-5b745b69d9-4qthd\" (UID: \"0fa4dfe3-601e-46c7-985c-af563252fd74\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4qthd" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808201 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvtgv\" (UniqueName: \"kubernetes.io/projected/ec0d089c-a47a-4ef7-b422-756a4cf8487a-kube-api-access-zvtgv\") pod \"olm-operator-6b444d44fb-9ch7z\" (UID: \"ec0d089c-a47a-4ef7-b422-756a4cf8487a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ch7z" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808215 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66c310ae-0242-4438-9db3-f63ffd767976-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hdhpp\" (UID: \"66c310ae-0242-4438-9db3-f63ffd767976\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hdhpp" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808232 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e907ddd6-b1b8-4026-a500-9e066868ead1-socket-dir\") pod \"csi-hostpathplugin-7clk4\" (UID: \"e907ddd6-b1b8-4026-a500-9e066868ead1\") " pod="hostpath-provisioner/csi-hostpathplugin-7clk4" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808248 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms9sg\" (UniqueName: 
\"kubernetes.io/projected/105f1cdb-fae2-4fcb-8548-ae90bdcbb75f-kube-api-access-ms9sg\") pod \"etcd-operator-b45778765-5kgjd\" (UID: \"105f1cdb-fae2-4fcb-8548-ae90bdcbb75f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5kgjd" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808269 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5m6k\" (UniqueName: \"kubernetes.io/projected/eec50b53-45d8-4fe1-b490-39be12772940-kube-api-access-l5m6k\") pod \"service-ca-9c57cc56f-4ggm5\" (UID: \"eec50b53-45d8-4fe1-b490-39be12772940\") " pod="openshift-service-ca/service-ca-9c57cc56f-4ggm5" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808285 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4crv\" (UniqueName: \"kubernetes.io/projected/6f4d710c-f625-4379-b5cf-cc5df715b6bd-kube-api-access-k4crv\") pod \"dns-default-hvd8w\" (UID: \"6f4d710c-f625-4379-b5cf-cc5df715b6bd\") " pod="openshift-dns/dns-default-hvd8w" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808299 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1955fd19-879b-4d2a-bf0a-f898d93835c5-serving-cert\") pod \"service-ca-operator-777779d784-xz6ms\" (UID: \"1955fd19-879b-4d2a-bf0a-f898d93835c5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xz6ms" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808320 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f4d710c-f625-4379-b5cf-cc5df715b6bd-metrics-tls\") pod \"dns-default-hvd8w\" (UID: \"6f4d710c-f625-4379-b5cf-cc5df715b6bd\") " pod="openshift-dns/dns-default-hvd8w" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808358 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-wlg46\" (UniqueName: \"kubernetes.io/projected/c4d1b134-55b3-4b2e-92da-b8c5416c13a5-kube-api-access-wlg46\") pod \"router-default-5444994796-qjtlv\" (UID: \"c4d1b134-55b3-4b2e-92da-b8c5416c13a5\") " pod="openshift-ingress/router-default-5444994796-qjtlv" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808373 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a47fb721-c3c2-4bc9-8e26-9e99eee599d1-cert\") pod \"ingress-canary-7nphl\" (UID: \"a47fb721-c3c2-4bc9-8e26-9e99eee599d1\") " pod="openshift-ingress-canary/ingress-canary-7nphl" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808390 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2086abe6-48e1-4593-9789-b098b9b3142d-bound-sa-token\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808404 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e907ddd6-b1b8-4026-a500-9e066868ead1-mountpoint-dir\") pod \"csi-hostpathplugin-7clk4\" (UID: \"e907ddd6-b1b8-4026-a500-9e066868ead1\") " pod="hostpath-provisioner/csi-hostpathplugin-7clk4" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808419 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e907ddd6-b1b8-4026-a500-9e066868ead1-registration-dir\") pod \"csi-hostpathplugin-7clk4\" (UID: \"e907ddd6-b1b8-4026-a500-9e066868ead1\") " pod="hostpath-provisioner/csi-hostpathplugin-7clk4" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808452 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2086abe6-48e1-4593-9789-b098b9b3142d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808469 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdxmv\" (UniqueName: \"kubernetes.io/projected/89c33472-8c62-4a71-9b17-697f9a0bbc65-kube-api-access-rdxmv\") pod \"control-plane-machine-set-operator-78cbb6b69f-pppn7\" (UID: \"89c33472-8c62-4a71-9b17-697f9a0bbc65\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pppn7" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808485 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f799c142-06b3-4b7e-ba2a-4f11adbb175e-proxy-tls\") pod \"machine-config-controller-84d6567774-z7j5x\" (UID: \"f799c142-06b3-4b7e-ba2a-4f11adbb175e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7j5x" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808512 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/89c33472-8c62-4a71-9b17-697f9a0bbc65-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-pppn7\" (UID: \"89c33472-8c62-4a71-9b17-697f9a0bbc65\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pppn7" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808531 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/105f1cdb-fae2-4fcb-8548-ae90bdcbb75f-etcd-ca\") pod 
\"etcd-operator-b45778765-5kgjd\" (UID: \"105f1cdb-fae2-4fcb-8548-ae90bdcbb75f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5kgjd" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808546 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4d1b134-55b3-4b2e-92da-b8c5416c13a5-service-ca-bundle\") pod \"router-default-5444994796-qjtlv\" (UID: \"c4d1b134-55b3-4b2e-92da-b8c5416c13a5\") " pod="openshift-ingress/router-default-5444994796-qjtlv" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808561 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/105f1cdb-fae2-4fcb-8548-ae90bdcbb75f-etcd-service-ca\") pod \"etcd-operator-b45778765-5kgjd\" (UID: \"105f1cdb-fae2-4fcb-8548-ae90bdcbb75f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5kgjd" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808576 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e78e3f2-fa8c-4c8d-8dce-f087e67acf02-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4c725\" (UID: \"1e78e3f2-fa8c-4c8d-8dce-f087e67acf02\") " pod="openshift-marketplace/marketplace-operator-79b997595-4c725" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808593 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/f55e95a4-8b8f-4875-9908-33b7a94f58ea-certs\") pod \"machine-config-server-56xrz\" (UID: \"f55e95a4-8b8f-4875-9908-33b7a94f58ea\") " pod="openshift-machine-config-operator/machine-config-server-56xrz" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808619 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e78e3f2-fa8c-4c8d-8dce-f087e67acf02-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4c725\" (UID: \"1e78e3f2-fa8c-4c8d-8dce-f087e67acf02\") " pod="openshift-marketplace/marketplace-operator-79b997595-4c725" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808640 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q787\" (UniqueName: \"kubernetes.io/projected/b9e18ab9-2082-481b-9e18-5da0f81303bf-kube-api-access-8q787\") pod \"multus-admission-controller-857f4d67dd-2jgqz\" (UID: \"b9e18ab9-2082-481b-9e18-5da0f81303bf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2jgqz" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808659 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1955fd19-879b-4d2a-bf0a-f898d93835c5-config\") pod \"service-ca-operator-777779d784-xz6ms\" (UID: \"1955fd19-879b-4d2a-bf0a-f898d93835c5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xz6ms" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808677 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e907ddd6-b1b8-4026-a500-9e066868ead1-plugins-dir\") pod \"csi-hostpathplugin-7clk4\" (UID: \"e907ddd6-b1b8-4026-a500-9e066868ead1\") " pod="hostpath-provisioner/csi-hostpathplugin-7clk4" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808699 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/eec50b53-45d8-4fe1-b490-39be12772940-signing-key\") pod \"service-ca-9c57cc56f-4ggm5\" (UID: \"eec50b53-45d8-4fe1-b490-39be12772940\") " pod="openshift-service-ca/service-ca-9c57cc56f-4ggm5" Dec 03 
14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808737 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2086abe6-48e1-4593-9789-b098b9b3142d-registry-tls\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808762 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b901e1a9-281f-4928-b82b-4b7334b98f4d-tmpfs\") pod \"packageserver-d55dfcdfc-l7qh4\" (UID: \"b901e1a9-281f-4928-b82b-4b7334b98f4d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qh4" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808777 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/105f1cdb-fae2-4fcb-8548-ae90bdcbb75f-serving-cert\") pod \"etcd-operator-b45778765-5kgjd\" (UID: \"105f1cdb-fae2-4fcb-8548-ae90bdcbb75f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5kgjd" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808791 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/105f1cdb-fae2-4fcb-8548-ae90bdcbb75f-config\") pod \"etcd-operator-b45778765-5kgjd\" (UID: \"105f1cdb-fae2-4fcb-8548-ae90bdcbb75f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5kgjd" Dec 03 14:15:40 crc kubenswrapper[4751]: E1203 14:15:40.808846 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:15:41.308819492 +0000 UTC m=+148.297174709 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808942 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f799c142-06b3-4b7e-ba2a-4f11adbb175e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-z7j5x\" (UID: \"f799c142-06b3-4b7e-ba2a-4f11adbb175e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7j5x" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.808971 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/933e4b72-77fa-463e-9828-27d6fa9e0420-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-b2wgh\" (UID: \"933e4b72-77fa-463e-9828-27d6fa9e0420\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2wgh" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.809007 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.809037 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29tnz\" (UniqueName: \"kubernetes.io/projected/933e4b72-77fa-463e-9828-27d6fa9e0420-kube-api-access-29tnz\") pod \"package-server-manager-789f6589d5-b2wgh\" (UID: \"933e4b72-77fa-463e-9828-27d6fa9e0420\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2wgh" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.809067 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6zrt\" (UniqueName: \"kubernetes.io/projected/e907ddd6-b1b8-4026-a500-9e066868ead1-kube-api-access-l6zrt\") pod \"csi-hostpathplugin-7clk4\" (UID: \"e907ddd6-b1b8-4026-a500-9e066868ead1\") " pod="hostpath-provisioner/csi-hostpathplugin-7clk4" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.809090 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eb81b718-8bc4-4c3e-9ec6-472c62d377a2-secret-volume\") pod \"collect-profiles-29412855-vqk8g\" (UID: \"eb81b718-8bc4-4c3e-9ec6-472c62d377a2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-vqk8g" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.809171 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b901e1a9-281f-4928-b82b-4b7334b98f4d-webhook-cert\") pod \"packageserver-d55dfcdfc-l7qh4\" (UID: \"b901e1a9-281f-4928-b82b-4b7334b98f4d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qh4" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.809193 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb81b718-8bc4-4c3e-9ec6-472c62d377a2-config-volume\") pod \"collect-profiles-29412855-vqk8g\" (UID: 
\"eb81b718-8bc4-4c3e-9ec6-472c62d377a2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-vqk8g" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.809245 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjbxp\" (UniqueName: \"kubernetes.io/projected/b901e1a9-281f-4928-b82b-4b7334b98f4d-kube-api-access-qjbxp\") pod \"packageserver-d55dfcdfc-l7qh4\" (UID: \"b901e1a9-281f-4928-b82b-4b7334b98f4d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qh4" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.809266 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/eec50b53-45d8-4fe1-b490-39be12772940-signing-cabundle\") pod \"service-ca-9c57cc56f-4ggm5\" (UID: \"eec50b53-45d8-4fe1-b490-39be12772940\") " pod="openshift-service-ca/service-ca-9c57cc56f-4ggm5" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.809291 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2086abe6-48e1-4593-9789-b098b9b3142d-registry-certificates\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.809314 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/105f1cdb-fae2-4fcb-8548-ae90bdcbb75f-etcd-client\") pod \"etcd-operator-b45778765-5kgjd\" (UID: \"105f1cdb-fae2-4fcb-8548-ae90bdcbb75f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5kgjd" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.809351 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5sjt\" (UniqueName: 
\"kubernetes.io/projected/f8fb24b9-e854-4b1a-8ee7-27528e3544d2-kube-api-access-n5sjt\") pod \"migrator-59844c95c7-75nhw\" (UID: \"f8fb24b9-e854-4b1a-8ee7-27528e3544d2\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-75nhw" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.809390 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6w8z\" (UniqueName: \"kubernetes.io/projected/2086abe6-48e1-4593-9789-b098b9b3142d-kube-api-access-p6w8z\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.809415 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf8d7\" (UniqueName: \"kubernetes.io/projected/1e78e3f2-fa8c-4c8d-8dce-f087e67acf02-kube-api-access-rf8d7\") pod \"marketplace-operator-79b997595-4c725\" (UID: \"1e78e3f2-fa8c-4c8d-8dce-f087e67acf02\") " pod="openshift-marketplace/marketplace-operator-79b997595-4c725" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.809438 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b9e18ab9-2082-481b-9e18-5da0f81303bf-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2jgqz\" (UID: \"b9e18ab9-2082-481b-9e18-5da0f81303bf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2jgqz" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.809458 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/105f1cdb-fae2-4fcb-8548-ae90bdcbb75f-config\") pod \"etcd-operator-b45778765-5kgjd\" (UID: \"105f1cdb-fae2-4fcb-8548-ae90bdcbb75f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5kgjd" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 
14:15:40.809460 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2086abe6-48e1-4593-9789-b098b9b3142d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.809523 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e2ca3799-4ab3-40a6-8b00-2c6e97cc2c66-srv-cert\") pod \"catalog-operator-68c6474976-js4nc\" (UID: \"e2ca3799-4ab3-40a6-8b00-2c6e97cc2c66\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-js4nc" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.809555 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c4d1b134-55b3-4b2e-92da-b8c5416c13a5-default-certificate\") pod \"router-default-5444994796-qjtlv\" (UID: \"c4d1b134-55b3-4b2e-92da-b8c5416c13a5\") " pod="openshift-ingress/router-default-5444994796-qjtlv" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.809597 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e2ca3799-4ab3-40a6-8b00-2c6e97cc2c66-profile-collector-cert\") pod \"catalog-operator-68c6474976-js4nc\" (UID: \"e2ca3799-4ab3-40a6-8b00-2c6e97cc2c66\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-js4nc" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.809618 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ec0d089c-a47a-4ef7-b422-756a4cf8487a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-9ch7z\" (UID: \"ec0d089c-a47a-4ef7-b422-756a4cf8487a\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ch7z" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.809636 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66c310ae-0242-4438-9db3-f63ffd767976-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hdhpp\" (UID: \"66c310ae-0242-4438-9db3-f63ffd767976\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hdhpp" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.809654 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c4d1b134-55b3-4b2e-92da-b8c5416c13a5-stats-auth\") pod \"router-default-5444994796-qjtlv\" (UID: \"c4d1b134-55b3-4b2e-92da-b8c5416c13a5\") " pod="openshift-ingress/router-default-5444994796-qjtlv" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.809714 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ec0d089c-a47a-4ef7-b422-756a4cf8487a-srv-cert\") pod \"olm-operator-6b444d44fb-9ch7z\" (UID: \"ec0d089c-a47a-4ef7-b422-756a4cf8487a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ch7z" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.809756 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj5bv\" (UniqueName: \"kubernetes.io/projected/e2ca3799-4ab3-40a6-8b00-2c6e97cc2c66-kube-api-access-nj5bv\") pod \"catalog-operator-68c6474976-js4nc\" (UID: \"e2ca3799-4ab3-40a6-8b00-2c6e97cc2c66\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-js4nc" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.809787 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/c4d1b134-55b3-4b2e-92da-b8c5416c13a5-metrics-certs\") pod \"router-default-5444994796-qjtlv\" (UID: \"c4d1b134-55b3-4b2e-92da-b8c5416c13a5\") " pod="openshift-ingress/router-default-5444994796-qjtlv" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.809801 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0fa4dfe3-601e-46c7-985c-af563252fd74-trusted-ca\") pod \"ingress-operator-5b745b69d9-4qthd\" (UID: \"0fa4dfe3-601e-46c7-985c-af563252fd74\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4qthd" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.809819 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpslh\" (UniqueName: \"kubernetes.io/projected/a47fb721-c3c2-4bc9-8e26-9e99eee599d1-kube-api-access-zpslh\") pod \"ingress-canary-7nphl\" (UID: \"a47fb721-c3c2-4bc9-8e26-9e99eee599d1\") " pod="openshift-ingress-canary/ingress-canary-7nphl" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.809837 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b7q8\" (UniqueName: \"kubernetes.io/projected/f55e95a4-8b8f-4875-9908-33b7a94f58ea-kube-api-access-9b7q8\") pod \"machine-config-server-56xrz\" (UID: \"f55e95a4-8b8f-4875-9908-33b7a94f58ea\") " pod="openshift-machine-config-operator/machine-config-server-56xrz" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.809885 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea973fa0-4ecd-4b81-b6be-4c3d5e3b0043-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-trvft\" (UID: \"ea973fa0-4ecd-4b81-b6be-4c3d5e3b0043\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-trvft" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.809915 4751 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea973fa0-4ecd-4b81-b6be-4c3d5e3b0043-config\") pod \"kube-controller-manager-operator-78b949d7b-trvft\" (UID: \"ea973fa0-4ecd-4b81-b6be-4c3d5e3b0043\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-trvft" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.809931 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b901e1a9-281f-4928-b82b-4b7334b98f4d-apiservice-cert\") pod \"packageserver-d55dfcdfc-l7qh4\" (UID: \"b901e1a9-281f-4928-b82b-4b7334b98f4d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qh4" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.809955 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/f55e95a4-8b8f-4875-9908-33b7a94f58ea-node-bootstrap-token\") pod \"machine-config-server-56xrz\" (UID: \"f55e95a4-8b8f-4875-9908-33b7a94f58ea\") " pod="openshift-machine-config-operator/machine-config-server-56xrz" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.809972 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0fa4dfe3-601e-46c7-985c-af563252fd74-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4qthd\" (UID: \"0fa4dfe3-601e-46c7-985c-af563252fd74\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4qthd" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.809988 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2jzk\" (UniqueName: \"kubernetes.io/projected/0fa4dfe3-601e-46c7-985c-af563252fd74-kube-api-access-v2jzk\") pod \"ingress-operator-5b745b69d9-4qthd\" (UID: \"0fa4dfe3-601e-46c7-985c-af563252fd74\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4qthd" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.810004 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl4fx\" (UniqueName: \"kubernetes.io/projected/eb81b718-8bc4-4c3e-9ec6-472c62d377a2-kube-api-access-jl4fx\") pod \"collect-profiles-29412855-vqk8g\" (UID: \"eb81b718-8bc4-4c3e-9ec6-472c62d377a2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-vqk8g" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.810020 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvtzv\" (UniqueName: \"kubernetes.io/projected/f799c142-06b3-4b7e-ba2a-4f11adbb175e-kube-api-access-nvtzv\") pod \"machine-config-controller-84d6567774-z7j5x\" (UID: \"f799c142-06b3-4b7e-ba2a-4f11adbb175e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7j5x" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.810071 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f4d710c-f625-4379-b5cf-cc5df715b6bd-config-volume\") pod \"dns-default-hvd8w\" (UID: \"6f4d710c-f625-4379-b5cf-cc5df715b6bd\") " pod="openshift-dns/dns-default-hvd8w" Dec 03 14:15:40 crc kubenswrapper[4751]: E1203 14:15:40.811515 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:15:41.311501273 +0000 UTC m=+148.299856550 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2hknc" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.818776 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/105f1cdb-fae2-4fcb-8548-ae90bdcbb75f-etcd-service-ca\") pod \"etcd-operator-b45778765-5kgjd\" (UID: \"105f1cdb-fae2-4fcb-8548-ae90bdcbb75f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5kgjd" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.819474 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2086abe6-48e1-4593-9789-b098b9b3142d-trusted-ca\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.819663 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2086abe6-48e1-4593-9789-b098b9b3142d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.820460 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2086abe6-48e1-4593-9789-b098b9b3142d-registry-certificates\") pod \"image-registry-697d97f7c8-2hknc\" (UID: 
\"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.823216 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea973fa0-4ecd-4b81-b6be-4c3d5e3b0043-config\") pod \"kube-controller-manager-operator-78b949d7b-trvft\" (UID: \"ea973fa0-4ecd-4b81-b6be-4c3d5e3b0043\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-trvft" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.824595 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/105f1cdb-fae2-4fcb-8548-ae90bdcbb75f-etcd-client\") pod \"etcd-operator-b45778765-5kgjd\" (UID: \"105f1cdb-fae2-4fcb-8548-ae90bdcbb75f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5kgjd" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.824948 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b901e1a9-281f-4928-b82b-4b7334b98f4d-tmpfs\") pod \"packageserver-d55dfcdfc-l7qh4\" (UID: \"b901e1a9-281f-4928-b82b-4b7334b98f4d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qh4" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.825465 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2086abe6-48e1-4593-9789-b098b9b3142d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.828583 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/105f1cdb-fae2-4fcb-8548-ae90bdcbb75f-etcd-ca\") pod 
\"etcd-operator-b45778765-5kgjd\" (UID: \"105f1cdb-fae2-4fcb-8548-ae90bdcbb75f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5kgjd" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.829254 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4d1b134-55b3-4b2e-92da-b8c5416c13a5-service-ca-bundle\") pod \"router-default-5444994796-qjtlv\" (UID: \"c4d1b134-55b3-4b2e-92da-b8c5416c13a5\") " pod="openshift-ingress/router-default-5444994796-qjtlv" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.830035 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e2ca3799-4ab3-40a6-8b00-2c6e97cc2c66-profile-collector-cert\") pod \"catalog-operator-68c6474976-js4nc\" (UID: \"e2ca3799-4ab3-40a6-8b00-2c6e97cc2c66\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-js4nc" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.847235 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0fa4dfe3-601e-46c7-985c-af563252fd74-metrics-tls\") pod \"ingress-operator-5b745b69d9-4qthd\" (UID: \"0fa4dfe3-601e-46c7-985c-af563252fd74\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4qthd" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.853533 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b901e1a9-281f-4928-b82b-4b7334b98f4d-webhook-cert\") pod \"packageserver-d55dfcdfc-l7qh4\" (UID: \"b901e1a9-281f-4928-b82b-4b7334b98f4d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qh4" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.853979 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/ec0d089c-a47a-4ef7-b422-756a4cf8487a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-9ch7z\" (UID: \"ec0d089c-a47a-4ef7-b422-756a4cf8487a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ch7z" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.855802 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/f55e95a4-8b8f-4875-9908-33b7a94f58ea-certs\") pod \"machine-config-server-56xrz\" (UID: \"f55e95a4-8b8f-4875-9908-33b7a94f58ea\") " pod="openshift-machine-config-operator/machine-config-server-56xrz" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.856167 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e2ca3799-4ab3-40a6-8b00-2c6e97cc2c66-srv-cert\") pod \"catalog-operator-68c6474976-js4nc\" (UID: \"e2ca3799-4ab3-40a6-8b00-2c6e97cc2c66\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-js4nc" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.857677 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2086abe6-48e1-4593-9789-b098b9b3142d-registry-tls\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.858745 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5dmt6"] Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.858779 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/89c33472-8c62-4a71-9b17-697f9a0bbc65-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-pppn7\" (UID: 
\"89c33472-8c62-4a71-9b17-697f9a0bbc65\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pppn7" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.860628 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0fa4dfe3-601e-46c7-985c-af563252fd74-trusted-ca\") pod \"ingress-operator-5b745b69d9-4qthd\" (UID: \"0fa4dfe3-601e-46c7-985c-af563252fd74\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4qthd" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.862133 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/105f1cdb-fae2-4fcb-8548-ae90bdcbb75f-serving-cert\") pod \"etcd-operator-b45778765-5kgjd\" (UID: \"105f1cdb-fae2-4fcb-8548-ae90bdcbb75f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5kgjd" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.863599 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ec0d089c-a47a-4ef7-b422-756a4cf8487a-srv-cert\") pod \"olm-operator-6b444d44fb-9ch7z\" (UID: \"ec0d089c-a47a-4ef7-b422-756a4cf8487a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ch7z" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.866723 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b901e1a9-281f-4928-b82b-4b7334b98f4d-apiservice-cert\") pod \"packageserver-d55dfcdfc-l7qh4\" (UID: \"b901e1a9-281f-4928-b82b-4b7334b98f4d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qh4" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.869670 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f4d710c-f625-4379-b5cf-cc5df715b6bd-metrics-tls\") pod \"dns-default-hvd8w\" (UID: 
\"6f4d710c-f625-4379-b5cf-cc5df715b6bd\") " pod="openshift-dns/dns-default-hvd8w" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.870639 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/f55e95a4-8b8f-4875-9908-33b7a94f58ea-node-bootstrap-token\") pod \"machine-config-server-56xrz\" (UID: \"f55e95a4-8b8f-4875-9908-33b7a94f58ea\") " pod="openshift-machine-config-operator/machine-config-server-56xrz" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.874118 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea973fa0-4ecd-4b81-b6be-4c3d5e3b0043-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-trvft\" (UID: \"ea973fa0-4ecd-4b81-b6be-4c3d5e3b0043\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-trvft" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.874185 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4crv\" (UniqueName: \"kubernetes.io/projected/6f4d710c-f625-4379-b5cf-cc5df715b6bd-kube-api-access-k4crv\") pod \"dns-default-hvd8w\" (UID: \"6f4d710c-f625-4379-b5cf-cc5df715b6bd\") " pod="openshift-dns/dns-default-hvd8w" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.874507 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c4d1b134-55b3-4b2e-92da-b8c5416c13a5-metrics-certs\") pod \"router-default-5444994796-qjtlv\" (UID: \"c4d1b134-55b3-4b2e-92da-b8c5416c13a5\") " pod="openshift-ingress/router-default-5444994796-qjtlv" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.878737 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c4d1b134-55b3-4b2e-92da-b8c5416c13a5-default-certificate\") pod 
\"router-default-5444994796-qjtlv\" (UID: \"c4d1b134-55b3-4b2e-92da-b8c5416c13a5\") " pod="openshift-ingress/router-default-5444994796-qjtlv" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.879128 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c4d1b134-55b3-4b2e-92da-b8c5416c13a5-stats-auth\") pod \"router-default-5444994796-qjtlv\" (UID: \"c4d1b134-55b3-4b2e-92da-b8c5416c13a5\") " pod="openshift-ingress/router-default-5444994796-qjtlv" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.898098 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ea973fa0-4ecd-4b81-b6be-4c3d5e3b0043-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-trvft\" (UID: \"ea973fa0-4ecd-4b81-b6be-4c3d5e3b0043\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-trvft" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.898227 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms9sg\" (UniqueName: \"kubernetes.io/projected/105f1cdb-fae2-4fcb-8548-ae90bdcbb75f-kube-api-access-ms9sg\") pod \"etcd-operator-b45778765-5kgjd\" (UID: \"105f1cdb-fae2-4fcb-8548-ae90bdcbb75f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5kgjd" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.900147 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6w8z\" (UniqueName: \"kubernetes.io/projected/2086abe6-48e1-4593-9789-b098b9b3142d-kube-api-access-p6w8z\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.907503 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-5kgjd" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.910400 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.910610 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6jx7f"] Dec 03 14:15:40 crc kubenswrapper[4751]: E1203 14:15:40.910849 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:15:41.410827509 +0000 UTC m=+148.399182726 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.910921 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f799c142-06b3-4b7e-ba2a-4f11adbb175e-proxy-tls\") pod \"machine-config-controller-84d6567774-z7j5x\" (UID: \"f799c142-06b3-4b7e-ba2a-4f11adbb175e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7j5x" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.910967 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e78e3f2-fa8c-4c8d-8dce-f087e67acf02-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4c725\" (UID: \"1e78e3f2-fa8c-4c8d-8dce-f087e67acf02\") " pod="openshift-marketplace/marketplace-operator-79b997595-4c725" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.910991 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1955fd19-879b-4d2a-bf0a-f898d93835c5-config\") pod \"service-ca-operator-777779d784-xz6ms\" (UID: \"1955fd19-879b-4d2a-bf0a-f898d93835c5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xz6ms" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.911012 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e907ddd6-b1b8-4026-a500-9e066868ead1-plugins-dir\") pod 
\"csi-hostpathplugin-7clk4\" (UID: \"e907ddd6-b1b8-4026-a500-9e066868ead1\") " pod="hostpath-provisioner/csi-hostpathplugin-7clk4" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.911032 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/eec50b53-45d8-4fe1-b490-39be12772940-signing-key\") pod \"service-ca-9c57cc56f-4ggm5\" (UID: \"eec50b53-45d8-4fe1-b490-39be12772940\") " pod="openshift-service-ca/service-ca-9c57cc56f-4ggm5" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.911066 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e78e3f2-fa8c-4c8d-8dce-f087e67acf02-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4c725\" (UID: \"1e78e3f2-fa8c-4c8d-8dce-f087e67acf02\") " pod="openshift-marketplace/marketplace-operator-79b997595-4c725" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.911088 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q787\" (UniqueName: \"kubernetes.io/projected/b9e18ab9-2082-481b-9e18-5da0f81303bf-kube-api-access-8q787\") pod \"multus-admission-controller-857f4d67dd-2jgqz\" (UID: \"b9e18ab9-2082-481b-9e18-5da0f81303bf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2jgqz" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.911135 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f799c142-06b3-4b7e-ba2a-4f11adbb175e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-z7j5x\" (UID: \"f799c142-06b3-4b7e-ba2a-4f11adbb175e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7j5x" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.911156 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/933e4b72-77fa-463e-9828-27d6fa9e0420-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-b2wgh\" (UID: \"933e4b72-77fa-463e-9828-27d6fa9e0420\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2wgh" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.911181 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.911202 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29tnz\" (UniqueName: \"kubernetes.io/projected/933e4b72-77fa-463e-9828-27d6fa9e0420-kube-api-access-29tnz\") pod \"package-server-manager-789f6589d5-b2wgh\" (UID: \"933e4b72-77fa-463e-9828-27d6fa9e0420\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2wgh" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.911230 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6zrt\" (UniqueName: \"kubernetes.io/projected/e907ddd6-b1b8-4026-a500-9e066868ead1-kube-api-access-l6zrt\") pod \"csi-hostpathplugin-7clk4\" (UID: \"e907ddd6-b1b8-4026-a500-9e066868ead1\") " pod="hostpath-provisioner/csi-hostpathplugin-7clk4" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.911254 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eb81b718-8bc4-4c3e-9ec6-472c62d377a2-secret-volume\") pod \"collect-profiles-29412855-vqk8g\" (UID: \"eb81b718-8bc4-4c3e-9ec6-472c62d377a2\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-vqk8g" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.911286 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb81b718-8bc4-4c3e-9ec6-472c62d377a2-config-volume\") pod \"collect-profiles-29412855-vqk8g\" (UID: \"eb81b718-8bc4-4c3e-9ec6-472c62d377a2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-vqk8g" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.911348 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/eec50b53-45d8-4fe1-b490-39be12772940-signing-cabundle\") pod \"service-ca-9c57cc56f-4ggm5\" (UID: \"eec50b53-45d8-4fe1-b490-39be12772940\") " pod="openshift-service-ca/service-ca-9c57cc56f-4ggm5" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.911420 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf8d7\" (UniqueName: \"kubernetes.io/projected/1e78e3f2-fa8c-4c8d-8dce-f087e67acf02-kube-api-access-rf8d7\") pod \"marketplace-operator-79b997595-4c725\" (UID: \"1e78e3f2-fa8c-4c8d-8dce-f087e67acf02\") " pod="openshift-marketplace/marketplace-operator-79b997595-4c725" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.911444 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b9e18ab9-2082-481b-9e18-5da0f81303bf-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2jgqz\" (UID: \"b9e18ab9-2082-481b-9e18-5da0f81303bf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2jgqz" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.911482 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66c310ae-0242-4438-9db3-f63ffd767976-serving-cert\") 
pod \"openshift-kube-scheduler-operator-5fdd9b5758-hdhpp\" (UID: \"66c310ae-0242-4438-9db3-f63ffd767976\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hdhpp" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.911529 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpslh\" (UniqueName: \"kubernetes.io/projected/a47fb721-c3c2-4bc9-8e26-9e99eee599d1-kube-api-access-zpslh\") pod \"ingress-canary-7nphl\" (UID: \"a47fb721-c3c2-4bc9-8e26-9e99eee599d1\") " pod="openshift-ingress-canary/ingress-canary-7nphl" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.911573 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl4fx\" (UniqueName: \"kubernetes.io/projected/eb81b718-8bc4-4c3e-9ec6-472c62d377a2-kube-api-access-jl4fx\") pod \"collect-profiles-29412855-vqk8g\" (UID: \"eb81b718-8bc4-4c3e-9ec6-472c62d377a2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-vqk8g" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.911599 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvtzv\" (UniqueName: \"kubernetes.io/projected/f799c142-06b3-4b7e-ba2a-4f11adbb175e-kube-api-access-nvtzv\") pod \"machine-config-controller-84d6567774-z7j5x\" (UID: \"f799c142-06b3-4b7e-ba2a-4f11adbb175e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7j5x" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.911628 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e907ddd6-b1b8-4026-a500-9e066868ead1-csi-data-dir\") pod \"csi-hostpathplugin-7clk4\" (UID: \"e907ddd6-b1b8-4026-a500-9e066868ead1\") " pod="hostpath-provisioner/csi-hostpathplugin-7clk4" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.911652 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kwj7q\" (UniqueName: \"kubernetes.io/projected/1955fd19-879b-4d2a-bf0a-f898d93835c5-kube-api-access-kwj7q\") pod \"service-ca-operator-777779d784-xz6ms\" (UID: \"1955fd19-879b-4d2a-bf0a-f898d93835c5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xz6ms" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.911678 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66c310ae-0242-4438-9db3-f63ffd767976-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hdhpp\" (UID: \"66c310ae-0242-4438-9db3-f63ffd767976\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hdhpp" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.911716 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66c310ae-0242-4438-9db3-f63ffd767976-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hdhpp\" (UID: \"66c310ae-0242-4438-9db3-f63ffd767976\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hdhpp" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.911740 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e907ddd6-b1b8-4026-a500-9e066868ead1-socket-dir\") pod \"csi-hostpathplugin-7clk4\" (UID: \"e907ddd6-b1b8-4026-a500-9e066868ead1\") " pod="hostpath-provisioner/csi-hostpathplugin-7clk4" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.911771 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5m6k\" (UniqueName: \"kubernetes.io/projected/eec50b53-45d8-4fe1-b490-39be12772940-kube-api-access-l5m6k\") pod \"service-ca-9c57cc56f-4ggm5\" (UID: \"eec50b53-45d8-4fe1-b490-39be12772940\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-4ggm5" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.911802 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1955fd19-879b-4d2a-bf0a-f898d93835c5-serving-cert\") pod \"service-ca-operator-777779d784-xz6ms\" (UID: \"1955fd19-879b-4d2a-bf0a-f898d93835c5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xz6ms" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.911845 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a47fb721-c3c2-4bc9-8e26-9e99eee599d1-cert\") pod \"ingress-canary-7nphl\" (UID: \"a47fb721-c3c2-4bc9-8e26-9e99eee599d1\") " pod="openshift-ingress-canary/ingress-canary-7nphl" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.911887 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e907ddd6-b1b8-4026-a500-9e066868ead1-mountpoint-dir\") pod \"csi-hostpathplugin-7clk4\" (UID: \"e907ddd6-b1b8-4026-a500-9e066868ead1\") " pod="hostpath-provisioner/csi-hostpathplugin-7clk4" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.911909 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e907ddd6-b1b8-4026-a500-9e066868ead1-registration-dir\") pod \"csi-hostpathplugin-7clk4\" (UID: \"e907ddd6-b1b8-4026-a500-9e066868ead1\") " pod="hostpath-provisioner/csi-hostpathplugin-7clk4" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.912144 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e907ddd6-b1b8-4026-a500-9e066868ead1-plugins-dir\") pod \"csi-hostpathplugin-7clk4\" (UID: \"e907ddd6-b1b8-4026-a500-9e066868ead1\") " 
pod="hostpath-provisioner/csi-hostpathplugin-7clk4" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.912177 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f799c142-06b3-4b7e-ba2a-4f11adbb175e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-z7j5x\" (UID: \"f799c142-06b3-4b7e-ba2a-4f11adbb175e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7j5x" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.912183 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e907ddd6-b1b8-4026-a500-9e066868ead1-registration-dir\") pod \"csi-hostpathplugin-7clk4\" (UID: \"e907ddd6-b1b8-4026-a500-9e066868ead1\") " pod="hostpath-provisioner/csi-hostpathplugin-7clk4" Dec 03 14:15:40 crc kubenswrapper[4751]: E1203 14:15:40.915993 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:15:41.415975955 +0000 UTC m=+148.404331172 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2hknc" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.916201 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e907ddd6-b1b8-4026-a500-9e066868ead1-csi-data-dir\") pod \"csi-hostpathplugin-7clk4\" (UID: \"e907ddd6-b1b8-4026-a500-9e066868ead1\") " pod="hostpath-provisioner/csi-hostpathplugin-7clk4" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.916223 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e907ddd6-b1b8-4026-a500-9e066868ead1-socket-dir\") pod \"csi-hostpathplugin-7clk4\" (UID: \"e907ddd6-b1b8-4026-a500-9e066868ead1\") " pod="hostpath-provisioner/csi-hostpathplugin-7clk4" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.916946 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66c310ae-0242-4438-9db3-f63ffd767976-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hdhpp\" (UID: \"66c310ae-0242-4438-9db3-f63ffd767976\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hdhpp" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.917160 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e907ddd6-b1b8-4026-a500-9e066868ead1-mountpoint-dir\") pod \"csi-hostpathplugin-7clk4\" (UID: \"e907ddd6-b1b8-4026-a500-9e066868ead1\") " 
pod="hostpath-provisioner/csi-hostpathplugin-7clk4" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.917689 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/eec50b53-45d8-4fe1-b490-39be12772940-signing-key\") pod \"service-ca-9c57cc56f-4ggm5\" (UID: \"eec50b53-45d8-4fe1-b490-39be12772940\") " pod="openshift-service-ca/service-ca-9c57cc56f-4ggm5" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.919403 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/eec50b53-45d8-4fe1-b490-39be12772940-signing-cabundle\") pod \"service-ca-9c57cc56f-4ggm5\" (UID: \"eec50b53-45d8-4fe1-b490-39be12772940\") " pod="openshift-service-ca/service-ca-9c57cc56f-4ggm5" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.920087 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb81b718-8bc4-4c3e-9ec6-472c62d377a2-config-volume\") pod \"collect-profiles-29412855-vqk8g\" (UID: \"eb81b718-8bc4-4c3e-9ec6-472c62d377a2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-vqk8g" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.921097 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1955fd19-879b-4d2a-bf0a-f898d93835c5-config\") pod \"service-ca-operator-777779d784-xz6ms\" (UID: \"1955fd19-879b-4d2a-bf0a-f898d93835c5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xz6ms" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.923556 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eb81b718-8bc4-4c3e-9ec6-472c62d377a2-secret-volume\") pod \"collect-profiles-29412855-vqk8g\" (UID: \"eb81b718-8bc4-4c3e-9ec6-472c62d377a2\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-vqk8g" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.924003 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e78e3f2-fa8c-4c8d-8dce-f087e67acf02-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4c725\" (UID: \"1e78e3f2-fa8c-4c8d-8dce-f087e67acf02\") " pod="openshift-marketplace/marketplace-operator-79b997595-4c725" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.926460 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hvd8w" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.927882 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1955fd19-879b-4d2a-bf0a-f898d93835c5-serving-cert\") pod \"service-ca-operator-777779d784-xz6ms\" (UID: \"1955fd19-879b-4d2a-bf0a-f898d93835c5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xz6ms" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.936844 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b9e18ab9-2082-481b-9e18-5da0f81303bf-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2jgqz\" (UID: \"b9e18ab9-2082-481b-9e18-5da0f81303bf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2jgqz" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.936916 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e78e3f2-fa8c-4c8d-8dce-f087e67acf02-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4c725\" (UID: \"1e78e3f2-fa8c-4c8d-8dce-f087e67acf02\") " pod="openshift-marketplace/marketplace-operator-79b997595-4c725" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.937057 
4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a47fb721-c3c2-4bc9-8e26-9e99eee599d1-cert\") pod \"ingress-canary-7nphl\" (UID: \"a47fb721-c3c2-4bc9-8e26-9e99eee599d1\") " pod="openshift-ingress-canary/ingress-canary-7nphl" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.937779 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66c310ae-0242-4438-9db3-f63ffd767976-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hdhpp\" (UID: \"66c310ae-0242-4438-9db3-f63ffd767976\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hdhpp" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.938205 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/933e4b72-77fa-463e-9828-27d6fa9e0420-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-b2wgh\" (UID: \"933e4b72-77fa-463e-9828-27d6fa9e0420\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2wgh" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.943320 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5sjt\" (UniqueName: \"kubernetes.io/projected/f8fb24b9-e854-4b1a-8ee7-27528e3544d2-kube-api-access-n5sjt\") pod \"migrator-59844c95c7-75nhw\" (UID: \"f8fb24b9-e854-4b1a-8ee7-27528e3544d2\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-75nhw" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.956538 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvtgv\" (UniqueName: \"kubernetes.io/projected/ec0d089c-a47a-4ef7-b422-756a4cf8487a-kube-api-access-zvtgv\") pod \"olm-operator-6b444d44fb-9ch7z\" (UID: \"ec0d089c-a47a-4ef7-b422-756a4cf8487a\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ch7z" Dec 03 14:15:40 crc kubenswrapper[4751]: W1203 14:15:40.973941 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09111b55_9bde_41bf_8f13_51baf359a20c.slice/crio-261f17eefda1ed31d2016a969321c2f75212192fc923100bf2ef50107f47ed8b WatchSource:0}: Error finding container 261f17eefda1ed31d2016a969321c2f75212192fc923100bf2ef50107f47ed8b: Status 404 returned error can't find the container with id 261f17eefda1ed31d2016a969321c2f75212192fc923100bf2ef50107f47ed8b Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.974817 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjbxp\" (UniqueName: \"kubernetes.io/projected/b901e1a9-281f-4928-b82b-4b7334b98f4d-kube-api-access-qjbxp\") pod \"packageserver-d55dfcdfc-l7qh4\" (UID: \"b901e1a9-281f-4928-b82b-4b7334b98f4d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qh4" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.976517 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f799c142-06b3-4b7e-ba2a-4f11adbb175e-proxy-tls\") pod \"machine-config-controller-84d6567774-z7j5x\" (UID: \"f799c142-06b3-4b7e-ba2a-4f11adbb175e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7j5x" Dec 03 14:15:40 crc kubenswrapper[4751]: I1203 14:15:40.994396 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlg46\" (UniqueName: \"kubernetes.io/projected/c4d1b134-55b3-4b2e-92da-b8c5416c13a5-kube-api-access-wlg46\") pod \"router-default-5444994796-qjtlv\" (UID: \"c4d1b134-55b3-4b2e-92da-b8c5416c13a5\") " pod="openshift-ingress/router-default-5444994796-qjtlv" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:40.999095 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qh4" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.006286 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ch7z" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.012798 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:15:41 crc kubenswrapper[4751]: E1203 14:15:41.013016 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:15:41.512969689 +0000 UTC m=+148.501324906 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.013250 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.013807 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2086abe6-48e1-4593-9789-b098b9b3142d-bound-sa-token\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:41 crc kubenswrapper[4751]: E1203 14:15:41.014088 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:15:41.514079478 +0000 UTC m=+148.502434895 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2hknc" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.014441 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdxmv\" (UniqueName: \"kubernetes.io/projected/89c33472-8c62-4a71-9b17-697f9a0bbc65-kube-api-access-rdxmv\") pod \"control-plane-machine-set-operator-78cbb6b69f-pppn7\" (UID: \"89c33472-8c62-4a71-9b17-697f9a0bbc65\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pppn7" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.015812 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-75nhw" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.022991 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-qjtlv" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.026983 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0fa4dfe3-601e-46c7-985c-af563252fd74-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4qthd\" (UID: \"0fa4dfe3-601e-46c7-985c-af563252fd74\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4qthd" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.056547 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-mvlmx"] Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.069085 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2jzk\" (UniqueName: \"kubernetes.io/projected/0fa4dfe3-601e-46c7-985c-af563252fd74-kube-api-access-v2jzk\") pod \"ingress-operator-5b745b69d9-4qthd\" (UID: \"0fa4dfe3-601e-46c7-985c-af563252fd74\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4qthd" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.080339 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj5bv\" (UniqueName: \"kubernetes.io/projected/e2ca3799-4ab3-40a6-8b00-2c6e97cc2c66-kube-api-access-nj5bv\") pod \"catalog-operator-68c6474976-js4nc\" (UID: \"e2ca3799-4ab3-40a6-8b00-2c6e97cc2c66\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-js4nc" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.115818 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:15:41 crc kubenswrapper[4751]: E1203 
14:15:41.116528 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:15:41.616506965 +0000 UTC m=+148.604862182 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.152247 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b7q8\" (UniqueName: \"kubernetes.io/projected/f55e95a4-8b8f-4875-9908-33b7a94f58ea-kube-api-access-9b7q8\") pod \"machine-config-server-56xrz\" (UID: \"f55e95a4-8b8f-4875-9908-33b7a94f58ea\") " pod="openshift-machine-config-operator/machine-config-server-56xrz" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.152625 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q787\" (UniqueName: \"kubernetes.io/projected/b9e18ab9-2082-481b-9e18-5da0f81303bf-kube-api-access-8q787\") pod \"multus-admission-controller-857f4d67dd-2jgqz\" (UID: \"b9e18ab9-2082-481b-9e18-5da0f81303bf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2jgqz" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.169508 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29tnz\" (UniqueName: \"kubernetes.io/projected/933e4b72-77fa-463e-9828-27d6fa9e0420-kube-api-access-29tnz\") pod \"package-server-manager-789f6589d5-b2wgh\" (UID: 
\"933e4b72-77fa-463e-9828-27d6fa9e0420\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2wgh" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.187357 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2jgqz" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.201162 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-trvft" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.219086 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:41 crc kubenswrapper[4751]: E1203 14:15:41.219428 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:15:41.719406925 +0000 UTC m=+148.707762142 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2hknc" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.220375 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5m6k\" (UniqueName: \"kubernetes.io/projected/eec50b53-45d8-4fe1-b490-39be12772940-kube-api-access-l5m6k\") pod \"service-ca-9c57cc56f-4ggm5\" (UID: \"eec50b53-45d8-4fe1-b490-39be12772940\") " pod="openshift-service-ca/service-ca-9c57cc56f-4ggm5" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.224318 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvtzv\" (UniqueName: \"kubernetes.io/projected/f799c142-06b3-4b7e-ba2a-4f11adbb175e-kube-api-access-nvtzv\") pod \"machine-config-controller-84d6567774-z7j5x\" (UID: \"f799c142-06b3-4b7e-ba2a-4f11adbb175e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7j5x" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.246695 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4qthd" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.278931 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6zrt\" (UniqueName: \"kubernetes.io/projected/e907ddd6-b1b8-4026-a500-9e066868ead1-kube-api-access-l6zrt\") pod \"csi-hostpathplugin-7clk4\" (UID: \"e907ddd6-b1b8-4026-a500-9e066868ead1\") " pod="hostpath-provisioner/csi-hostpathplugin-7clk4" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.280615 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwj7q\" (UniqueName: \"kubernetes.io/projected/1955fd19-879b-4d2a-bf0a-f898d93835c5-kube-api-access-kwj7q\") pod \"service-ca-operator-777779d784-xz6ms\" (UID: \"1955fd19-879b-4d2a-bf0a-f898d93835c5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xz6ms" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.281911 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66c310ae-0242-4438-9db3-f63ffd767976-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hdhpp\" (UID: \"66c310ae-0242-4438-9db3-f63ffd767976\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hdhpp" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.288740 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pppn7" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.309679 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf8d7\" (UniqueName: \"kubernetes.io/projected/1e78e3f2-fa8c-4c8d-8dce-f087e67acf02-kube-api-access-rf8d7\") pod \"marketplace-operator-79b997595-4c725\" (UID: \"1e78e3f2-fa8c-4c8d-8dce-f087e67acf02\") " pod="openshift-marketplace/marketplace-operator-79b997595-4c725" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.321353 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:15:41 crc kubenswrapper[4751]: E1203 14:15:41.321815 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:15:41.821799812 +0000 UTC m=+148.810155029 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.332645 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-56xrz" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.375311 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl4fx\" (UniqueName: \"kubernetes.io/projected/eb81b718-8bc4-4c3e-9ec6-472c62d377a2-kube-api-access-jl4fx\") pod \"collect-profiles-29412855-vqk8g\" (UID: \"eb81b718-8bc4-4c3e-9ec6-472c62d377a2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-vqk8g" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.379707 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xz6ms" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.379795 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-js4nc" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.380273 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hdhpp" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.380579 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4c725" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.408349 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-vqk8g" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.410315 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7j5x" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.411682 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2wgh" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.423607 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:41 crc kubenswrapper[4751]: E1203 14:15:41.423988 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:15:41.923970593 +0000 UTC m=+148.912325810 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2hknc" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.429785 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpslh\" (UniqueName: \"kubernetes.io/projected/a47fb721-c3c2-4bc9-8e26-9e99eee599d1-kube-api-access-zpslh\") pod \"ingress-canary-7nphl\" (UID: \"a47fb721-c3c2-4bc9-8e26-9e99eee599d1\") " pod="openshift-ingress-canary/ingress-canary-7nphl" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.454482 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" 
event={"ID":"0ced64e5-348e-4211-bbeb-9853697b75e3","Type":"ContainerStarted","Data":"3787d82cbfb4abb3c8384102658c7e22d7eb8345e1a8781b5c0adb0004ba7e5e"} Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.488395 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-4ggm5" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.489269 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-7clk4" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.512019 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5dmt6" event={"ID":"ad26e28c-197b-4671-bd99-109e1c3527d7","Type":"ContainerStarted","Data":"dec68708d2afb639cdefc8c27af81d1f9c1657eb158cb0b7d6c7a65689ba884b"} Dec 03 14:15:41 crc kubenswrapper[4751]: W1203 14:15:41.523258 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4d1b134_55b3_4b2e_92da_b8c5416c13a5.slice/crio-5948f049f8efdc37af7d630abb165618915199e17bf602e95434a33021725eae WatchSource:0}: Error finding container 5948f049f8efdc37af7d630abb165618915199e17bf602e95434a33021725eae: Status 404 returned error can't find the container with id 5948f049f8efdc37af7d630abb165618915199e17bf602e95434a33021725eae Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.524969 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.525138 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ab66f440-24ed-4244-a972-63eee27b67b1-config\") pod \"machine-api-operator-5694c8668f-2n4v9\" (UID: \"ab66f440-24ed-4244-a972-63eee27b67b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2n4v9" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.525178 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.525202 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:41 crc kubenswrapper[4751]: E1203 14:15:41.525976 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:15:42.025959628 +0000 UTC m=+149.014314845 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.526724 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab66f440-24ed-4244-a972-63eee27b67b1-config\") pod \"machine-api-operator-5694c8668f-2n4v9\" (UID: \"ab66f440-24ed-4244-a972-63eee27b67b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2n4v9" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.529677 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.538657 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-zcq45\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.579652 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ch7z"] Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.590695 4751 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.600682 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"441f7a8d936200386480232dd5937760492846f9ef9b437f3a6ad6cfddb363e4"} Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.601524 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.614484 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g4jcg" event={"ID":"bea4ae1a-c496-47de-9d12-1d6d42793bd2","Type":"ContainerStarted","Data":"add2d87261394fe40eb6199a18eb43c072f085c68a5ecd28578fab60460b42ef"} Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.615253 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g4jcg" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.623811 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pwknx" event={"ID":"90549328-bfce-4bd2-b1cc-651a2c9cd2e1","Type":"ContainerStarted","Data":"5fd9be18d6ad83c7cd4ac9cf5e6c62deb9fc77e7b384f14f0ed6bfed0790c60c"} Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.627442 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-2n4v9" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.628738 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:41 crc kubenswrapper[4751]: E1203 14:15:41.629022 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:15:42.129011772 +0000 UTC m=+149.117366989 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2hknc" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.638405 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"7828bf0f128d5178b9e7541adf16131c87044bcbb7569eb51c5a1128fe3353a0"} Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.638448 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b2f465de5a911fc982e53e9baa02b2300d4cdc0948480297a04dc5b7fe9455b9"} Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.683725 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7nphl" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.685715 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-pnmnf" event={"ID":"73222926-1acd-41d2-8b69-79ed24aaf6d5","Type":"ContainerStarted","Data":"73bd02e5e91efab17755a824160a6cb35e6f41b210d0c0e6c5c3f6ec64c32798"} Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.695258 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-5kgjd"] Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.704693 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mvlmx" event={"ID":"a35c0587-d301-49f5-b7a2-3d7d32efed87","Type":"ContainerStarted","Data":"f4e249f520640e9daff4d768bf8615858c5feeefbc902ab8b51b7d219f908263"} Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.731577 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6jx7f" event={"ID":"09111b55-9bde-41bf-8f13-51baf359a20c","Type":"ContainerStarted","Data":"261f17eefda1ed31d2016a969321c2f75212192fc923100bf2ef50107f47ed8b"} Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.732714 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:15:41 crc 
kubenswrapper[4751]: E1203 14:15:41.733872 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:15:42.233853474 +0000 UTC m=+149.222208691 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.768883 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"3515339650c528a16ef3819b45f4c24cd1bb0c8e05d56fafa47ac240b8fb2853"} Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.782805 4751 patch_prober.go:28] interesting pod/downloads-7954f5f757-mc2fd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.782848 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mc2fd" podUID="a9f280b6-c725-4857-a658-6f3073b30fdf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.814306 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-dns/dns-default-hvd8w"] Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.842278 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:41 crc kubenswrapper[4751]: E1203 14:15:41.844773 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:15:42.344755616 +0000 UTC m=+149.333110823 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2hknc" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.917519 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qh4"] Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.945794 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:15:41 crc kubenswrapper[4751]: E1203 14:15:41.945952 4751 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:15:42.445921029 +0000 UTC m=+149.434276246 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.950639 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:41 crc kubenswrapper[4751]: E1203 14:15:41.953789 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:15:42.453762607 +0000 UTC m=+149.442117824 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2hknc" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:41 crc kubenswrapper[4751]: I1203 14:15:41.977115 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-75nhw"] Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.051958 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:15:42 crc kubenswrapper[4751]: E1203 14:15:42.081403 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:15:42.581373293 +0000 UTC m=+149.569728510 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:42 crc kubenswrapper[4751]: W1203 14:15:42.144973 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb901e1a9_281f_4928_b82b_4b7334b98f4d.slice/crio-429932a5285d182120fc72075fddda0960694fc46172792e6ffc65fd6f5755f7 WatchSource:0}: Error finding container 429932a5285d182120fc72075fddda0960694fc46172792e6ffc65fd6f5755f7: Status 404 returned error can't find the container with id 429932a5285d182120fc72075fddda0960694fc46172792e6ffc65fd6f5755f7 Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.158069 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:42 crc kubenswrapper[4751]: E1203 14:15:42.158394 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:15:42.658381496 +0000 UTC m=+149.646736713 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2hknc" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.224074 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4qthd"] Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.253392 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-trvft"] Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.261339 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:15:42 crc kubenswrapper[4751]: E1203 14:15:42.261749 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:15:42.761723537 +0000 UTC m=+149.750078754 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.362584 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:42 crc kubenswrapper[4751]: E1203 14:15:42.362934 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:15:42.862922382 +0000 UTC m=+149.851277599 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2hknc" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.418714 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pppn7"] Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.463864 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:15:42 crc kubenswrapper[4751]: E1203 14:15:42.463997 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:15:42.963968053 +0000 UTC m=+149.952323280 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.464231 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:42 crc kubenswrapper[4751]: E1203 14:15:42.465820 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:15:42.965804852 +0000 UTC m=+149.954160109 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2hknc" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:42 crc kubenswrapper[4751]: W1203 14:15:42.516521 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89c33472_8c62_4a71_9b17_697f9a0bbc65.slice/crio-0facb475422158ca67c7d052fb2646ca3557c5fcf4a892cb5d42f6235119cded WatchSource:0}: Error finding container 0facb475422158ca67c7d052fb2646ca3557c5fcf4a892cb5d42f6235119cded: Status 404 returned error can't find the container with id 0facb475422158ca67c7d052fb2646ca3557c5fcf4a892cb5d42f6235119cded Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.566115 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:15:42 crc kubenswrapper[4751]: E1203 14:15:42.566526 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:15:43.066507444 +0000 UTC m=+150.054862661 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.677167 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:42 crc kubenswrapper[4751]: E1203 14:15:42.677597 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:15:43.17758128 +0000 UTC m=+150.165936497 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2hknc" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.750401 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hdhpp"] Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.770847 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2b78" podStartSLOduration=127.770819544 podStartE2EDuration="2m7.770819544s" podCreationTimestamp="2025-12-03 14:13:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:42.770743652 +0000 UTC m=+149.759098879" watchObservedRunningTime="2025-12-03 14:15:42.770819544 +0000 UTC m=+149.759174761" Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.778179 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:15:42 crc kubenswrapper[4751]: E1203 14:15:42.778630 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 14:15:43.278613051 +0000 UTC m=+150.266968258 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.801301 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4qthd" event={"ID":"0fa4dfe3-601e-46c7-985c-af563252fd74","Type":"ContainerStarted","Data":"1677ff2081e65ed0e2910f560a6b93b5ee050b6ee6eb1e3f6ebc7e1a3d006df9"} Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.817215 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-c428g" podStartSLOduration=126.817197435 podStartE2EDuration="2m6.817197435s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:42.81477302 +0000 UTC m=+149.803128257" watchObservedRunningTime="2025-12-03 14:15:42.817197435 +0000 UTC m=+149.805552652" Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.834605 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6jx7f" event={"ID":"09111b55-9bde-41bf-8f13-51baf359a20c","Type":"ContainerStarted","Data":"df49e795f33217b164b11ffc254672a87a779a060af782b27e440f97302a0e51"} Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.847674 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-server-56xrz" event={"ID":"f55e95a4-8b8f-4875-9908-33b7a94f58ea","Type":"ContainerStarted","Data":"ae47362fe1f912049d96ffd0841fa3bef200ca816457f5d7e362c2223f9871b7"} Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.851550 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g4jcg" podStartSLOduration=126.851517045 podStartE2EDuration="2m6.851517045s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:42.849567213 +0000 UTC m=+149.837922430" watchObservedRunningTime="2025-12-03 14:15:42.851517045 +0000 UTC m=+149.839872282" Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.853945 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-5kgjd" event={"ID":"105f1cdb-fae2-4fcb-8548-ae90bdcbb75f","Type":"ContainerStarted","Data":"9fd123210efb059b3261103187d3c79a3334862cac5d42ce7fdc491d2185e640"} Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.860169 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hvd8w" event={"ID":"6f4d710c-f625-4379-b5cf-cc5df715b6bd","Type":"ContainerStarted","Data":"f79f30e11350d2ab4473c1753c027eccff58354367a656d9e7db1afcd7408c39"} Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.880391 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:42 crc kubenswrapper[4751]: E1203 14:15:42.881252 4751 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:15:43.381229213 +0000 UTC m=+150.369584440 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2hknc" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.897962 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pwknx" event={"ID":"90549328-bfce-4bd2-b1cc-651a2c9cd2e1","Type":"ContainerStarted","Data":"ec9e179d5376f301d0f59c074d4988b4c8139fa2cd390ce0564fa4ecdfdfa6c0"} Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.902906 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-qjtlv" event={"ID":"c4d1b134-55b3-4b2e-92da-b8c5416c13a5","Type":"ContainerStarted","Data":"78a76ed3804cd5bf4ec590804b77340fa4ec96c209f4c1de3c56cfdd4dd2b74d"} Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.902952 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-qjtlv" event={"ID":"c4d1b134-55b3-4b2e-92da-b8c5416c13a5","Type":"ContainerStarted","Data":"5948f049f8efdc37af7d630abb165618915199e17bf602e95434a33021725eae"} Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.904709 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pppn7" 
event={"ID":"89c33472-8c62-4a71-9b17-697f9a0bbc65","Type":"ContainerStarted","Data":"0facb475422158ca67c7d052fb2646ca3557c5fcf4a892cb5d42f6235119cded"} Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.917026 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ch7z" event={"ID":"ec0d089c-a47a-4ef7-b422-756a4cf8487a","Type":"ContainerStarted","Data":"63c3e6161f92b8053fbce7629c8a2fc8bad546a0f328cdafff649126afce68d9"} Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.920738 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5dmt6" event={"ID":"ad26e28c-197b-4671-bd99-109e1c3527d7","Type":"ContainerStarted","Data":"0cef13c2d43c8c3689a1526ea70e1e1c2767d9c1a291f4557139e2367ce1bb86"} Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.925846 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-75nhw" event={"ID":"f8fb24b9-e854-4b1a-8ee7-27528e3544d2","Type":"ContainerStarted","Data":"4822ec14aecb197913e7d1864567a408ae93c6e96507e2dc1e866932bef06b86"} Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.928810 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b96kv" podStartSLOduration=126.928790485 podStartE2EDuration="2m6.928790485s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:42.92406503 +0000 UTC m=+149.912420247" watchObservedRunningTime="2025-12-03 14:15:42.928790485 +0000 UTC m=+149.917145702" Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.952306 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qh4" 
event={"ID":"b901e1a9-281f-4928-b82b-4b7334b98f4d","Type":"ContainerStarted","Data":"429932a5285d182120fc72075fddda0960694fc46172792e6ffc65fd6f5755f7"} Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.974032 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-mc2fd" podStartSLOduration=126.974012605 podStartE2EDuration="2m6.974012605s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:42.973807109 +0000 UTC m=+149.962162326" watchObservedRunningTime="2025-12-03 14:15:42.974012605 +0000 UTC m=+149.962367842" Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.976906 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-trvft" event={"ID":"ea973fa0-4ecd-4b81-b6be-4c3d5e3b0043","Type":"ContainerStarted","Data":"ff41d4a825a7e464dfe5f1c3d485887f44565c30acba1025cabf84b4a5edd466"} Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.980379 4751 patch_prober.go:28] interesting pod/downloads-7954f5f757-mc2fd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.980428 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mc2fd" podUID="a9f280b6-c725-4857-a658-6f3073b30fdf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Dec 03 14:15:42 crc kubenswrapper[4751]: I1203 14:15:42.981572 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:15:42 crc kubenswrapper[4751]: E1203 14:15:42.982750 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:15:43.482736316 +0000 UTC m=+150.471091533 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.025589 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-qjtlv" Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.063927 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" podStartSLOduration=127.06391372 podStartE2EDuration="2m7.06391372s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:43.023705313 +0000 UTC m=+150.012060530" watchObservedRunningTime="2025-12-03 14:15:43.06391372 +0000 UTC m=+150.052268937" Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.064932 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zzcqh" podStartSLOduration=127.064925747 podStartE2EDuration="2m7.064925747s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:43.056772381 +0000 UTC m=+150.045127598" watchObservedRunningTime="2025-12-03 14:15:43.064925747 +0000 UTC m=+150.053280964" Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.074999 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.075467 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.084057 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:43 crc kubenswrapper[4751]: E1203 14:15:43.085837 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:15:43.585821441 +0000 UTC m=+150.574176658 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2hknc" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.097069 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.098725 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-pnmnf" podStartSLOduration=127.098705833 podStartE2EDuration="2m7.098705833s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:43.096763042 +0000 UTC m=+150.085118269" watchObservedRunningTime="2025-12-03 14:15:43.098705833 +0000 UTC m=+150.087061050" Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.137480 4751 patch_prober.go:28] interesting pod/router-default-5444994796-qjtlv container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.137528 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qjtlv" podUID="c4d1b134-55b3-4b2e-92da-b8c5416c13a5" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.153014 
4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wshkr" podStartSLOduration=128.152994433 podStartE2EDuration="2m8.152994433s" podCreationTimestamp="2025-12-03 14:13:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:43.132107099 +0000 UTC m=+150.120462316" watchObservedRunningTime="2025-12-03 14:15:43.152994433 +0000 UTC m=+150.141349650" Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.155097 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.155232 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.185891 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:15:43 crc kubenswrapper[4751]: E1203 14:15:43.186727 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:15:43.686707728 +0000 UTC m=+150.675062955 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.240163 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-fwp52" podStartSLOduration=127.240144045 podStartE2EDuration="2m7.240144045s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:43.227728406 +0000 UTC m=+150.216083623" watchObservedRunningTime="2025-12-03 14:15:43.240144045 +0000 UTC m=+150.228499262" Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.269106 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pqqnv" podStartSLOduration=127.269083163 podStartE2EDuration="2m7.269083163s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:43.267359487 +0000 UTC m=+150.255714704" watchObservedRunningTime="2025-12-03 14:15:43.269083163 +0000 UTC m=+150.257438380" Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.293821 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" 
(UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:43 crc kubenswrapper[4751]: E1203 14:15:43.294224 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:15:43.79420946 +0000 UTC m=+150.782564677 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2hknc" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.297821 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-6zks7" podStartSLOduration=128.297782265 podStartE2EDuration="2m8.297782265s" podCreationTimestamp="2025-12-03 14:13:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:43.295932956 +0000 UTC m=+150.284288173" watchObservedRunningTime="2025-12-03 14:15:43.297782265 +0000 UTC m=+150.286137482" Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.341066 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-djkdm" podStartSLOduration=127.341051783 podStartE2EDuration="2m7.341051783s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:43.3379358 
+0000 UTC m=+150.326291017" watchObservedRunningTime="2025-12-03 14:15:43.341051783 +0000 UTC m=+150.329407000" Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.392445 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2wgh"] Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.395406 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:15:43 crc kubenswrapper[4751]: E1203 14:15:43.405280 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:15:43.905248536 +0000 UTC m=+150.893603753 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.408858 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-czhpn" podStartSLOduration=127.408839001 podStartE2EDuration="2m7.408839001s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:43.405884243 +0000 UTC m=+150.394239460" watchObservedRunningTime="2025-12-03 14:15:43.408839001 +0000 UTC m=+150.397194208" Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.441106 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-pwknx" podStartSLOduration=128.441077576 podStartE2EDuration="2m8.441077576s" podCreationTimestamp="2025-12-03 14:13:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:43.440170032 +0000 UTC m=+150.428525269" watchObservedRunningTime="2025-12-03 14:15:43.441077576 +0000 UTC m=+150.429432793" Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.504794 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: 
\"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:43 crc kubenswrapper[4751]: E1203 14:15:43.505054 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:15:44.005042693 +0000 UTC m=+150.993397910 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2hknc" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.561396 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-qjtlv" podStartSLOduration=127.561379488 podStartE2EDuration="2m7.561379488s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:43.490383544 +0000 UTC m=+150.478738761" watchObservedRunningTime="2025-12-03 14:15:43.561379488 +0000 UTC m=+150.549734705" Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.606067 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:15:43 crc kubenswrapper[4751]: E1203 14:15:43.607281 4751 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:15:44.107262585 +0000 UTC m=+151.095617802 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.608260 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5dmt6" podStartSLOduration=127.608236751 podStartE2EDuration="2m7.608236751s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:43.566310879 +0000 UTC m=+150.554666116" watchObservedRunningTime="2025-12-03 14:15:43.608236751 +0000 UTC m=+150.596591968" Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.661216 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6jx7f" podStartSLOduration=127.661196646 podStartE2EDuration="2m7.661196646s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:43.659646965 +0000 UTC m=+150.648002192" watchObservedRunningTime="2025-12-03 14:15:43.661196646 
+0000 UTC m=+150.649551863" Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.709354 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:43 crc kubenswrapper[4751]: E1203 14:15:43.710050 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:15:44.210035292 +0000 UTC m=+151.198390509 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2hknc" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.812866 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:15:43 crc kubenswrapper[4751]: E1203 14:15:43.813209 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 14:15:44.313193179 +0000 UTC m=+151.301548396 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.897784 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4c725"] Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.914812 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:43 crc kubenswrapper[4751]: E1203 14:15:43.915121 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:15:44.415108653 +0000 UTC m=+151.403463870 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2hknc" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.918733 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zcq45"] Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.951160 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7nphl"] Dec 03 14:15:43 crc kubenswrapper[4751]: I1203 14:15:43.975503 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412855-vqk8g"] Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.005507 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2jgqz"] Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.006266 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-xz6ms"] Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.016061 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:15:44 crc kubenswrapper[4751]: E1203 14:15:44.016613 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:15:44.516591825 +0000 UTC m=+151.504947042 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.039239 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-js4nc"] Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.050912 4751 patch_prober.go:28] interesting pod/router-default-5444994796-qjtlv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 14:15:44 crc kubenswrapper[4751]: [-]has-synced failed: reason withheld Dec 03 14:15:44 crc kubenswrapper[4751]: [+]process-running ok Dec 03 14:15:44 crc kubenswrapper[4751]: healthz check failed Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.050963 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qjtlv" podUID="c4d1b134-55b3-4b2e-92da-b8c5416c13a5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.073317 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mvlmx" 
event={"ID":"a35c0587-d301-49f5-b7a2-3d7d32efed87","Type":"ContainerStarted","Data":"6f30df3434f3cfb5448b6e387ac7a06cf37dcbb3f4af84c7514c15f0ca3b8ed2"} Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.073382 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mvlmx" event={"ID":"a35c0587-d301-49f5-b7a2-3d7d32efed87","Type":"ContainerStarted","Data":"410c37f43ee331a0ead3512e9929f520dd75a0ab3b8ac84018c210056c7d1df7"} Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.090348 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-7clk4"] Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.106033 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-2n4v9"] Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.106095 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4ggm5"] Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.108639 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2wgh" event={"ID":"933e4b72-77fa-463e-9828-27d6fa9e0420","Type":"ContainerStarted","Data":"2528f7af7b9a0d3c1d00a9563516d61021ff66368d81b760ead3560137bbdef4"} Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.108685 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2wgh" event={"ID":"933e4b72-77fa-463e-9828-27d6fa9e0420","Type":"ContainerStarted","Data":"cd4b934f83a6ae502e720370a71ddb06e0969e2d3610df8982b7fe7c1495cfec"} Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.111113 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-z7j5x"] Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 
14:15:44.117664 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:44 crc kubenswrapper[4751]: E1203 14:15:44.117994 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:15:44.617981575 +0000 UTC m=+151.606336792 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2hknc" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.133617 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-56xrz" event={"ID":"f55e95a4-8b8f-4875-9908-33b7a94f58ea","Type":"ContainerStarted","Data":"7f5d1f5ef8cd4443ac9af2b0208dee9e7493fb10264e95fc0f6e349b8e17a9cf"} Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.155552 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mvlmx" podStartSLOduration=128.155530361 podStartE2EDuration="2m8.155530361s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-12-03 14:15:44.114881643 +0000 UTC m=+151.103236860" watchObservedRunningTime="2025-12-03 14:15:44.155530361 +0000 UTC m=+151.143885578" Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.168662 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-5kgjd" event={"ID":"105f1cdb-fae2-4fcb-8548-ae90bdcbb75f","Type":"ContainerStarted","Data":"ae36c450361566ee66b9e9c2834d26b7551a54a5bb48c72eafb55b00f3d7d307"} Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.180800 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4c725" event={"ID":"1e78e3f2-fa8c-4c8d-8dce-f087e67acf02","Type":"ContainerStarted","Data":"fcaa6f8356927f7bd4be6db2ead8568187c44d80bb758d00a066402a14843e2d"} Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.193073 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hvd8w" event={"ID":"6f4d710c-f625-4379-b5cf-cc5df715b6bd","Type":"ContainerStarted","Data":"9628d24807df0a90c91b0a21d331fd7909bbd185d9a153bc3a2d2c71117a27ed"} Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.195538 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-5kgjd" podStartSLOduration=128.195527412 podStartE2EDuration="2m8.195527412s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:44.195192273 +0000 UTC m=+151.183547490" watchObservedRunningTime="2025-12-03 14:15:44.195527412 +0000 UTC m=+151.183882629" Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.196437 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-56xrz" podStartSLOduration=7.196431356 podStartE2EDuration="7.196431356s" 
podCreationTimestamp="2025-12-03 14:15:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:44.156093086 +0000 UTC m=+151.144448303" watchObservedRunningTime="2025-12-03 14:15:44.196431356 +0000 UTC m=+151.184786573" Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.206618 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hdhpp" event={"ID":"66c310ae-0242-4438-9db3-f63ffd767976","Type":"ContainerStarted","Data":"0102e50848c814c94d401f950700c3fe57d8d21dd74cfba623833e42d54ec6d3"} Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.206659 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hdhpp" event={"ID":"66c310ae-0242-4438-9db3-f63ffd767976","Type":"ContainerStarted","Data":"07b40f61335278bd86f4eedeee17a7a74b789d643b7e4c049e9fbd01033c6aca"} Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.218780 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:15:44 crc kubenswrapper[4751]: E1203 14:15:44.220043 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:15:44.720028292 +0000 UTC m=+151.708383499 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.221694 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-trvft" event={"ID":"ea973fa0-4ecd-4b81-b6be-4c3d5e3b0043","Type":"ContainerStarted","Data":"bfd1b0a43f510f276f8ca3f19cef6b762c7fe992478c1a75a011b6f8afb36ac5"} Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.271522 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pppn7" event={"ID":"89c33472-8c62-4a71-9b17-697f9a0bbc65","Type":"ContainerStarted","Data":"dec7567ea245bd1fce30ed835b48b11cd55adb9f0255785e105814ce6a6c0aec"} Dec 03 14:15:44 crc kubenswrapper[4751]: W1203 14:15:44.318808 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda47fb721_c3c2_4bc9_8e26_9e99eee599d1.slice/crio-0b676d430e45f4d295c732793228ba65e833c68289dd32203f6cbc752996aa17 WatchSource:0}: Error finding container 0b676d430e45f4d295c732793228ba65e833c68289dd32203f6cbc752996aa17: Status 404 returned error can't find the container with id 0b676d430e45f4d295c732793228ba65e833c68289dd32203f6cbc752996aa17 Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.323600 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:44 crc kubenswrapper[4751]: E1203 14:15:44.323986 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:15:44.82397088 +0000 UTC m=+151.812326097 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2hknc" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.341784 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hdhpp" podStartSLOduration=128.341754182 podStartE2EDuration="2m8.341754182s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:44.252756021 +0000 UTC m=+151.241111228" watchObservedRunningTime="2025-12-03 14:15:44.341754182 +0000 UTC m=+151.330109399" Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.342981 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-trvft" podStartSLOduration=128.342974004 podStartE2EDuration="2m8.342974004s" 
podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:44.289760292 +0000 UTC m=+151.278115509" watchObservedRunningTime="2025-12-03 14:15:44.342974004 +0000 UTC m=+151.331329281" Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.362895 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4qthd" event={"ID":"0fa4dfe3-601e-46c7-985c-af563252fd74","Type":"ContainerStarted","Data":"60dd57a9fd2aa52adc1a68ad995e4d0dd099c6d96bd54fa67cec182afd2d575d"} Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.363045 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4qthd" event={"ID":"0fa4dfe3-601e-46c7-985c-af563252fd74","Type":"ContainerStarted","Data":"5a9bf305a521fc255f3301e297832574807df6da79799b8af294d339af34395a"} Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.374722 4751 patch_prober.go:28] interesting pod/apiserver-76f77b778f-pwknx container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 03 14:15:44 crc kubenswrapper[4751]: [+]log ok Dec 03 14:15:44 crc kubenswrapper[4751]: [+]etcd ok Dec 03 14:15:44 crc kubenswrapper[4751]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 03 14:15:44 crc kubenswrapper[4751]: [+]poststarthook/generic-apiserver-start-informers ok Dec 03 14:15:44 crc kubenswrapper[4751]: [+]poststarthook/max-in-flight-filter ok Dec 03 14:15:44 crc kubenswrapper[4751]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 03 14:15:44 crc kubenswrapper[4751]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 03 14:15:44 crc kubenswrapper[4751]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 03 
14:15:44 crc kubenswrapper[4751]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 03 14:15:44 crc kubenswrapper[4751]: [+]poststarthook/project.openshift.io-projectcache ok Dec 03 14:15:44 crc kubenswrapper[4751]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 03 14:15:44 crc kubenswrapper[4751]: [+]poststarthook/openshift.io-startinformers ok Dec 03 14:15:44 crc kubenswrapper[4751]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 03 14:15:44 crc kubenswrapper[4751]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 03 14:15:44 crc kubenswrapper[4751]: livez check failed Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.375261 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-pwknx" podUID="90549328-bfce-4bd2-b1cc-651a2c9cd2e1" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.405731 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ch7z" event={"ID":"ec0d089c-a47a-4ef7-b422-756a4cf8487a","Type":"ContainerStarted","Data":"639fde26897dedbaf3014cadf6e8df8d3bab0c11a2a3cf1bb38a5d142208e9b8"} Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.428892 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:15:44 crc kubenswrapper[4751]: E1203 14:15:44.429918 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2025-12-03 14:15:44.92990249 +0000 UTC m=+151.918257707 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.440428 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ch7z" Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.452642 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pppn7" podStartSLOduration=128.452620383 podStartE2EDuration="2m8.452620383s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:44.381102316 +0000 UTC m=+151.369457543" watchObservedRunningTime="2025-12-03 14:15:44.452620383 +0000 UTC m=+151.440975600" Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.469994 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ch7z" Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.482760 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" event={"ID":"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3","Type":"ContainerStarted","Data":"9646b47a11ac942ef798f097d6c8ea0f91ad706a71db0c1cc0b89a51271300a2"} Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.490551 
4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-75nhw" event={"ID":"f8fb24b9-e854-4b1a-8ee7-27528e3544d2","Type":"ContainerStarted","Data":"a9cca88f20769ec8c7568c7a8935d9a75200360eca287ea09cad50f7e3ecd148"} Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.491619 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g4jcg" Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.519741 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4qthd" podStartSLOduration=128.519718973 podStartE2EDuration="2m8.519718973s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:44.500916854 +0000 UTC m=+151.489272071" watchObservedRunningTime="2025-12-03 14:15:44.519718973 +0000 UTC m=+151.508074190" Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.530253 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qh4" event={"ID":"b901e1a9-281f-4928-b82b-4b7334b98f4d","Type":"ContainerStarted","Data":"4872df9a2dbec6ed222a740570c889d8a8a2c1d575322e3ddb258e2e747008e7"} Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.530288 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qh4" Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.534197 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: 
\"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:44 crc kubenswrapper[4751]: E1203 14:15:44.542434 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:15:45.042391215 +0000 UTC m=+152.030746432 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2hknc" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.569556 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vq9xx" Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.641387 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.641467 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ch7z" podStartSLOduration=128.641447843 podStartE2EDuration="2m8.641447843s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 
14:15:44.640180679 +0000 UTC m=+151.628535916" watchObservedRunningTime="2025-12-03 14:15:44.641447843 +0000 UTC m=+151.629803060" Dec 03 14:15:44 crc kubenswrapper[4751]: E1203 14:15:44.643285 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:15:45.143263651 +0000 UTC m=+152.131618868 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.695369 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qh4" podStartSLOduration=128.695304512 podStartE2EDuration="2m8.695304512s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:44.695079256 +0000 UTC m=+151.683434473" watchObservedRunningTime="2025-12-03 14:15:44.695304512 +0000 UTC m=+151.683659729" Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.744006 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:44 crc kubenswrapper[4751]: E1203 14:15:44.744379 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:15:45.244368053 +0000 UTC m=+152.232723270 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2hknc" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.845618 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:15:44 crc kubenswrapper[4751]: E1203 14:15:44.845800 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:15:45.345773784 +0000 UTC m=+152.334129001 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.845967 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:44 crc kubenswrapper[4751]: E1203 14:15:44.846262 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:15:45.346249846 +0000 UTC m=+152.334605063 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2hknc" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.946692 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:15:44 crc kubenswrapper[4751]: E1203 14:15:44.947088 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:15:45.44704235 +0000 UTC m=+152.435397567 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:44 crc kubenswrapper[4751]: I1203 14:15:44.947237 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:44 crc kubenswrapper[4751]: E1203 14:15:44.947788 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:15:45.44777698 +0000 UTC m=+152.436132197 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2hknc" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.028995 4751 patch_prober.go:28] interesting pod/router-default-5444994796-qjtlv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 14:15:45 crc kubenswrapper[4751]: [-]has-synced failed: reason withheld Dec 03 14:15:45 crc kubenswrapper[4751]: [+]process-running ok Dec 03 14:15:45 crc kubenswrapper[4751]: healthz check failed Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.029080 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qjtlv" podUID="c4d1b134-55b3-4b2e-92da-b8c5416c13a5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.048446 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:15:45 crc kubenswrapper[4751]: E1203 14:15:45.048976 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 14:15:45.548953134 +0000 UTC m=+152.537308351 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.150939 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:45 crc kubenswrapper[4751]: E1203 14:15:45.151278 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:15:45.651262338 +0000 UTC m=+152.639617555 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2hknc" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.251399 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:15:45 crc kubenswrapper[4751]: E1203 14:15:45.251590 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:15:45.751557639 +0000 UTC m=+152.739912856 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.251870 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:45 crc kubenswrapper[4751]: E1203 14:15:45.252200 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:15:45.752189586 +0000 UTC m=+152.740544803 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2hknc" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.352685 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:15:45 crc kubenswrapper[4751]: E1203 14:15:45.353194 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:15:45.853179375 +0000 UTC m=+152.841534592 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.454083 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:45 crc kubenswrapper[4751]: E1203 14:15:45.454538 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:15:45.954526763 +0000 UTC m=+152.942881980 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2hknc" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.531007 4751 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-l7qh4 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.531065 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qh4" podUID="b901e1a9-281f-4928-b82b-4b7334b98f4d" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.26:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.545779 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xz6ms" event={"ID":"1955fd19-879b-4d2a-bf0a-f898d93835c5","Type":"ContainerStarted","Data":"b41fff17a71218bedc47f1e5613acc2813e2097160711ddcc48fce497d5557d5"} Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.545855 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xz6ms" event={"ID":"1955fd19-879b-4d2a-bf0a-f898d93835c5","Type":"ContainerStarted","Data":"e9491c58d28c2100510f63e2395025fef044db12a8fb64e9a20d3e27c214d095"} 
Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.548177 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7nphl" event={"ID":"a47fb721-c3c2-4bc9-8e26-9e99eee599d1","Type":"ContainerStarted","Data":"7115c49bce45b66f7c2e712b941dffd870472f479955a47166d19b6a5e8e0b33"} Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.548232 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7nphl" event={"ID":"a47fb721-c3c2-4bc9-8e26-9e99eee599d1","Type":"ContainerStarted","Data":"0b676d430e45f4d295c732793228ba65e833c68289dd32203f6cbc752996aa17"} Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.552705 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7clk4" event={"ID":"e907ddd6-b1b8-4026-a500-9e066868ead1","Type":"ContainerStarted","Data":"95fce07b157176bad615fca41ba052612f43b4d0f41fe3b2c752613e199a7561"} Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.554992 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:15:45 crc kubenswrapper[4751]: E1203 14:15:45.555159 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:15:46.055143692 +0000 UTC m=+153.043498909 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.555391 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:45 crc kubenswrapper[4751]: E1203 14:15:45.555711 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:15:46.055699567 +0000 UTC m=+153.044054784 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2hknc" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.555975 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-vqk8g" event={"ID":"eb81b718-8bc4-4c3e-9ec6-472c62d377a2","Type":"ContainerStarted","Data":"390fd1508d98eae2ac14c39e28cc442f7ec0dcd6a68471b090edf84b59cfb904"} Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.556033 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-vqk8g" event={"ID":"eb81b718-8bc4-4c3e-9ec6-472c62d377a2","Type":"ContainerStarted","Data":"ce04abd08f7c07e7557629f37e5b625b032ffe8005bf78410dea8a73e332fabd"} Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.561279 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-75nhw" event={"ID":"f8fb24b9-e854-4b1a-8ee7-27528e3544d2","Type":"ContainerStarted","Data":"6808cd19dd86634de1cb384983ed7e39f3d677da2c70811a11ed83f2e0b3ab0b"} Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.564820 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" event={"ID":"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3","Type":"ContainerStarted","Data":"ba7ff5d0ec2deb05e75392ed9e9bc9d8532cc403301ced860ecc0e505e4101ed"} Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.567481 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-4c725" event={"ID":"1e78e3f2-fa8c-4c8d-8dce-f087e67acf02","Type":"ContainerStarted","Data":"1a06f5749c9cd888a697aabdd9031e2bdd7b07c914015e34c8f3ca2faef21b16"} Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.568286 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4c725" Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.569382 4751 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4c725 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.569418 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4c725" podUID="1e78e3f2-fa8c-4c8d-8dce-f087e67acf02" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.573369 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2jgqz" event={"ID":"b9e18ab9-2082-481b-9e18-5da0f81303bf","Type":"ContainerStarted","Data":"f4b069a074ff50c7aeed7d03a87ecabd86de4252387c97ecdb1a67a0216828b7"} Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.573412 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2jgqz" event={"ID":"b9e18ab9-2082-481b-9e18-5da0f81303bf","Type":"ContainerStarted","Data":"c0ca931fd64b9b63e60e67376201eebf8b1548976bfef41c3971d128db2c7b42"} Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.574526 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-vqk8g" podStartSLOduration=45.574508586 podStartE2EDuration="45.574508586s" podCreationTimestamp="2025-12-03 14:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:45.571289071 +0000 UTC m=+152.559644288" watchObservedRunningTime="2025-12-03 14:15:45.574508586 +0000 UTC m=+152.562863803"
Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.583769 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hvd8w" event={"ID":"6f4d710c-f625-4379-b5cf-cc5df715b6bd","Type":"ContainerStarted","Data":"e3b6e2fb656d14c5485c51e0e2a2f7aa438d585992d7fb3f7a2683dd46a12b5a"}
Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.583989 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-hvd8w"
Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.586334 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7j5x" event={"ID":"f799c142-06b3-4b7e-ba2a-4f11adbb175e","Type":"ContainerStarted","Data":"789d2d33f3246d394550c9a67643212152f36225bde5db7f54fae7dae840a22a"}
Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.586385 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7j5x" event={"ID":"f799c142-06b3-4b7e-ba2a-4f11adbb175e","Type":"ContainerStarted","Data":"444cdb7a4a257126b0c211d7e64dfe5aa5d532b40c228a009033547566398c07"}
Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.590488 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-75nhw" podStartSLOduration=129.59047238 podStartE2EDuration="2m9.59047238s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC"
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:45.589183515 +0000 UTC m=+152.577538732" watchObservedRunningTime="2025-12-03 14:15:45.59047238 +0000 UTC m=+152.578827597"
Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.591967 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-4ggm5" event={"ID":"eec50b53-45d8-4fe1-b490-39be12772940","Type":"ContainerStarted","Data":"0eb46bb1ec2bdfbf300ac6803789f7d002144c9525947dd006797f397aeda647"}
Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.592010 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-4ggm5" event={"ID":"eec50b53-45d8-4fe1-b490-39be12772940","Type":"ContainerStarted","Data":"a3cafb51161357d5c1fd1d5f2ff7a0006c07bb47172f4195236a813b31cc8938"}
Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.599097 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-2n4v9" event={"ID":"ab66f440-24ed-4244-a972-63eee27b67b1","Type":"ContainerStarted","Data":"f0fb0b8d8f6f6a8612860d3061c4e71b99b8934d871d6395499e47afe20e243d"}
Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.599200 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-2n4v9" event={"ID":"ab66f440-24ed-4244-a972-63eee27b67b1","Type":"ContainerStarted","Data":"0dc6bbd40a6b8475642743e50ef4a48e1ba95cbddc5b3e9c309452af085a295c"}
Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.601578 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2wgh" event={"ID":"933e4b72-77fa-463e-9828-27d6fa9e0420","Type":"ContainerStarted","Data":"f49c3f82b228d889d0729a5dc51c0da40432c2cf97ba80e1078b36a5201156b8"}
Dec 03 14:15:45 crc kubenswrapper[4751]: I1203
14:15:45.602162 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2wgh"
Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.609428 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-js4nc" event={"ID":"e2ca3799-4ab3-40a6-8b00-2c6e97cc2c66","Type":"ContainerStarted","Data":"e12f76cdc95f9d283e575b0b83014ccef2cdfd0484963e91f0be4cfed7368ad8"}
Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.609464 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-js4nc" event={"ID":"e2ca3799-4ab3-40a6-8b00-2c6e97cc2c66","Type":"ContainerStarted","Data":"16c2195970b9d6a331b97b045a7496ac34571819a6909abd0b143fc9d9017bac"}
Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.635287 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-4c725" podStartSLOduration=129.635270208 podStartE2EDuration="2m9.635270208s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:45.633132812 +0000 UTC m=+152.621488029" watchObservedRunningTime="2025-12-03 14:15:45.635270208 +0000 UTC m=+152.623625425"
Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.653795 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-4ggm5" podStartSLOduration=129.653779049 podStartE2EDuration="2m9.653779049s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:45.65268534 +0000 UTC m=+152.641040557" watchObservedRunningTime="2025-12-03
14:15:45.653779049 +0000 UTC m=+152.642134266"
Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.668424 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 14:15:45 crc kubenswrapper[4751]: E1203 14:15:45.678133 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:15:46.178072554 +0000 UTC m=+153.166427771 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.705563 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2wgh" podStartSLOduration=129.705539813 podStartE2EDuration="2m9.705539813s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:45.693049711 +0000 UTC m=+152.681404928" watchObservedRunningTime="2025-12-03 14:15:45.705539813 +0000 UTC m=+152.693895030"
Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.718887 4751 kubelet.go:2542] "SyncLoop
(probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l7qh4"
Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.728584 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-hvd8w" podStartSLOduration=8.728568263 podStartE2EDuration="8.728568263s" podCreationTimestamp="2025-12-03 14:15:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:45.726312464 +0000 UTC m=+152.714667681" watchObservedRunningTime="2025-12-03 14:15:45.728568263 +0000 UTC m=+152.716923480"
Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.773593 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc"
Dec 03 14:15:45 crc kubenswrapper[4751]: E1203 14:15:45.775147 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:15:46.275130229 +0000 UTC m=+153.263485496 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2hknc" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.875206 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 14:15:45 crc kubenswrapper[4751]: E1203 14:15:45.875617 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:15:46.375602044 +0000 UTC m=+153.363957261 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 14:15:45 crc kubenswrapper[4751]: I1203 14:15:45.976944 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc"
Dec 03 14:15:45 crc kubenswrapper[4751]: E1203 14:15:45.977306 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:15:46.477288462 +0000 UTC m=+153.465643679 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2hknc" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 14:15:46 crc kubenswrapper[4751]: I1203 14:15:46.033090 4751 patch_prober.go:28] interesting pod/router-default-5444994796-qjtlv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 03 14:15:46 crc kubenswrapper[4751]: [-]has-synced failed: reason withheld
Dec 03 14:15:46 crc kubenswrapper[4751]: [+]process-running ok
Dec 03 14:15:46 crc kubenswrapper[4751]: healthz check failed
Dec 03 14:15:46 crc kubenswrapper[4751]: I1203 14:15:46.033164 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qjtlv" podUID="c4d1b134-55b3-4b2e-92da-b8c5416c13a5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 03 14:15:46 crc kubenswrapper[4751]: I1203 14:15:46.078015 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 14:15:46 crc kubenswrapper[4751]: E1203 14:15:46.078263 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 14:15:46.578248891 +0000 UTC m=+153.566604108 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 14:15:46 crc kubenswrapper[4751]: I1203 14:15:46.179374 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc"
Dec 03 14:15:46 crc kubenswrapper[4751]: E1203 14:15:46.179710 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:15:46.679695592 +0000 UTC m=+153.668050819 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2hknc" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 14:15:46 crc kubenswrapper[4751]: I1203 14:15:46.280352 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 14:15:46 crc kubenswrapper[4751]: E1203 14:15:46.281114 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:15:46.781096022 +0000 UTC m=+153.769451239 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 14:15:46 crc kubenswrapper[4751]: I1203 14:15:46.382803 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc"
Dec 03 14:15:46 crc kubenswrapper[4751]: E1203 14:15:46.383199 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:15:46.883185711 +0000 UTC m=+153.871540938 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2hknc" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 14:15:46 crc kubenswrapper[4751]: I1203 14:15:46.484212 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 14:15:46 crc kubenswrapper[4751]: E1203 14:15:46.484680 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:15:46.984623372 +0000 UTC m=+153.972978599 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 14:15:46 crc kubenswrapper[4751]: I1203 14:15:46.486811 4751 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Dec 03 14:15:46 crc kubenswrapper[4751]: I1203 14:15:46.502119 4751 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-03T14:15:46.48683024Z","Handler":null,"Name":""}
Dec 03 14:15:46 crc kubenswrapper[4751]: I1203 14:15:46.524017 4751 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Dec 03 14:15:46 crc kubenswrapper[4751]: I1203 14:15:46.524058 4751 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Dec 03 14:15:46 crc kubenswrapper[4751]: I1203 14:15:46.585726 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc"
Dec 03 14:15:46 crc kubenswrapper[4751]: I1203 14:15:46.606188 4751
csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 03 14:15:46 crc kubenswrapper[4751]: I1203 14:15:46.606224 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-2hknc"
Dec 03 14:15:46 crc kubenswrapper[4751]: I1203 14:15:46.628422 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7clk4" event={"ID":"e907ddd6-b1b8-4026-a500-9e066868ead1","Type":"ContainerStarted","Data":"6320e6288f430f38d6e24933a7bb3e43cf14cdd64039f09c8a7c79e6678afadc"}
Dec 03 14:15:46 crc kubenswrapper[4751]: I1203 14:15:46.628472 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7clk4" event={"ID":"e907ddd6-b1b8-4026-a500-9e066868ead1","Type":"ContainerStarted","Data":"40ce706971ad1d95bd6662a0db9ce017c6151f6a253ea2b7d0c7ec03f0cc318f"}
Dec 03 14:15:46 crc kubenswrapper[4751]: I1203 14:15:46.629751 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2jgqz" event={"ID":"b9e18ab9-2082-481b-9e18-5da0f81303bf","Type":"ContainerStarted","Data":"a80ad7acf7f6859d1f18b4e87ca468f68edfe0cba9ae8b0b01a5d16b4625f9d2"}
Dec 03 14:15:46 crc kubenswrapper[4751]: I1203 14:15:46.631840 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-2n4v9"
event={"ID":"ab66f440-24ed-4244-a972-63eee27b67b1","Type":"ContainerStarted","Data":"a3d5c39ab98ab990ce06ccd8cf32b3bc9b06be7dda9fa2ecce2bf2f22a153398"}
Dec 03 14:15:46 crc kubenswrapper[4751]: I1203 14:15:46.634206 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7j5x" event={"ID":"f799c142-06b3-4b7e-ba2a-4f11adbb175e","Type":"ContainerStarted","Data":"c60068b0ef9aec2eecdce285ade1552a986afba0015a70169a9f8bcaaa23a46d"}
Dec 03 14:15:46 crc kubenswrapper[4751]: I1203 14:15:46.635566 4751 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4c725 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body=
Dec 03 14:15:46 crc kubenswrapper[4751]: I1203 14:15:46.635622 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4c725" podUID="1e78e3f2-fa8c-4c8d-8dce-f087e67acf02" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused"
Dec 03 14:15:46 crc kubenswrapper[4751]: I1203 14:15:46.636740 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-zcq45"
Dec 03 14:15:46 crc kubenswrapper[4751]: I1203 14:15:46.656055 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-2jgqz" podStartSLOduration=130.65603707 podStartE2EDuration="2m10.65603707s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:46.653930434 +0000 UTC m=+153.642285671" watchObservedRunningTime="2025-12-03
14:15:46.65603707 +0000 UTC m=+153.644392287"
Dec 03 14:15:46 crc kubenswrapper[4751]: I1203 14:15:46.682028 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xz6ms" podStartSLOduration=130.682014499 podStartE2EDuration="2m10.682014499s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:46.67715398 +0000 UTC m=+153.665509197" watchObservedRunningTime="2025-12-03 14:15:46.682014499 +0000 UTC m=+153.670369706"
Dec 03 14:15:46 crc kubenswrapper[4751]: I1203 14:15:46.706677 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2hknc\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " pod="openshift-image-registry/image-registry-697d97f7c8-2hknc"
Dec 03 14:15:46 crc kubenswrapper[4751]: I1203 14:15:46.730603 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-7nphl" podStartSLOduration=9.730587257 podStartE2EDuration="9.730587257s" podCreationTimestamp="2025-12-03 14:15:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:46.730511915 +0000 UTC m=+153.718867132" watchObservedRunningTime="2025-12-03 14:15:46.730587257 +0000 UTC m=+153.718942464"
Dec 03 14:15:46 crc kubenswrapper[4751]: I1203 14:15:46.758941 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-2n4v9" podStartSLOduration=130.758926259 podStartE2EDuration="2m10.758926259s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC"
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:46.758227811 +0000 UTC m=+153.746583028" watchObservedRunningTime="2025-12-03 14:15:46.758926259 +0000 UTC m=+153.747281476"
Dec 03 14:15:46 crc kubenswrapper[4751]: I1203 14:15:46.788666 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 14:15:46 crc kubenswrapper[4751]: I1203 14:15:46.814960 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" podStartSLOduration=131.814940135 podStartE2EDuration="2m11.814940135s" podCreationTimestamp="2025-12-03 14:13:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:46.786508581 +0000 UTC m=+153.774863798" watchObservedRunningTime="2025-12-03 14:15:46.814940135 +0000 UTC m=+153.803295352"
Dec 03 14:15:46 crc kubenswrapper[4751]: I1203 14:15:46.815992 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7j5x" podStartSLOduration=130.815986593 podStartE2EDuration="2m10.815986593s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:46.814014821 +0000 UTC m=+153.802370038" watchObservedRunningTime="2025-12-03 14:15:46.815986593 +0000 UTC m=+153.804341810"
Dec 03 14:15:46 crc kubenswrapper[4751]: I1203 14:15:46.847965 4751 pod_startup_latency_tracker.go:104] "Observed pod startup
duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-js4nc" podStartSLOduration=130.847950801 podStartE2EDuration="2m10.847950801s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:46.843661377 +0000 UTC m=+153.832016594" watchObservedRunningTime="2025-12-03 14:15:46.847950801 +0000 UTC m=+153.836306018"
Dec 03 14:15:46 crc kubenswrapper[4751]: I1203 14:15:46.961184 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Dec 03 14:15:46 crc kubenswrapper[4751]: I1203 14:15:46.970557 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2hknc"
Dec 03 14:15:47 crc kubenswrapper[4751]: I1203 14:15:47.028170 4751 patch_prober.go:28] interesting pod/router-default-5444994796-qjtlv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 03 14:15:47 crc kubenswrapper[4751]: [-]has-synced failed: reason withheld
Dec 03 14:15:47 crc kubenswrapper[4751]: [+]process-running ok
Dec 03 14:15:47 crc kubenswrapper[4751]: healthz check failed
Dec 03 14:15:47 crc kubenswrapper[4751]: I1203 14:15:47.028221 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qjtlv" podUID="c4d1b134-55b3-4b2e-92da-b8c5416c13a5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 03 14:15:47 crc kubenswrapper[4751]: I1203 14:15:47.057770 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-zcq45"
Dec 03 14:15:47 crc kubenswrapper[4751]: I1203 14:15:47.280224 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2hknc"]
Dec 03 14:15:47 crc kubenswrapper[4751]: I1203 14:15:47.326136 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Dec 03 14:15:47 crc kubenswrapper[4751]: I1203 14:15:47.642170 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7clk4" event={"ID":"e907ddd6-b1b8-4026-a500-9e066868ead1","Type":"ContainerStarted","Data":"02a8aebd2061f8597feb0d3ee23939d0d7fe1324a7e20f5fab5bbc847bea0c3c"}
Dec 03 14:15:47 crc kubenswrapper[4751]: I1203 14:15:47.642535 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="hostpath-provisioner/csi-hostpathplugin-7clk4" event={"ID":"e907ddd6-b1b8-4026-a500-9e066868ead1","Type":"ContainerStarted","Data":"ee574c19cde67687a11efbe04e59329974b2ea756c36e93fa4cc94998f69f2f8"} Dec 03 14:15:47 crc kubenswrapper[4751]: I1203 14:15:47.644055 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" event={"ID":"2086abe6-48e1-4593-9789-b098b9b3142d","Type":"ContainerStarted","Data":"b44b70a3481533fc57c66c73b51ca98789982dc423c05e97e1efc91dbb2e9954"} Dec 03 14:15:47 crc kubenswrapper[4751]: I1203 14:15:47.644099 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:15:47 crc kubenswrapper[4751]: I1203 14:15:47.644111 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" event={"ID":"2086abe6-48e1-4593-9789-b098b9b3142d","Type":"ContainerStarted","Data":"3aa699ccdb732b58dd927b0a6ab9c956dab41ff2aea1795ae78328b77f06fe8f"} Dec 03 14:15:47 crc kubenswrapper[4751]: I1203 14:15:47.646401 4751 generic.go:334] "Generic (PLEG): container finished" podID="eb81b718-8bc4-4c3e-9ec6-472c62d377a2" containerID="390fd1508d98eae2ac14c39e28cc442f7ec0dcd6a68471b090edf84b59cfb904" exitCode=0 Dec 03 14:15:47 crc kubenswrapper[4751]: I1203 14:15:47.646792 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-vqk8g" event={"ID":"eb81b718-8bc4-4c3e-9ec6-472c62d377a2","Type":"ContainerDied","Data":"390fd1508d98eae2ac14c39e28cc442f7ec0dcd6a68471b090edf84b59cfb904"} Dec 03 14:15:47 crc kubenswrapper[4751]: I1203 14:15:47.649108 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-4c725" Dec 03 14:15:47 crc kubenswrapper[4751]: I1203 14:15:47.670179 4751 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="hostpath-provisioner/csi-hostpathplugin-7clk4" podStartSLOduration=10.670162865 podStartE2EDuration="10.670162865s" podCreationTimestamp="2025-12-03 14:15:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:47.667881544 +0000 UTC m=+154.656236761" watchObservedRunningTime="2025-12-03 14:15:47.670162865 +0000 UTC m=+154.658518082" Dec 03 14:15:47 crc kubenswrapper[4751]: I1203 14:15:47.701938 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" podStartSLOduration=131.701919987 podStartE2EDuration="2m11.701919987s" podCreationTimestamp="2025-12-03 14:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:47.686705164 +0000 UTC m=+154.675060381" watchObservedRunningTime="2025-12-03 14:15:47.701919987 +0000 UTC m=+154.690275204" Dec 03 14:15:47 crc kubenswrapper[4751]: I1203 14:15:47.736956 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4brfr"] Dec 03 14:15:47 crc kubenswrapper[4751]: I1203 14:15:47.737949 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4brfr" Dec 03 14:15:47 crc kubenswrapper[4751]: I1203 14:15:47.739918 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 03 14:15:47 crc kubenswrapper[4751]: I1203 14:15:47.785799 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4brfr"] Dec 03 14:15:47 crc kubenswrapper[4751]: I1203 14:15:47.816752 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmcn2\" (UniqueName: \"kubernetes.io/projected/0878495c-85e0-469c-bee0-a4f6ce70d873-kube-api-access-lmcn2\") pod \"community-operators-4brfr\" (UID: \"0878495c-85e0-469c-bee0-a4f6ce70d873\") " pod="openshift-marketplace/community-operators-4brfr" Dec 03 14:15:47 crc kubenswrapper[4751]: I1203 14:15:47.816788 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0878495c-85e0-469c-bee0-a4f6ce70d873-catalog-content\") pod \"community-operators-4brfr\" (UID: \"0878495c-85e0-469c-bee0-a4f6ce70d873\") " pod="openshift-marketplace/community-operators-4brfr" Dec 03 14:15:47 crc kubenswrapper[4751]: I1203 14:15:47.816924 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0878495c-85e0-469c-bee0-a4f6ce70d873-utilities\") pod \"community-operators-4brfr\" (UID: \"0878495c-85e0-469c-bee0-a4f6ce70d873\") " pod="openshift-marketplace/community-operators-4brfr" Dec 03 14:15:47 crc kubenswrapper[4751]: I1203 14:15:47.917679 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0878495c-85e0-469c-bee0-a4f6ce70d873-catalog-content\") pod \"community-operators-4brfr\" (UID: 
\"0878495c-85e0-469c-bee0-a4f6ce70d873\") " pod="openshift-marketplace/community-operators-4brfr" Dec 03 14:15:47 crc kubenswrapper[4751]: I1203 14:15:47.917737 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0878495c-85e0-469c-bee0-a4f6ce70d873-utilities\") pod \"community-operators-4brfr\" (UID: \"0878495c-85e0-469c-bee0-a4f6ce70d873\") " pod="openshift-marketplace/community-operators-4brfr" Dec 03 14:15:47 crc kubenswrapper[4751]: I1203 14:15:47.917817 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmcn2\" (UniqueName: \"kubernetes.io/projected/0878495c-85e0-469c-bee0-a4f6ce70d873-kube-api-access-lmcn2\") pod \"community-operators-4brfr\" (UID: \"0878495c-85e0-469c-bee0-a4f6ce70d873\") " pod="openshift-marketplace/community-operators-4brfr" Dec 03 14:15:47 crc kubenswrapper[4751]: I1203 14:15:47.918219 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0878495c-85e0-469c-bee0-a4f6ce70d873-catalog-content\") pod \"community-operators-4brfr\" (UID: \"0878495c-85e0-469c-bee0-a4f6ce70d873\") " pod="openshift-marketplace/community-operators-4brfr" Dec 03 14:15:47 crc kubenswrapper[4751]: I1203 14:15:47.918510 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0878495c-85e0-469c-bee0-a4f6ce70d873-utilities\") pod \"community-operators-4brfr\" (UID: \"0878495c-85e0-469c-bee0-a4f6ce70d873\") " pod="openshift-marketplace/community-operators-4brfr" Dec 03 14:15:47 crc kubenswrapper[4751]: I1203 14:15:47.930823 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cclkx"] Dec 03 14:15:47 crc kubenswrapper[4751]: I1203 14:15:47.931749 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cclkx" Dec 03 14:15:47 crc kubenswrapper[4751]: I1203 14:15:47.940472 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 03 14:15:47 crc kubenswrapper[4751]: I1203 14:15:47.944984 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cclkx"] Dec 03 14:15:47 crc kubenswrapper[4751]: I1203 14:15:47.949174 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmcn2\" (UniqueName: \"kubernetes.io/projected/0878495c-85e0-469c-bee0-a4f6ce70d873-kube-api-access-lmcn2\") pod \"community-operators-4brfr\" (UID: \"0878495c-85e0-469c-bee0-a4f6ce70d873\") " pod="openshift-marketplace/community-operators-4brfr" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.019245 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5bfbffc-4818-4710-9c57-a2a4f298bfe2-catalog-content\") pod \"certified-operators-cclkx\" (UID: \"d5bfbffc-4818-4710-9c57-a2a4f298bfe2\") " pod="openshift-marketplace/certified-operators-cclkx" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.019378 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5bfbffc-4818-4710-9c57-a2a4f298bfe2-utilities\") pod \"certified-operators-cclkx\" (UID: \"d5bfbffc-4818-4710-9c57-a2a4f298bfe2\") " pod="openshift-marketplace/certified-operators-cclkx" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.019523 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sbdm\" (UniqueName: \"kubernetes.io/projected/d5bfbffc-4818-4710-9c57-a2a4f298bfe2-kube-api-access-2sbdm\") pod \"certified-operators-cclkx\" (UID: 
\"d5bfbffc-4818-4710-9c57-a2a4f298bfe2\") " pod="openshift-marketplace/certified-operators-cclkx" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.026757 4751 patch_prober.go:28] interesting pod/router-default-5444994796-qjtlv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 14:15:48 crc kubenswrapper[4751]: [-]has-synced failed: reason withheld Dec 03 14:15:48 crc kubenswrapper[4751]: [+]process-running ok Dec 03 14:15:48 crc kubenswrapper[4751]: healthz check failed Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.027070 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qjtlv" podUID="c4d1b134-55b3-4b2e-92da-b8c5416c13a5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.053730 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4brfr" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.120388 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5bfbffc-4818-4710-9c57-a2a4f298bfe2-utilities\") pod \"certified-operators-cclkx\" (UID: \"d5bfbffc-4818-4710-9c57-a2a4f298bfe2\") " pod="openshift-marketplace/certified-operators-cclkx" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.120519 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sbdm\" (UniqueName: \"kubernetes.io/projected/d5bfbffc-4818-4710-9c57-a2a4f298bfe2-kube-api-access-2sbdm\") pod \"certified-operators-cclkx\" (UID: \"d5bfbffc-4818-4710-9c57-a2a4f298bfe2\") " pod="openshift-marketplace/certified-operators-cclkx" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.120948 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5bfbffc-4818-4710-9c57-a2a4f298bfe2-catalog-content\") pod \"certified-operators-cclkx\" (UID: \"d5bfbffc-4818-4710-9c57-a2a4f298bfe2\") " pod="openshift-marketplace/certified-operators-cclkx" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.121138 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5bfbffc-4818-4710-9c57-a2a4f298bfe2-utilities\") pod \"certified-operators-cclkx\" (UID: \"d5bfbffc-4818-4710-9c57-a2a4f298bfe2\") " pod="openshift-marketplace/certified-operators-cclkx" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.121408 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5bfbffc-4818-4710-9c57-a2a4f298bfe2-catalog-content\") pod \"certified-operators-cclkx\" (UID: \"d5bfbffc-4818-4710-9c57-a2a4f298bfe2\") " 
pod="openshift-marketplace/certified-operators-cclkx" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.139039 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jntcp"] Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.140216 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jntcp" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.151279 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jntcp"] Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.158988 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.170934 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-mc2fd" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.171003 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-pwknx" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.176537 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sbdm\" (UniqueName: \"kubernetes.io/projected/d5bfbffc-4818-4710-9c57-a2a4f298bfe2-kube-api-access-2sbdm\") pod \"certified-operators-cclkx\" (UID: \"d5bfbffc-4818-4710-9c57-a2a4f298bfe2\") " pod="openshift-marketplace/certified-operators-cclkx" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.222091 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b17b5e87-a2c1-446f-ac2f-52e7f6b608c0-catalog-content\") pod \"community-operators-jntcp\" (UID: \"b17b5e87-a2c1-446f-ac2f-52e7f6b608c0\") " pod="openshift-marketplace/community-operators-jntcp" Dec 03 14:15:48 crc 
kubenswrapper[4751]: I1203 14:15:48.222141 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcvlv\" (UniqueName: \"kubernetes.io/projected/b17b5e87-a2c1-446f-ac2f-52e7f6b608c0-kube-api-access-pcvlv\") pod \"community-operators-jntcp\" (UID: \"b17b5e87-a2c1-446f-ac2f-52e7f6b608c0\") " pod="openshift-marketplace/community-operators-jntcp" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.222172 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b17b5e87-a2c1-446f-ac2f-52e7f6b608c0-utilities\") pod \"community-operators-jntcp\" (UID: \"b17b5e87-a2c1-446f-ac2f-52e7f6b608c0\") " pod="openshift-marketplace/community-operators-jntcp" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.246224 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cclkx" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.326176 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b17b5e87-a2c1-446f-ac2f-52e7f6b608c0-catalog-content\") pod \"community-operators-jntcp\" (UID: \"b17b5e87-a2c1-446f-ac2f-52e7f6b608c0\") " pod="openshift-marketplace/community-operators-jntcp" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.326224 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcvlv\" (UniqueName: \"kubernetes.io/projected/b17b5e87-a2c1-446f-ac2f-52e7f6b608c0-kube-api-access-pcvlv\") pod \"community-operators-jntcp\" (UID: \"b17b5e87-a2c1-446f-ac2f-52e7f6b608c0\") " pod="openshift-marketplace/community-operators-jntcp" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.326309 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b17b5e87-a2c1-446f-ac2f-52e7f6b608c0-utilities\") pod \"community-operators-jntcp\" (UID: \"b17b5e87-a2c1-446f-ac2f-52e7f6b608c0\") " pod="openshift-marketplace/community-operators-jntcp" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.328499 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b17b5e87-a2c1-446f-ac2f-52e7f6b608c0-catalog-content\") pod \"community-operators-jntcp\" (UID: \"b17b5e87-a2c1-446f-ac2f-52e7f6b608c0\") " pod="openshift-marketplace/community-operators-jntcp" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.329670 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b17b5e87-a2c1-446f-ac2f-52e7f6b608c0-utilities\") pod \"community-operators-jntcp\" (UID: \"b17b5e87-a2c1-446f-ac2f-52e7f6b608c0\") " pod="openshift-marketplace/community-operators-jntcp" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.338729 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qqbc4"] Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.340371 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qqbc4" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.347788 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qqbc4"] Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.356636 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcvlv\" (UniqueName: \"kubernetes.io/projected/b17b5e87-a2c1-446f-ac2f-52e7f6b608c0-kube-api-access-pcvlv\") pod \"community-operators-jntcp\" (UID: \"b17b5e87-a2c1-446f-ac2f-52e7f6b608c0\") " pod="openshift-marketplace/community-operators-jntcp" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.392290 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-djkdm" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.392354 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-djkdm" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.393572 4751 patch_prober.go:28] interesting pod/console-f9d7485db-djkdm container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.393628 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-djkdm" podUID="a112208b-e069-48e0-8bc4-d6c4e79052fc" containerName="console" probeResult="failure" output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.427248 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs6sf\" (UniqueName: \"kubernetes.io/projected/0e46a326-4633-4c4b-aac9-700b969ef961-kube-api-access-xs6sf\") 
pod \"certified-operators-qqbc4\" (UID: \"0e46a326-4633-4c4b-aac9-700b969ef961\") " pod="openshift-marketplace/certified-operators-qqbc4" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.427294 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e46a326-4633-4c4b-aac9-700b969ef961-catalog-content\") pod \"certified-operators-qqbc4\" (UID: \"0e46a326-4633-4c4b-aac9-700b969ef961\") " pod="openshift-marketplace/certified-operators-qqbc4" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.427385 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e46a326-4633-4c4b-aac9-700b969ef961-utilities\") pod \"certified-operators-qqbc4\" (UID: \"0e46a326-4633-4c4b-aac9-700b969ef961\") " pod="openshift-marketplace/certified-operators-qqbc4" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.467103 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jntcp" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.522135 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cclkx"] Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.528939 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs6sf\" (UniqueName: \"kubernetes.io/projected/0e46a326-4633-4c4b-aac9-700b969ef961-kube-api-access-xs6sf\") pod \"certified-operators-qqbc4\" (UID: \"0e46a326-4633-4c4b-aac9-700b969ef961\") " pod="openshift-marketplace/certified-operators-qqbc4" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.528987 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e46a326-4633-4c4b-aac9-700b969ef961-catalog-content\") pod \"certified-operators-qqbc4\" (UID: \"0e46a326-4633-4c4b-aac9-700b969ef961\") " pod="openshift-marketplace/certified-operators-qqbc4" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.529047 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e46a326-4633-4c4b-aac9-700b969ef961-utilities\") pod \"certified-operators-qqbc4\" (UID: \"0e46a326-4633-4c4b-aac9-700b969ef961\") " pod="openshift-marketplace/certified-operators-qqbc4" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.530175 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e46a326-4633-4c4b-aac9-700b969ef961-utilities\") pod \"certified-operators-qqbc4\" (UID: \"0e46a326-4633-4c4b-aac9-700b969ef961\") " pod="openshift-marketplace/certified-operators-qqbc4" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.530396 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/0e46a326-4633-4c4b-aac9-700b969ef961-catalog-content\") pod \"certified-operators-qqbc4\" (UID: \"0e46a326-4633-4c4b-aac9-700b969ef961\") " pod="openshift-marketplace/certified-operators-qqbc4" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.548288 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs6sf\" (UniqueName: \"kubernetes.io/projected/0e46a326-4633-4c4b-aac9-700b969ef961-kube-api-access-xs6sf\") pod \"certified-operators-qqbc4\" (UID: \"0e46a326-4633-4c4b-aac9-700b969ef961\") " pod="openshift-marketplace/certified-operators-qqbc4" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.572055 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4brfr"] Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.659054 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qqbc4" Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.662679 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cclkx" event={"ID":"d5bfbffc-4818-4710-9c57-a2a4f298bfe2","Type":"ContainerStarted","Data":"91678aa47dfdc3d655abb9bab46b5368f515ddc01aa027910de2213d8b658f67"} Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.664134 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4brfr" event={"ID":"0878495c-85e0-469c-bee0-a4f6ce70d873","Type":"ContainerStarted","Data":"1f5d0977df47123d85185c3245d15b76bf60f8b94a31f2caf48ea634a27e5eb3"} Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.819443 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jntcp"] Dec 03 14:15:48 crc kubenswrapper[4751]: I1203 14:15:48.896196 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qqbc4"] Dec 03 14:15:48 
crc kubenswrapper[4751]: I1203 14:15:48.935724 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-vqk8g" Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.027643 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-qjtlv" Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.028422 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-qjtlv" Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.034735 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-qjtlv" Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.039303 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jl4fx\" (UniqueName: \"kubernetes.io/projected/eb81b718-8bc4-4c3e-9ec6-472c62d377a2-kube-api-access-jl4fx\") pod \"eb81b718-8bc4-4c3e-9ec6-472c62d377a2\" (UID: \"eb81b718-8bc4-4c3e-9ec6-472c62d377a2\") " Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.039357 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb81b718-8bc4-4c3e-9ec6-472c62d377a2-config-volume\") pod \"eb81b718-8bc4-4c3e-9ec6-472c62d377a2\" (UID: \"eb81b718-8bc4-4c3e-9ec6-472c62d377a2\") " Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.039404 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eb81b718-8bc4-4c3e-9ec6-472c62d377a2-secret-volume\") pod \"eb81b718-8bc4-4c3e-9ec6-472c62d377a2\" (UID: \"eb81b718-8bc4-4c3e-9ec6-472c62d377a2\") " Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.041992 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/eb81b718-8bc4-4c3e-9ec6-472c62d377a2-config-volume" (OuterVolumeSpecName: "config-volume") pod "eb81b718-8bc4-4c3e-9ec6-472c62d377a2" (UID: "eb81b718-8bc4-4c3e-9ec6-472c62d377a2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.045422 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb81b718-8bc4-4c3e-9ec6-472c62d377a2-kube-api-access-jl4fx" (OuterVolumeSpecName: "kube-api-access-jl4fx") pod "eb81b718-8bc4-4c3e-9ec6-472c62d377a2" (UID: "eb81b718-8bc4-4c3e-9ec6-472c62d377a2"). InnerVolumeSpecName "kube-api-access-jl4fx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.046856 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb81b718-8bc4-4c3e-9ec6-472c62d377a2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "eb81b718-8bc4-4c3e-9ec6-472c62d377a2" (UID: "eb81b718-8bc4-4c3e-9ec6-472c62d377a2"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.141237 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jl4fx\" (UniqueName: \"kubernetes.io/projected/eb81b718-8bc4-4c3e-9ec6-472c62d377a2-kube-api-access-jl4fx\") on node \"crc\" DevicePath \"\"" Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.141796 4751 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb81b718-8bc4-4c3e-9ec6-472c62d377a2-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.141810 4751 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eb81b718-8bc4-4c3e-9ec6-472c62d377a2-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.670863 4751 generic.go:334] "Generic (PLEG): container finished" podID="d5bfbffc-4818-4710-9c57-a2a4f298bfe2" containerID="116bdd5cda6f9e3b2464a93161e5d7994ec200f11f66e9d59d3c2eb30e9f25ea" exitCode=0 Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.670944 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cclkx" event={"ID":"d5bfbffc-4818-4710-9c57-a2a4f298bfe2","Type":"ContainerDied","Data":"116bdd5cda6f9e3b2464a93161e5d7994ec200f11f66e9d59d3c2eb30e9f25ea"} Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.672517 4751 generic.go:334] "Generic (PLEG): container finished" podID="0e46a326-4633-4c4b-aac9-700b969ef961" containerID="5143b34b8a2914f7369992f1349d25227c33d52373093041214c1e7198ba2b6f" exitCode=0 Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.672580 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qqbc4" 
event={"ID":"0e46a326-4633-4c4b-aac9-700b969ef961","Type":"ContainerDied","Data":"5143b34b8a2914f7369992f1349d25227c33d52373093041214c1e7198ba2b6f"} Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.672605 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qqbc4" event={"ID":"0e46a326-4633-4c4b-aac9-700b969ef961","Type":"ContainerStarted","Data":"ecfbecd04f4be53aede60a3b82b3cac2062d27163bfc4c4460fb329323de5fb0"} Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.673500 4751 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.674041 4751 generic.go:334] "Generic (PLEG): container finished" podID="b17b5e87-a2c1-446f-ac2f-52e7f6b608c0" containerID="1c3dd1aecf771d3ebbea8d1fdb8ebaefb4152bf809e7e0147856183757f7341f" exitCode=0 Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.674091 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jntcp" event={"ID":"b17b5e87-a2c1-446f-ac2f-52e7f6b608c0","Type":"ContainerDied","Data":"1c3dd1aecf771d3ebbea8d1fdb8ebaefb4152bf809e7e0147856183757f7341f"} Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.674110 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jntcp" event={"ID":"b17b5e87-a2c1-446f-ac2f-52e7f6b608c0","Type":"ContainerStarted","Data":"770c6be5652b14896860b43c989e9f28b12f1b668df6da7b91b2d2997b8162de"} Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.676435 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-vqk8g" Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.676431 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-vqk8g" event={"ID":"eb81b718-8bc4-4c3e-9ec6-472c62d377a2","Type":"ContainerDied","Data":"ce04abd08f7c07e7557629f37e5b625b032ffe8005bf78410dea8a73e332fabd"} Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.676474 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce04abd08f7c07e7557629f37e5b625b032ffe8005bf78410dea8a73e332fabd" Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.679045 4751 generic.go:334] "Generic (PLEG): container finished" podID="0878495c-85e0-469c-bee0-a4f6ce70d873" containerID="d93a8983f54e13cd7a0c70c9f936831a5af51e553c89b71af10c685faeb8e8c0" exitCode=0 Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.679100 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4brfr" event={"ID":"0878495c-85e0-469c-bee0-a4f6ce70d873","Type":"ContainerDied","Data":"d93a8983f54e13cd7a0c70c9f936831a5af51e553c89b71af10c685faeb8e8c0"} Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.735377 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-g592f"] Dec 03 14:15:49 crc kubenswrapper[4751]: E1203 14:15:49.735787 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb81b718-8bc4-4c3e-9ec6-472c62d377a2" containerName="collect-profiles" Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.735800 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb81b718-8bc4-4c3e-9ec6-472c62d377a2" containerName="collect-profiles" Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.735892 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb81b718-8bc4-4c3e-9ec6-472c62d377a2" 
containerName="collect-profiles" Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.736625 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g592f" Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.738903 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.758771 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g592f"] Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.849598 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqzkt\" (UniqueName: \"kubernetes.io/projected/3004e0b4-ba13-417b-88ef-439481ef93f4-kube-api-access-kqzkt\") pod \"redhat-marketplace-g592f\" (UID: \"3004e0b4-ba13-417b-88ef-439481ef93f4\") " pod="openshift-marketplace/redhat-marketplace-g592f" Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.849764 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3004e0b4-ba13-417b-88ef-439481ef93f4-utilities\") pod \"redhat-marketplace-g592f\" (UID: \"3004e0b4-ba13-417b-88ef-439481ef93f4\") " pod="openshift-marketplace/redhat-marketplace-g592f" Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.849858 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3004e0b4-ba13-417b-88ef-439481ef93f4-catalog-content\") pod \"redhat-marketplace-g592f\" (UID: \"3004e0b4-ba13-417b-88ef-439481ef93f4\") " pod="openshift-marketplace/redhat-marketplace-g592f" Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.951224 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/3004e0b4-ba13-417b-88ef-439481ef93f4-utilities\") pod \"redhat-marketplace-g592f\" (UID: \"3004e0b4-ba13-417b-88ef-439481ef93f4\") " pod="openshift-marketplace/redhat-marketplace-g592f" Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.951278 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3004e0b4-ba13-417b-88ef-439481ef93f4-catalog-content\") pod \"redhat-marketplace-g592f\" (UID: \"3004e0b4-ba13-417b-88ef-439481ef93f4\") " pod="openshift-marketplace/redhat-marketplace-g592f" Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.951321 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqzkt\" (UniqueName: \"kubernetes.io/projected/3004e0b4-ba13-417b-88ef-439481ef93f4-kube-api-access-kqzkt\") pod \"redhat-marketplace-g592f\" (UID: \"3004e0b4-ba13-417b-88ef-439481ef93f4\") " pod="openshift-marketplace/redhat-marketplace-g592f" Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.951865 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3004e0b4-ba13-417b-88ef-439481ef93f4-utilities\") pod \"redhat-marketplace-g592f\" (UID: \"3004e0b4-ba13-417b-88ef-439481ef93f4\") " pod="openshift-marketplace/redhat-marketplace-g592f" Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.951872 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3004e0b4-ba13-417b-88ef-439481ef93f4-catalog-content\") pod \"redhat-marketplace-g592f\" (UID: \"3004e0b4-ba13-417b-88ef-439481ef93f4\") " pod="openshift-marketplace/redhat-marketplace-g592f" Dec 03 14:15:49 crc kubenswrapper[4751]: I1203 14:15:49.975170 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqzkt\" (UniqueName: 
\"kubernetes.io/projected/3004e0b4-ba13-417b-88ef-439481ef93f4-kube-api-access-kqzkt\") pod \"redhat-marketplace-g592f\" (UID: \"3004e0b4-ba13-417b-88ef-439481ef93f4\") " pod="openshift-marketplace/redhat-marketplace-g592f" Dec 03 14:15:50 crc kubenswrapper[4751]: I1203 14:15:50.052663 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g592f" Dec 03 14:15:50 crc kubenswrapper[4751]: I1203 14:15:50.139158 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qrskc"] Dec 03 14:15:50 crc kubenswrapper[4751]: I1203 14:15:50.140311 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qrskc" Dec 03 14:15:50 crc kubenswrapper[4751]: I1203 14:15:50.158900 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qrskc"] Dec 03 14:15:50 crc kubenswrapper[4751]: I1203 14:15:50.191653 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 14:15:50 crc kubenswrapper[4751]: I1203 14:15:50.192298 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 14:15:50 crc kubenswrapper[4751]: I1203 14:15:50.197012 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 03 14:15:50 crc kubenswrapper[4751]: I1203 14:15:50.197203 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 03 14:15:50 crc kubenswrapper[4751]: I1203 14:15:50.208103 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 14:15:50 crc kubenswrapper[4751]: I1203 14:15:50.255703 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/60859cef-2d02-45a7-9f4a-d03bb3a6118a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"60859cef-2d02-45a7-9f4a-d03bb3a6118a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 14:15:50 crc kubenswrapper[4751]: I1203 14:15:50.255756 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bddd0747-ddb2-40ec-8912-6941942460bc-catalog-content\") pod \"redhat-marketplace-qrskc\" (UID: \"bddd0747-ddb2-40ec-8912-6941942460bc\") " pod="openshift-marketplace/redhat-marketplace-qrskc" Dec 03 14:15:50 crc kubenswrapper[4751]: I1203 14:15:50.255839 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bt5h\" (UniqueName: \"kubernetes.io/projected/bddd0747-ddb2-40ec-8912-6941942460bc-kube-api-access-4bt5h\") pod \"redhat-marketplace-qrskc\" (UID: \"bddd0747-ddb2-40ec-8912-6941942460bc\") " pod="openshift-marketplace/redhat-marketplace-qrskc" Dec 03 14:15:50 crc kubenswrapper[4751]: I1203 14:15:50.256026 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/60859cef-2d02-45a7-9f4a-d03bb3a6118a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"60859cef-2d02-45a7-9f4a-d03bb3a6118a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 14:15:50 crc kubenswrapper[4751]: I1203 14:15:50.256107 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bddd0747-ddb2-40ec-8912-6941942460bc-utilities\") pod \"redhat-marketplace-qrskc\" (UID: \"bddd0747-ddb2-40ec-8912-6941942460bc\") " pod="openshift-marketplace/redhat-marketplace-qrskc" Dec 03 14:15:50 crc kubenswrapper[4751]: I1203 14:15:50.317110 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g592f"] Dec 03 14:15:50 crc kubenswrapper[4751]: W1203 14:15:50.356061 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3004e0b4_ba13_417b_88ef_439481ef93f4.slice/crio-603a4cc6178b2cde26d2a43f1654192467625813d3337fe9b8b634dcdb09d6a2 WatchSource:0}: Error finding container 603a4cc6178b2cde26d2a43f1654192467625813d3337fe9b8b634dcdb09d6a2: Status 404 returned error can't find the container with id 603a4cc6178b2cde26d2a43f1654192467625813d3337fe9b8b634dcdb09d6a2 Dec 03 14:15:50 crc kubenswrapper[4751]: I1203 14:15:50.357304 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bt5h\" (UniqueName: \"kubernetes.io/projected/bddd0747-ddb2-40ec-8912-6941942460bc-kube-api-access-4bt5h\") pod \"redhat-marketplace-qrskc\" (UID: \"bddd0747-ddb2-40ec-8912-6941942460bc\") " pod="openshift-marketplace/redhat-marketplace-qrskc" Dec 03 14:15:50 crc kubenswrapper[4751]: I1203 14:15:50.357380 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/60859cef-2d02-45a7-9f4a-d03bb3a6118a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"60859cef-2d02-45a7-9f4a-d03bb3a6118a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 14:15:50 crc kubenswrapper[4751]: I1203 14:15:50.357408 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bddd0747-ddb2-40ec-8912-6941942460bc-utilities\") pod \"redhat-marketplace-qrskc\" (UID: \"bddd0747-ddb2-40ec-8912-6941942460bc\") " pod="openshift-marketplace/redhat-marketplace-qrskc" Dec 03 14:15:50 crc kubenswrapper[4751]: I1203 14:15:50.357434 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/60859cef-2d02-45a7-9f4a-d03bb3a6118a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"60859cef-2d02-45a7-9f4a-d03bb3a6118a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 14:15:50 crc kubenswrapper[4751]: I1203 14:15:50.357451 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bddd0747-ddb2-40ec-8912-6941942460bc-catalog-content\") pod \"redhat-marketplace-qrskc\" (UID: \"bddd0747-ddb2-40ec-8912-6941942460bc\") " pod="openshift-marketplace/redhat-marketplace-qrskc" Dec 03 14:15:50 crc kubenswrapper[4751]: I1203 14:15:50.357664 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/60859cef-2d02-45a7-9f4a-d03bb3a6118a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"60859cef-2d02-45a7-9f4a-d03bb3a6118a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 14:15:50 crc kubenswrapper[4751]: I1203 14:15:50.357927 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/bddd0747-ddb2-40ec-8912-6941942460bc-catalog-content\") pod \"redhat-marketplace-qrskc\" (UID: \"bddd0747-ddb2-40ec-8912-6941942460bc\") " pod="openshift-marketplace/redhat-marketplace-qrskc" Dec 03 14:15:50 crc kubenswrapper[4751]: I1203 14:15:50.358002 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bddd0747-ddb2-40ec-8912-6941942460bc-utilities\") pod \"redhat-marketplace-qrskc\" (UID: \"bddd0747-ddb2-40ec-8912-6941942460bc\") " pod="openshift-marketplace/redhat-marketplace-qrskc" Dec 03 14:15:50 crc kubenswrapper[4751]: I1203 14:15:50.382848 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/60859cef-2d02-45a7-9f4a-d03bb3a6118a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"60859cef-2d02-45a7-9f4a-d03bb3a6118a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 14:15:50 crc kubenswrapper[4751]: I1203 14:15:50.383482 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bt5h\" (UniqueName: \"kubernetes.io/projected/bddd0747-ddb2-40ec-8912-6941942460bc-kube-api-access-4bt5h\") pod \"redhat-marketplace-qrskc\" (UID: \"bddd0747-ddb2-40ec-8912-6941942460bc\") " pod="openshift-marketplace/redhat-marketplace-qrskc" Dec 03 14:15:50 crc kubenswrapper[4751]: I1203 14:15:50.456413 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qrskc" Dec 03 14:15:50 crc kubenswrapper[4751]: I1203 14:15:50.519734 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 14:15:50 crc kubenswrapper[4751]: I1203 14:15:50.684308 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g592f" event={"ID":"3004e0b4-ba13-417b-88ef-439481ef93f4","Type":"ContainerStarted","Data":"603a4cc6178b2cde26d2a43f1654192467625813d3337fe9b8b634dcdb09d6a2"} Dec 03 14:15:50 crc kubenswrapper[4751]: I1203 14:15:50.951200 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qrskc"] Dec 03 14:15:50 crc kubenswrapper[4751]: W1203 14:15:50.961301 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbddd0747_ddb2_40ec_8912_6941942460bc.slice/crio-76dde61231b543f2da26493e73588b522f5e9f49444224ef8e94503f7c5d4119 WatchSource:0}: Error finding container 76dde61231b543f2da26493e73588b522f5e9f49444224ef8e94503f7c5d4119: Status 404 returned error can't find the container with id 76dde61231b543f2da26493e73588b522f5e9f49444224ef8e94503f7c5d4119 Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.002770 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 14:15:51 crc kubenswrapper[4751]: W1203 14:15:51.008703 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod60859cef_2d02_45a7_9f4a_d03bb3a6118a.slice/crio-cab903dd9456a2827dbfcac5148a74eddab68ec799a56829cdc0f8fe3200f682 WatchSource:0}: Error finding container cab903dd9456a2827dbfcac5148a74eddab68ec799a56829cdc0f8fe3200f682: Status 404 returned error can't find the container with id cab903dd9456a2827dbfcac5148a74eddab68ec799a56829cdc0f8fe3200f682 Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.134382 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-drt67"] Dec 03 14:15:51 crc 
kubenswrapper[4751]: I1203 14:15:51.135640 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-drt67" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.137708 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.142625 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-drt67"] Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.238361 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.239317 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.240125 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.241167 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.242817 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.271472 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6b5fc30-18a3-4b4b-861f-4312c07eaa7b-utilities\") pod \"redhat-operators-drt67\" (UID: \"d6b5fc30-18a3-4b4b-861f-4312c07eaa7b\") " pod="openshift-marketplace/redhat-operators-drt67" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.271635 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-sb5p5\" (UniqueName: \"kubernetes.io/projected/d6b5fc30-18a3-4b4b-861f-4312c07eaa7b-kube-api-access-sb5p5\") pod \"redhat-operators-drt67\" (UID: \"d6b5fc30-18a3-4b4b-861f-4312c07eaa7b\") " pod="openshift-marketplace/redhat-operators-drt67" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.271692 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6b5fc30-18a3-4b4b-861f-4312c07eaa7b-catalog-content\") pod \"redhat-operators-drt67\" (UID: \"d6b5fc30-18a3-4b4b-861f-4312c07eaa7b\") " pod="openshift-marketplace/redhat-operators-drt67" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.372525 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.372653 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.372675 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6b5fc30-18a3-4b4b-861f-4312c07eaa7b-utilities\") pod \"redhat-operators-drt67\" (UID: \"d6b5fc30-18a3-4b4b-861f-4312c07eaa7b\") " pod="openshift-marketplace/redhat-operators-drt67" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.372753 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-sb5p5\" (UniqueName: \"kubernetes.io/projected/d6b5fc30-18a3-4b4b-861f-4312c07eaa7b-kube-api-access-sb5p5\") pod \"redhat-operators-drt67\" (UID: \"d6b5fc30-18a3-4b4b-861f-4312c07eaa7b\") " pod="openshift-marketplace/redhat-operators-drt67" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.372774 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6b5fc30-18a3-4b4b-861f-4312c07eaa7b-catalog-content\") pod \"redhat-operators-drt67\" (UID: \"d6b5fc30-18a3-4b4b-861f-4312c07eaa7b\") " pod="openshift-marketplace/redhat-operators-drt67" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.373194 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6b5fc30-18a3-4b4b-861f-4312c07eaa7b-catalog-content\") pod \"redhat-operators-drt67\" (UID: \"d6b5fc30-18a3-4b4b-861f-4312c07eaa7b\") " pod="openshift-marketplace/redhat-operators-drt67" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.374279 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6b5fc30-18a3-4b4b-861f-4312c07eaa7b-utilities\") pod \"redhat-operators-drt67\" (UID: \"d6b5fc30-18a3-4b4b-861f-4312c07eaa7b\") " pod="openshift-marketplace/redhat-operators-drt67" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.381800 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-js4nc" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.387967 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-js4nc" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.415649 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb5p5\" 
(UniqueName: \"kubernetes.io/projected/d6b5fc30-18a3-4b4b-861f-4312c07eaa7b-kube-api-access-sb5p5\") pod \"redhat-operators-drt67\" (UID: \"d6b5fc30-18a3-4b4b-861f-4312c07eaa7b\") " pod="openshift-marketplace/redhat-operators-drt67" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.473866 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.473956 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.474144 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.494471 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.504818 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-drt67" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.532849 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lskvp"] Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.534250 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lskvp" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.549679 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lskvp"] Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.566440 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.675644 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3328e22-4ad1-4815-b477-b015fc4dcf27-catalog-content\") pod \"redhat-operators-lskvp\" (UID: \"b3328e22-4ad1-4815-b477-b015fc4dcf27\") " pod="openshift-marketplace/redhat-operators-lskvp" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.675742 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3328e22-4ad1-4815-b477-b015fc4dcf27-utilities\") pod \"redhat-operators-lskvp\" (UID: \"b3328e22-4ad1-4815-b477-b015fc4dcf27\") " pod="openshift-marketplace/redhat-operators-lskvp" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.675809 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shdfn\" (UniqueName: \"kubernetes.io/projected/b3328e22-4ad1-4815-b477-b015fc4dcf27-kube-api-access-shdfn\") pod \"redhat-operators-lskvp\" (UID: \"b3328e22-4ad1-4815-b477-b015fc4dcf27\") " 
pod="openshift-marketplace/redhat-operators-lskvp" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.691890 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"60859cef-2d02-45a7-9f4a-d03bb3a6118a","Type":"ContainerStarted","Data":"65ec66f3c13f3c5d088b0f613f4c65f650dbec42f0f83e9f9c8cc07713a0a362"} Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.691940 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"60859cef-2d02-45a7-9f4a-d03bb3a6118a","Type":"ContainerStarted","Data":"cab903dd9456a2827dbfcac5148a74eddab68ec799a56829cdc0f8fe3200f682"} Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.694139 4751 generic.go:334] "Generic (PLEG): container finished" podID="3004e0b4-ba13-417b-88ef-439481ef93f4" containerID="028de8f73db9fd7a425b4111b9751169d71e1834dbe8b307942569d8a882489a" exitCode=0 Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.694230 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g592f" event={"ID":"3004e0b4-ba13-417b-88ef-439481ef93f4","Type":"ContainerDied","Data":"028de8f73db9fd7a425b4111b9751169d71e1834dbe8b307942569d8a882489a"} Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.696173 4751 generic.go:334] "Generic (PLEG): container finished" podID="bddd0747-ddb2-40ec-8912-6941942460bc" containerID="7b5ca8fe3de26b8a3d3bb6dc37734183a4406d5d4c30c0f1be2eecd5f4e9a5d1" exitCode=0 Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.696268 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qrskc" event={"ID":"bddd0747-ddb2-40ec-8912-6941942460bc","Type":"ContainerDied","Data":"7b5ca8fe3de26b8a3d3bb6dc37734183a4406d5d4c30c0f1be2eecd5f4e9a5d1"} Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.696302 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-qrskc" event={"ID":"bddd0747-ddb2-40ec-8912-6941942460bc","Type":"ContainerStarted","Data":"76dde61231b543f2da26493e73588b522f5e9f49444224ef8e94503f7c5d4119"} Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.776969 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3328e22-4ad1-4815-b477-b015fc4dcf27-utilities\") pod \"redhat-operators-lskvp\" (UID: \"b3328e22-4ad1-4815-b477-b015fc4dcf27\") " pod="openshift-marketplace/redhat-operators-lskvp" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.777035 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shdfn\" (UniqueName: \"kubernetes.io/projected/b3328e22-4ad1-4815-b477-b015fc4dcf27-kube-api-access-shdfn\") pod \"redhat-operators-lskvp\" (UID: \"b3328e22-4ad1-4815-b477-b015fc4dcf27\") " pod="openshift-marketplace/redhat-operators-lskvp" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.777070 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3328e22-4ad1-4815-b477-b015fc4dcf27-catalog-content\") pod \"redhat-operators-lskvp\" (UID: \"b3328e22-4ad1-4815-b477-b015fc4dcf27\") " pod="openshift-marketplace/redhat-operators-lskvp" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.777814 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3328e22-4ad1-4815-b477-b015fc4dcf27-catalog-content\") pod \"redhat-operators-lskvp\" (UID: \"b3328e22-4ad1-4815-b477-b015fc4dcf27\") " pod="openshift-marketplace/redhat-operators-lskvp" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.778105 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3328e22-4ad1-4815-b477-b015fc4dcf27-utilities\") 
pod \"redhat-operators-lskvp\" (UID: \"b3328e22-4ad1-4815-b477-b015fc4dcf27\") " pod="openshift-marketplace/redhat-operators-lskvp" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.796274 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shdfn\" (UniqueName: \"kubernetes.io/projected/b3328e22-4ad1-4815-b477-b015fc4dcf27-kube-api-access-shdfn\") pod \"redhat-operators-lskvp\" (UID: \"b3328e22-4ad1-4815-b477-b015fc4dcf27\") " pod="openshift-marketplace/redhat-operators-lskvp" Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.811743 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-drt67"] Dec 03 14:15:51 crc kubenswrapper[4751]: W1203 14:15:51.823049 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6b5fc30_18a3_4b4b_861f_4312c07eaa7b.slice/crio-fcca06e807b744e351f5f2373d88344a6ca6ed36dcf9a5e77ff1a68f0ecbdff4 WatchSource:0}: Error finding container fcca06e807b744e351f5f2373d88344a6ca6ed36dcf9a5e77ff1a68f0ecbdff4: Status 404 returned error can't find the container with id fcca06e807b744e351f5f2373d88344a6ca6ed36dcf9a5e77ff1a68f0ecbdff4 Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.846712 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 14:15:51 crc kubenswrapper[4751]: W1203 14:15:51.852595 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9bb1ce04_8c8a_4dad_b6dd_abdefd0d37c9.slice/crio-c1f7b0c6975e987f0f5bd54f85bf215864e9a02d94e5232718206f96f782111c WatchSource:0}: Error finding container c1f7b0c6975e987f0f5bd54f85bf215864e9a02d94e5232718206f96f782111c: Status 404 returned error can't find the container with id c1f7b0c6975e987f0f5bd54f85bf215864e9a02d94e5232718206f96f782111c Dec 03 14:15:51 crc kubenswrapper[4751]: I1203 14:15:51.876760 4751 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lskvp" Dec 03 14:15:52 crc kubenswrapper[4751]: I1203 14:15:52.285225 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lskvp"] Dec 03 14:15:52 crc kubenswrapper[4751]: W1203 14:15:52.298768 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3328e22_4ad1_4815_b477_b015fc4dcf27.slice/crio-27b59928f94d9cb2a9a43438b54439dc305d03ceebfb0d40b595dbe49540cfed WatchSource:0}: Error finding container 27b59928f94d9cb2a9a43438b54439dc305d03ceebfb0d40b595dbe49540cfed: Status 404 returned error can't find the container with id 27b59928f94d9cb2a9a43438b54439dc305d03ceebfb0d40b595dbe49540cfed Dec 03 14:15:52 crc kubenswrapper[4751]: I1203 14:15:52.707208 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-drt67" event={"ID":"d6b5fc30-18a3-4b4b-861f-4312c07eaa7b","Type":"ContainerStarted","Data":"fcca06e807b744e351f5f2373d88344a6ca6ed36dcf9a5e77ff1a68f0ecbdff4"} Dec 03 14:15:52 crc kubenswrapper[4751]: I1203 14:15:52.709294 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lskvp" event={"ID":"b3328e22-4ad1-4815-b477-b015fc4dcf27","Type":"ContainerStarted","Data":"27b59928f94d9cb2a9a43438b54439dc305d03ceebfb0d40b595dbe49540cfed"} Dec 03 14:15:52 crc kubenswrapper[4751]: I1203 14:15:52.710956 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9","Type":"ContainerStarted","Data":"c1f7b0c6975e987f0f5bd54f85bf215864e9a02d94e5232718206f96f782111c"} Dec 03 14:15:52 crc kubenswrapper[4751]: I1203 14:15:52.757232 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
podStartSLOduration=2.757217264 podStartE2EDuration="2.757217264s" podCreationTimestamp="2025-12-03 14:15:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:52.754117462 +0000 UTC m=+159.742472679" watchObservedRunningTime="2025-12-03 14:15:52.757217264 +0000 UTC m=+159.745572471" Dec 03 14:15:53 crc kubenswrapper[4751]: I1203 14:15:53.721425 4751 generic.go:334] "Generic (PLEG): container finished" podID="b3328e22-4ad1-4815-b477-b015fc4dcf27" containerID="001a2ff547e688b3191e9c60fc2b6fae094ac608a5de278ed037761bbdd29d13" exitCode=0 Dec 03 14:15:53 crc kubenswrapper[4751]: I1203 14:15:53.721524 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lskvp" event={"ID":"b3328e22-4ad1-4815-b477-b015fc4dcf27","Type":"ContainerDied","Data":"001a2ff547e688b3191e9c60fc2b6fae094ac608a5de278ed037761bbdd29d13"} Dec 03 14:15:53 crc kubenswrapper[4751]: I1203 14:15:53.723768 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9","Type":"ContainerStarted","Data":"a8e87774c3d0a45734f7af5d99f18b4813d9e43c5b24c30cbea2ae6c36058c1d"} Dec 03 14:15:53 crc kubenswrapper[4751]: I1203 14:15:53.727425 4751 generic.go:334] "Generic (PLEG): container finished" podID="60859cef-2d02-45a7-9f4a-d03bb3a6118a" containerID="65ec66f3c13f3c5d088b0f613f4c65f650dbec42f0f83e9f9c8cc07713a0a362" exitCode=0 Dec 03 14:15:53 crc kubenswrapper[4751]: I1203 14:15:53.727503 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"60859cef-2d02-45a7-9f4a-d03bb3a6118a","Type":"ContainerDied","Data":"65ec66f3c13f3c5d088b0f613f4c65f650dbec42f0f83e9f9c8cc07713a0a362"} Dec 03 14:15:53 crc kubenswrapper[4751]: I1203 14:15:53.730893 4751 generic.go:334] "Generic (PLEG): container finished" 
podID="d6b5fc30-18a3-4b4b-861f-4312c07eaa7b" containerID="e48c0612cda77cbbebded1b5ab9b043cc4ccbed631268104f09f53054051cdbb" exitCode=0 Dec 03 14:15:53 crc kubenswrapper[4751]: I1203 14:15:53.730971 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-drt67" event={"ID":"d6b5fc30-18a3-4b4b-861f-4312c07eaa7b","Type":"ContainerDied","Data":"e48c0612cda77cbbebded1b5ab9b043cc4ccbed631268104f09f53054051cdbb"} Dec 03 14:15:53 crc kubenswrapper[4751]: I1203 14:15:53.790164 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.790147819 podStartE2EDuration="2.790147819s" podCreationTimestamp="2025-12-03 14:15:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:15:53.78568321 +0000 UTC m=+160.774038437" watchObservedRunningTime="2025-12-03 14:15:53.790147819 +0000 UTC m=+160.778503036" Dec 03 14:15:54 crc kubenswrapper[4751]: I1203 14:15:54.739299 4751 generic.go:334] "Generic (PLEG): container finished" podID="9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9" containerID="a8e87774c3d0a45734f7af5d99f18b4813d9e43c5b24c30cbea2ae6c36058c1d" exitCode=0 Dec 03 14:15:54 crc kubenswrapper[4751]: I1203 14:15:54.739365 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9","Type":"ContainerDied","Data":"a8e87774c3d0a45734f7af5d99f18b4813d9e43c5b24c30cbea2ae6c36058c1d"} Dec 03 14:15:55 crc kubenswrapper[4751]: I1203 14:15:55.017462 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 14:15:55 crc kubenswrapper[4751]: I1203 14:15:55.141373 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/60859cef-2d02-45a7-9f4a-d03bb3a6118a-kube-api-access\") pod \"60859cef-2d02-45a7-9f4a-d03bb3a6118a\" (UID: \"60859cef-2d02-45a7-9f4a-d03bb3a6118a\") " Dec 03 14:15:55 crc kubenswrapper[4751]: I1203 14:15:55.141438 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/60859cef-2d02-45a7-9f4a-d03bb3a6118a-kubelet-dir\") pod \"60859cef-2d02-45a7-9f4a-d03bb3a6118a\" (UID: \"60859cef-2d02-45a7-9f4a-d03bb3a6118a\") " Dec 03 14:15:55 crc kubenswrapper[4751]: I1203 14:15:55.141705 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60859cef-2d02-45a7-9f4a-d03bb3a6118a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "60859cef-2d02-45a7-9f4a-d03bb3a6118a" (UID: "60859cef-2d02-45a7-9f4a-d03bb3a6118a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:15:55 crc kubenswrapper[4751]: I1203 14:15:55.152881 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60859cef-2d02-45a7-9f4a-d03bb3a6118a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "60859cef-2d02-45a7-9f4a-d03bb3a6118a" (UID: "60859cef-2d02-45a7-9f4a-d03bb3a6118a"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:15:55 crc kubenswrapper[4751]: I1203 14:15:55.243406 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/60859cef-2d02-45a7-9f4a-d03bb3a6118a-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 14:15:55 crc kubenswrapper[4751]: I1203 14:15:55.243462 4751 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/60859cef-2d02-45a7-9f4a-d03bb3a6118a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 14:15:55 crc kubenswrapper[4751]: I1203 14:15:55.748128 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 14:15:55 crc kubenswrapper[4751]: I1203 14:15:55.748129 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"60859cef-2d02-45a7-9f4a-d03bb3a6118a","Type":"ContainerDied","Data":"cab903dd9456a2827dbfcac5148a74eddab68ec799a56829cdc0f8fe3200f682"} Dec 03 14:15:55 crc kubenswrapper[4751]: I1203 14:15:55.748170 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cab903dd9456a2827dbfcac5148a74eddab68ec799a56829cdc0f8fe3200f682" Dec 03 14:15:55 crc kubenswrapper[4751]: I1203 14:15:55.950184 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hvd8w" Dec 03 14:15:55 crc kubenswrapper[4751]: I1203 14:15:55.989929 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 14:15:56 crc kubenswrapper[4751]: I1203 14:15:56.053777 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9-kubelet-dir\") pod \"9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9\" (UID: \"9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9\") " Dec 03 14:15:56 crc kubenswrapper[4751]: I1203 14:15:56.053876 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9-kube-api-access\") pod \"9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9\" (UID: \"9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9\") " Dec 03 14:15:56 crc kubenswrapper[4751]: I1203 14:15:56.053895 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9" (UID: "9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:15:56 crc kubenswrapper[4751]: I1203 14:15:56.054217 4751 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 14:15:56 crc kubenswrapper[4751]: I1203 14:15:56.059058 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9" (UID: "9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:15:56 crc kubenswrapper[4751]: I1203 14:15:56.155468 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 14:15:56 crc kubenswrapper[4751]: I1203 14:15:56.755784 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9","Type":"ContainerDied","Data":"c1f7b0c6975e987f0f5bd54f85bf215864e9a02d94e5232718206f96f782111c"} Dec 03 14:15:56 crc kubenswrapper[4751]: I1203 14:15:56.756003 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1f7b0c6975e987f0f5bd54f85bf215864e9a02d94e5232718206f96f782111c" Dec 03 14:15:56 crc kubenswrapper[4751]: I1203 14:15:56.756078 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 14:15:58 crc kubenswrapper[4751]: I1203 14:15:58.081506 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45fb8744-4cb9-4138-8310-c02f7c6a2941-metrics-certs\") pod \"network-metrics-daemon-zgqdp\" (UID: \"45fb8744-4cb9-4138-8310-c02f7c6a2941\") " pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:15:58 crc kubenswrapper[4751]: I1203 14:15:58.091480 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45fb8744-4cb9-4138-8310-c02f7c6a2941-metrics-certs\") pod \"network-metrics-daemon-zgqdp\" (UID: \"45fb8744-4cb9-4138-8310-c02f7c6a2941\") " pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:15:58 crc kubenswrapper[4751]: I1203 14:15:58.150386 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zgqdp" Dec 03 14:15:58 crc kubenswrapper[4751]: I1203 14:15:58.391976 4751 patch_prober.go:28] interesting pod/console-f9d7485db-djkdm container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Dec 03 14:15:58 crc kubenswrapper[4751]: I1203 14:15:58.392234 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-djkdm" podUID="a112208b-e069-48e0-8bc4-d6c4e79052fc" containerName="console" probeResult="failure" output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" Dec 03 14:16:05 crc kubenswrapper[4751]: I1203 14:16:05.819806 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:16:05 crc kubenswrapper[4751]: I1203 14:16:05.820355 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:16:06 crc kubenswrapper[4751]: I1203 14:16:06.978267 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:16:08 crc kubenswrapper[4751]: I1203 14:16:08.396542 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-djkdm" Dec 03 14:16:08 crc kubenswrapper[4751]: I1203 14:16:08.403233 4751 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-djkdm" Dec 03 14:16:14 crc kubenswrapper[4751]: E1203 14:16:14.638321 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 03 14:16:14 crc kubenswrapper[4751]: E1203 14:16:14.639148 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2sbdm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[
]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-cclkx_openshift-marketplace(d5bfbffc-4818-4710-9c57-a2a4f298bfe2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 14:16:14 crc kubenswrapper[4751]: E1203 14:16:14.640634 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-cclkx" podUID="d5bfbffc-4818-4710-9c57-a2a4f298bfe2" Dec 03 14:16:14 crc kubenswrapper[4751]: E1203 14:16:14.698342 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 03 14:16:14 crc kubenswrapper[4751]: E1203 14:16:14.698505 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xs6sf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-qqbc4_openshift-marketplace(0e46a326-4633-4c4b-aac9-700b969ef961): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 14:16:14 crc kubenswrapper[4751]: E1203 14:16:14.699693 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-qqbc4" podUID="0e46a326-4633-4c4b-aac9-700b969ef961" Dec 03 14:16:18 crc 
kubenswrapper[4751]: E1203 14:16:18.776239 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-cclkx" podUID="d5bfbffc-4818-4710-9c57-a2a4f298bfe2" Dec 03 14:16:18 crc kubenswrapper[4751]: E1203 14:16:18.776441 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qqbc4" podUID="0e46a326-4633-4c4b-aac9-700b969ef961" Dec 03 14:16:18 crc kubenswrapper[4751]: E1203 14:16:18.784406 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 03 14:16:18 crc kubenswrapper[4751]: E1203 14:16:18.784689 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kqzkt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-g592f_openshift-marketplace(3004e0b4-ba13-417b-88ef-439481ef93f4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 14:16:18 crc kubenswrapper[4751]: E1203 14:16:18.786462 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-g592f" podUID="3004e0b4-ba13-417b-88ef-439481ef93f4" Dec 03 14:16:18 crc 
kubenswrapper[4751]: I1203 14:16:18.958228 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zgqdp"] Dec 03 14:16:19 crc kubenswrapper[4751]: I1203 14:16:19.564671 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:16:19 crc kubenswrapper[4751]: E1203 14:16:19.711631 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 03 14:16:19 crc kubenswrapper[4751]: E1203 14:16:19.711828 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pcvlv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-jntcp_openshift-marketplace(b17b5e87-a2c1-446f-ac2f-52e7f6b608c0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 14:16:19 crc kubenswrapper[4751]: E1203 14:16:19.712989 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-jntcp" podUID="b17b5e87-a2c1-446f-ac2f-52e7f6b608c0" Dec 03 14:16:20 crc 
kubenswrapper[4751]: E1203 14:16:20.084867 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 03 14:16:20 crc kubenswrapper[4751]: E1203 14:16:20.085023 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lmcn2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-4brfr_openshift-marketplace(0878495c-85e0-469c-bee0-a4f6ce70d873): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 14:16:20 crc kubenswrapper[4751]: E1203 14:16:20.087247 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-4brfr" podUID="0878495c-85e0-469c-bee0-a4f6ce70d873" Dec 03 14:16:21 crc kubenswrapper[4751]: I1203 14:16:21.415629 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2wgh" Dec 03 14:16:24 crc kubenswrapper[4751]: E1203 14:16:24.614492 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-4brfr" podUID="0878495c-85e0-469c-bee0-a4f6ce70d873" Dec 03 14:16:24 crc kubenswrapper[4751]: E1203 14:16:24.614543 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-jntcp" podUID="b17b5e87-a2c1-446f-ac2f-52e7f6b608c0" Dec 03 14:16:24 crc kubenswrapper[4751]: E1203 14:16:24.614571 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-g592f" 
podUID="3004e0b4-ba13-417b-88ef-439481ef93f4" Dec 03 14:16:24 crc kubenswrapper[4751]: E1203 14:16:24.650843 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 03 14:16:24 crc kubenswrapper[4751]: E1203 14:16:24.651172 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-shdfn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:ni
l,} start failed in pod redhat-operators-lskvp_openshift-marketplace(b3328e22-4ad1-4815-b477-b015fc4dcf27): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 03 14:16:24 crc kubenswrapper[4751]: E1203 14:16:24.652403 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-lskvp" podUID="b3328e22-4ad1-4815-b477-b015fc4dcf27"
Dec 03 14:16:24 crc kubenswrapper[4751]: I1203 14:16:24.926482 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zgqdp" event={"ID":"45fb8744-4cb9-4138-8310-c02f7c6a2941","Type":"ContainerStarted","Data":"c90e374ccfda642cb802fafd0019cecade83979f9149d7a1b770d052761e9b09"}
Dec 03 14:16:24 crc kubenswrapper[4751]: E1203 14:16:24.928143 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-lskvp" podUID="b3328e22-4ad1-4815-b477-b015fc4dcf27"
Dec 03 14:16:25 crc kubenswrapper[4751]: E1203 14:16:25.864774 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Dec 03 14:16:25 crc kubenswrapper[4751]: E1203 14:16:25.865291 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4bt5h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-qrskc_openshift-marketplace(bddd0747-ddb2-40ec-8912-6941942460bc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 03 14:16:25 crc kubenswrapper[4751]: E1203 14:16:25.866484 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-qrskc" podUID="bddd0747-ddb2-40ec-8912-6941942460bc"
Dec 03 14:16:25 crc kubenswrapper[4751]: I1203 14:16:25.937845 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zgqdp" event={"ID":"45fb8744-4cb9-4138-8310-c02f7c6a2941","Type":"ContainerStarted","Data":"e264d9036dc726410d3a0f4407892ed03eda9161596faf299a2928c255a72b4e"}
Dec 03 14:16:26 crc kubenswrapper[4751]: I1203 14:16:26.427117 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Dec 03 14:16:26 crc kubenswrapper[4751]: E1203 14:16:26.427410 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60859cef-2d02-45a7-9f4a-d03bb3a6118a" containerName="pruner"
Dec 03 14:16:26 crc kubenswrapper[4751]: I1203 14:16:26.427426 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="60859cef-2d02-45a7-9f4a-d03bb3a6118a" containerName="pruner"
Dec 03 14:16:26 crc kubenswrapper[4751]: E1203 14:16:26.427441 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9" containerName="pruner"
Dec 03 14:16:26 crc kubenswrapper[4751]: I1203 14:16:26.427449 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9" containerName="pruner"
Dec 03 14:16:26 crc kubenswrapper[4751]: I1203 14:16:26.427563 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="60859cef-2d02-45a7-9f4a-d03bb3a6118a" containerName="pruner"
Dec 03 14:16:26 crc kubenswrapper[4751]: I1203 14:16:26.427586 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bb1ce04-8c8a-4dad-b6dd-abdefd0d37c9" containerName="pruner"
Dec 03 14:16:26 crc kubenswrapper[4751]: I1203 14:16:26.428033 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 03 14:16:26 crc kubenswrapper[4751]: I1203 14:16:26.432382 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Dec 03 14:16:26 crc kubenswrapper[4751]: I1203 14:16:26.432595 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Dec 03 14:16:26 crc kubenswrapper[4751]: I1203 14:16:26.436880 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Dec 03 14:16:26 crc kubenswrapper[4751]: I1203 14:16:26.552035 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1066b597-509a-491e-b338-06a70d0a260d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1066b597-509a-491e-b338-06a70d0a260d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 03 14:16:26 crc kubenswrapper[4751]: I1203 14:16:26.552234 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1066b597-509a-491e-b338-06a70d0a260d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1066b597-509a-491e-b338-06a70d0a260d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 03 14:16:26 crc kubenswrapper[4751]: I1203 14:16:26.653958 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1066b597-509a-491e-b338-06a70d0a260d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1066b597-509a-491e-b338-06a70d0a260d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 03 14:16:26 crc kubenswrapper[4751]: I1203 14:16:26.654007 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1066b597-509a-491e-b338-06a70d0a260d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1066b597-509a-491e-b338-06a70d0a260d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 03 14:16:26 crc kubenswrapper[4751]: I1203 14:16:26.654101 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1066b597-509a-491e-b338-06a70d0a260d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1066b597-509a-491e-b338-06a70d0a260d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 03 14:16:26 crc kubenswrapper[4751]: I1203 14:16:26.672550 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1066b597-509a-491e-b338-06a70d0a260d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1066b597-509a-491e-b338-06a70d0a260d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 03 14:16:26 crc kubenswrapper[4751]: I1203 14:16:26.742992 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 03 14:16:26 crc kubenswrapper[4751]: I1203 14:16:26.944719 4751 generic.go:334] "Generic (PLEG): container finished" podID="d6b5fc30-18a3-4b4b-861f-4312c07eaa7b" containerID="a1174ab7b11b05e34b38f2990fbd355a83fb7bf9fb96f06a5f6e7b170ff15496" exitCode=0
Dec 03 14:16:26 crc kubenswrapper[4751]: I1203 14:16:26.944867 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-drt67" event={"ID":"d6b5fc30-18a3-4b4b-861f-4312c07eaa7b","Type":"ContainerDied","Data":"a1174ab7b11b05e34b38f2990fbd355a83fb7bf9fb96f06a5f6e7b170ff15496"}
Dec 03 14:16:26 crc kubenswrapper[4751]: I1203 14:16:26.946353 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zgqdp" event={"ID":"45fb8744-4cb9-4138-8310-c02f7c6a2941","Type":"ContainerStarted","Data":"7883152b2d9892cf381c461d46eb42670279acab1ea487890c2cc5ae305e0146"}
Dec 03 14:16:26 crc kubenswrapper[4751]: I1203 14:16:26.979583 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-zgqdp" podStartSLOduration=171.979562279 podStartE2EDuration="2m51.979562279s" podCreationTimestamp="2025-12-03 14:13:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:16:26.975538102 +0000 UTC m=+193.963893319" watchObservedRunningTime="2025-12-03 14:16:26.979562279 +0000 UTC m=+193.967917496"
Dec 03 14:16:27 crc kubenswrapper[4751]: I1203 14:16:27.112312 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Dec 03 14:16:27 crc kubenswrapper[4751]: I1203 14:16:27.954284 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-drt67" event={"ID":"d6b5fc30-18a3-4b4b-861f-4312c07eaa7b","Type":"ContainerStarted","Data":"44f230d8075d5006f3fede773b3971afa2a04cf1dba93f473d1e483a31aa3987"}
Dec 03 14:16:27 crc kubenswrapper[4751]: I1203 14:16:27.969781 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"1066b597-509a-491e-b338-06a70d0a260d","Type":"ContainerStarted","Data":"ff7663b3f865f8135a38caf19bda044a1470b9a88e9d68aaf745429c9675da32"}
Dec 03 14:16:27 crc kubenswrapper[4751]: I1203 14:16:27.969819 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"1066b597-509a-491e-b338-06a70d0a260d","Type":"ContainerStarted","Data":"871a67da2b5aad5b987796bfcfa9e990dd6c0f477e21b1621442d52486ea72cb"}
Dec 03 14:16:27 crc kubenswrapper[4751]: I1203 14:16:27.975685 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-drt67" podStartSLOduration=3.363693036 podStartE2EDuration="36.975662976s" podCreationTimestamp="2025-12-03 14:15:51 +0000 UTC" firstStartedPulling="2025-12-03 14:15:53.732927881 +0000 UTC m=+160.721283118" lastFinishedPulling="2025-12-03 14:16:27.344897841 +0000 UTC m=+194.333253058" observedRunningTime="2025-12-03 14:16:27.971372392 +0000 UTC m=+194.959727619" watchObservedRunningTime="2025-12-03 14:16:27.975662976 +0000 UTC m=+194.964018203"
Dec 03 14:16:28 crc kubenswrapper[4751]: I1203 14:16:28.981617 4751 generic.go:334] "Generic (PLEG): container finished" podID="1066b597-509a-491e-b338-06a70d0a260d" containerID="ff7663b3f865f8135a38caf19bda044a1470b9a88e9d68aaf745429c9675da32" exitCode=0
Dec 03 14:16:28 crc kubenswrapper[4751]: I1203 14:16:28.981699 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"1066b597-509a-491e-b338-06a70d0a260d","Type":"ContainerDied","Data":"ff7663b3f865f8135a38caf19bda044a1470b9a88e9d68aaf745429c9675da32"}
Dec 03 14:16:30 crc kubenswrapper[4751]: I1203 14:16:30.238635 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 03 14:16:30 crc kubenswrapper[4751]: I1203 14:16:30.296807 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1066b597-509a-491e-b338-06a70d0a260d-kube-api-access\") pod \"1066b597-509a-491e-b338-06a70d0a260d\" (UID: \"1066b597-509a-491e-b338-06a70d0a260d\") "
Dec 03 14:16:30 crc kubenswrapper[4751]: I1203 14:16:30.297216 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1066b597-509a-491e-b338-06a70d0a260d-kubelet-dir\") pod \"1066b597-509a-491e-b338-06a70d0a260d\" (UID: \"1066b597-509a-491e-b338-06a70d0a260d\") "
Dec 03 14:16:30 crc kubenswrapper[4751]: I1203 14:16:30.297368 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1066b597-509a-491e-b338-06a70d0a260d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1066b597-509a-491e-b338-06a70d0a260d" (UID: "1066b597-509a-491e-b338-06a70d0a260d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 14:16:30 crc kubenswrapper[4751]: I1203 14:16:30.297700 4751 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1066b597-509a-491e-b338-06a70d0a260d-kubelet-dir\") on node \"crc\" DevicePath \"\""
Dec 03 14:16:30 crc kubenswrapper[4751]: I1203 14:16:30.306479 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1066b597-509a-491e-b338-06a70d0a260d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1066b597-509a-491e-b338-06a70d0a260d" (UID: "1066b597-509a-491e-b338-06a70d0a260d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 14:16:30 crc kubenswrapper[4751]: I1203 14:16:30.399140 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1066b597-509a-491e-b338-06a70d0a260d-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 03 14:16:30 crc kubenswrapper[4751]: I1203 14:16:30.992139 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"1066b597-509a-491e-b338-06a70d0a260d","Type":"ContainerDied","Data":"871a67da2b5aad5b987796bfcfa9e990dd6c0f477e21b1621442d52486ea72cb"}
Dec 03 14:16:30 crc kubenswrapper[4751]: I1203 14:16:30.992401 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="871a67da2b5aad5b987796bfcfa9e990dd6c0f477e21b1621442d52486ea72cb"
Dec 03 14:16:30 crc kubenswrapper[4751]: I1203 14:16:30.992412 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 03 14:16:31 crc kubenswrapper[4751]: I1203 14:16:31.505225 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-drt67"
Dec 03 14:16:31 crc kubenswrapper[4751]: I1203 14:16:31.505262 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-drt67"
Dec 03 14:16:31 crc kubenswrapper[4751]: I1203 14:16:31.998497 4751 generic.go:334] "Generic (PLEG): container finished" podID="d5bfbffc-4818-4710-9c57-a2a4f298bfe2" containerID="ac674d8edf879c22515ee6eea5966db7420d6dc767df72a351290d941ab273f4" exitCode=0
Dec 03 14:16:31 crc kubenswrapper[4751]: I1203 14:16:31.998541 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cclkx" event={"ID":"d5bfbffc-4818-4710-9c57-a2a4f298bfe2","Type":"ContainerDied","Data":"ac674d8edf879c22515ee6eea5966db7420d6dc767df72a351290d941ab273f4"}
Dec 03 14:16:32 crc kubenswrapper[4751]: I1203 14:16:32.229265 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Dec 03 14:16:32 crc kubenswrapper[4751]: E1203 14:16:32.229569 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1066b597-509a-491e-b338-06a70d0a260d" containerName="pruner"
Dec 03 14:16:32 crc kubenswrapper[4751]: I1203 14:16:32.229591 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="1066b597-509a-491e-b338-06a70d0a260d" containerName="pruner"
Dec 03 14:16:32 crc kubenswrapper[4751]: I1203 14:16:32.229711 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="1066b597-509a-491e-b338-06a70d0a260d" containerName="pruner"
Dec 03 14:16:32 crc kubenswrapper[4751]: I1203 14:16:32.230154 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 14:16:32 crc kubenswrapper[4751]: I1203 14:16:32.236706 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Dec 03 14:16:32 crc kubenswrapper[4751]: I1203 14:16:32.237131 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Dec 03 14:16:32 crc kubenswrapper[4751]: I1203 14:16:32.237648 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Dec 03 14:16:32 crc kubenswrapper[4751]: I1203 14:16:32.326891 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54a05958-b823-4796-992b-caed2f6e8f2e-kube-api-access\") pod \"installer-9-crc\" (UID: \"54a05958-b823-4796-992b-caed2f6e8f2e\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 14:16:32 crc kubenswrapper[4751]: I1203 14:16:32.327217 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/54a05958-b823-4796-992b-caed2f6e8f2e-var-lock\") pod \"installer-9-crc\" (UID: \"54a05958-b823-4796-992b-caed2f6e8f2e\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 14:16:32 crc kubenswrapper[4751]: I1203 14:16:32.327338 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54a05958-b823-4796-992b-caed2f6e8f2e-kubelet-dir\") pod \"installer-9-crc\" (UID: \"54a05958-b823-4796-992b-caed2f6e8f2e\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 14:16:32 crc kubenswrapper[4751]: I1203 14:16:32.428995 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54a05958-b823-4796-992b-caed2f6e8f2e-kubelet-dir\") pod \"installer-9-crc\" (UID: \"54a05958-b823-4796-992b-caed2f6e8f2e\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 14:16:32 crc kubenswrapper[4751]: I1203 14:16:32.429125 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54a05958-b823-4796-992b-caed2f6e8f2e-kube-api-access\") pod \"installer-9-crc\" (UID: \"54a05958-b823-4796-992b-caed2f6e8f2e\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 14:16:32 crc kubenswrapper[4751]: I1203 14:16:32.429143 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54a05958-b823-4796-992b-caed2f6e8f2e-kubelet-dir\") pod \"installer-9-crc\" (UID: \"54a05958-b823-4796-992b-caed2f6e8f2e\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 14:16:32 crc kubenswrapper[4751]: I1203 14:16:32.429173 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/54a05958-b823-4796-992b-caed2f6e8f2e-var-lock\") pod \"installer-9-crc\" (UID: \"54a05958-b823-4796-992b-caed2f6e8f2e\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 14:16:32 crc kubenswrapper[4751]: I1203 14:16:32.429278 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/54a05958-b823-4796-992b-caed2f6e8f2e-var-lock\") pod \"installer-9-crc\" (UID: \"54a05958-b823-4796-992b-caed2f6e8f2e\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 14:16:32 crc kubenswrapper[4751]: I1203 14:16:32.453149 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54a05958-b823-4796-992b-caed2f6e8f2e-kube-api-access\") pod \"installer-9-crc\" (UID: \"54a05958-b823-4796-992b-caed2f6e8f2e\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 14:16:32 crc kubenswrapper[4751]: I1203 14:16:32.551281 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 14:16:32 crc kubenswrapper[4751]: I1203 14:16:32.617043 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-drt67" podUID="d6b5fc30-18a3-4b4b-861f-4312c07eaa7b" containerName="registry-server" probeResult="failure" output=<
Dec 03 14:16:32 crc kubenswrapper[4751]: timeout: failed to connect service ":50051" within 1s
Dec 03 14:16:32 crc kubenswrapper[4751]: >
Dec 03 14:16:32 crc kubenswrapper[4751]: I1203 14:16:32.764885 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Dec 03 14:16:32 crc kubenswrapper[4751]: W1203 14:16:32.779989 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod54a05958_b823_4796_992b_caed2f6e8f2e.slice/crio-5625afa410b39ce6c68c6ea7c4cda7a14b12550dd7f0a021f3e9b336230c6af5 WatchSource:0}: Error finding container 5625afa410b39ce6c68c6ea7c4cda7a14b12550dd7f0a021f3e9b336230c6af5: Status 404 returned error can't find the container with id 5625afa410b39ce6c68c6ea7c4cda7a14b12550dd7f0a021f3e9b336230c6af5
Dec 03 14:16:33 crc kubenswrapper[4751]: I1203 14:16:33.005385 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"54a05958-b823-4796-992b-caed2f6e8f2e","Type":"ContainerStarted","Data":"5625afa410b39ce6c68c6ea7c4cda7a14b12550dd7f0a021f3e9b336230c6af5"}
Dec 03 14:16:33 crc kubenswrapper[4751]: I1203 14:16:33.007640 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cclkx" event={"ID":"d5bfbffc-4818-4710-9c57-a2a4f298bfe2","Type":"ContainerStarted","Data":"d86c516636dfac9f02e5b7e391e4d7d3d166df0e702e4b67bd8fbf8bed4e699b"}
Dec 03 14:16:33 crc kubenswrapper[4751]: I1203 14:16:33.026676 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cclkx" podStartSLOduration=3.055275809 podStartE2EDuration="46.026657965s" podCreationTimestamp="2025-12-03 14:15:47 +0000 UTC" firstStartedPulling="2025-12-03 14:15:49.673182925 +0000 UTC m=+156.661538142" lastFinishedPulling="2025-12-03 14:16:32.644565081 +0000 UTC m=+199.632920298" observedRunningTime="2025-12-03 14:16:33.02362313 +0000 UTC m=+200.011978357" watchObservedRunningTime="2025-12-03 14:16:33.026657965 +0000 UTC m=+200.015013192"
Dec 03 14:16:34 crc kubenswrapper[4751]: I1203 14:16:34.012793 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"54a05958-b823-4796-992b-caed2f6e8f2e","Type":"ContainerStarted","Data":"1f500512155ede48d6956451ab6010ec2559907278a7fea1ab0a624979bbdbd7"}
Dec 03 14:16:34 crc kubenswrapper[4751]: I1203 14:16:34.033020 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.033002289 podStartE2EDuration="2.033002289s" podCreationTimestamp="2025-12-03 14:16:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:16:34.031500927 +0000 UTC m=+201.019856134" watchObservedRunningTime="2025-12-03 14:16:34.033002289 +0000 UTC m=+201.021357506"
Dec 03 14:16:35 crc kubenswrapper[4751]: I1203 14:16:35.018512 4751 generic.go:334] "Generic (PLEG): container finished" podID="0e46a326-4633-4c4b-aac9-700b969ef961" containerID="2b90a3e8d0a954db4446523e5b98e26610b4e9979fb1ef0fe2626ce1fcf09e31" exitCode=0
Dec 03 14:16:35 crc kubenswrapper[4751]: I1203 14:16:35.018625 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qqbc4" event={"ID":"0e46a326-4633-4c4b-aac9-700b969ef961","Type":"ContainerDied","Data":"2b90a3e8d0a954db4446523e5b98e26610b4e9979fb1ef0fe2626ce1fcf09e31"}
Dec 03 14:16:35 crc kubenswrapper[4751]: I1203 14:16:35.820611 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 14:16:35 crc kubenswrapper[4751]: I1203 14:16:35.821108 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 14:16:36 crc kubenswrapper[4751]: I1203 14:16:36.027511 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qqbc4" event={"ID":"0e46a326-4633-4c4b-aac9-700b969ef961","Type":"ContainerStarted","Data":"592b01200999177b82c61823380ccbfc9ea34d7b236ab81f5ecd0417467b344b"}
Dec 03 14:16:36 crc kubenswrapper[4751]: I1203 14:16:36.335239 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qqbc4" podStartSLOduration=2.306015244 podStartE2EDuration="48.335220959s" podCreationTimestamp="2025-12-03 14:15:48 +0000 UTC" firstStartedPulling="2025-12-03 14:15:49.674844659 +0000 UTC m=+156.663199876" lastFinishedPulling="2025-12-03 14:16:35.704050374 +0000 UTC m=+202.692405591" observedRunningTime="2025-12-03 14:16:36.049968328 +0000 UTC m=+203.038323565" watchObservedRunningTime="2025-12-03 14:16:36.335220959 +0000 UTC m=+203.323576176"
Dec 03 14:16:38 crc kubenswrapper[4751]: I1203 14:16:38.247046 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cclkx"
Dec 03 14:16:38 crc kubenswrapper[4751]: I1203 14:16:38.247473 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cclkx"
Dec 03 14:16:38 crc kubenswrapper[4751]: I1203 14:16:38.497494 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cclkx"
Dec 03 14:16:38 crc kubenswrapper[4751]: I1203 14:16:38.659973 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qqbc4"
Dec 03 14:16:38 crc kubenswrapper[4751]: I1203 14:16:38.660207 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qqbc4"
Dec 03 14:16:38 crc kubenswrapper[4751]: I1203 14:16:38.702280 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qqbc4"
Dec 03 14:16:39 crc kubenswrapper[4751]: I1203 14:16:39.081515 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cclkx"
Dec 03 14:16:41 crc kubenswrapper[4751]: I1203 14:16:41.052857 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jntcp" event={"ID":"b17b5e87-a2c1-446f-ac2f-52e7f6b608c0","Type":"ContainerStarted","Data":"ebfb078774a8f4198d9659431288dd96c867b7325ec3aff4f4a41cfcd253d52b"}
Dec 03 14:16:41 crc kubenswrapper[4751]: I1203 14:16:41.055793 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g592f" event={"ID":"3004e0b4-ba13-417b-88ef-439481ef93f4","Type":"ContainerStarted","Data":"434bc487cb6c6417f8d34f6a784db7216ff45a145769439be76709a28e31e58f"}
Dec 03 14:16:41 crc kubenswrapper[4751]: I1203 14:16:41.057309 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4brfr" event={"ID":"0878495c-85e0-469c-bee0-a4f6ce70d873","Type":"ContainerStarted","Data":"11a7139d121d1a44e21ec35a81095d27a1d4df0ac908d07c35775b30ff215556"}
Dec 03 14:16:41 crc kubenswrapper[4751]: I1203 14:16:41.557537 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-drt67"
Dec 03 14:16:41 crc kubenswrapper[4751]: I1203 14:16:41.602717 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-drt67"
Dec 03 14:16:42 crc kubenswrapper[4751]: I1203 14:16:42.064650 4751 generic.go:334] "Generic (PLEG): container finished" podID="b17b5e87-a2c1-446f-ac2f-52e7f6b608c0" containerID="ebfb078774a8f4198d9659431288dd96c867b7325ec3aff4f4a41cfcd253d52b" exitCode=0
Dec 03 14:16:42 crc kubenswrapper[4751]: I1203 14:16:42.064724 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jntcp" event={"ID":"b17b5e87-a2c1-446f-ac2f-52e7f6b608c0","Type":"ContainerDied","Data":"ebfb078774a8f4198d9659431288dd96c867b7325ec3aff4f4a41cfcd253d52b"}
Dec 03 14:16:42 crc kubenswrapper[4751]: I1203 14:16:42.069684 4751 generic.go:334] "Generic (PLEG): container finished" podID="3004e0b4-ba13-417b-88ef-439481ef93f4" containerID="434bc487cb6c6417f8d34f6a784db7216ff45a145769439be76709a28e31e58f" exitCode=0
Dec 03 14:16:42 crc kubenswrapper[4751]: I1203 14:16:42.069748 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g592f" event={"ID":"3004e0b4-ba13-417b-88ef-439481ef93f4","Type":"ContainerDied","Data":"434bc487cb6c6417f8d34f6a784db7216ff45a145769439be76709a28e31e58f"}
Dec 03 14:16:42 crc kubenswrapper[4751]: I1203 14:16:42.074940 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lskvp" event={"ID":"b3328e22-4ad1-4815-b477-b015fc4dcf27","Type":"ContainerStarted","Data":"a7165d57db379a3deeea10e5b30558a5516336878b851d99b2063501ebdecc94"}
Dec 03 14:16:42 crc kubenswrapper[4751]: I1203 14:16:42.077665 4751 generic.go:334] "Generic (PLEG): container finished" podID="0878495c-85e0-469c-bee0-a4f6ce70d873" containerID="11a7139d121d1a44e21ec35a81095d27a1d4df0ac908d07c35775b30ff215556" exitCode=0
Dec 03 14:16:42 crc kubenswrapper[4751]: I1203 14:16:42.077754 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4brfr" event={"ID":"0878495c-85e0-469c-bee0-a4f6ce70d873","Type":"ContainerDied","Data":"11a7139d121d1a44e21ec35a81095d27a1d4df0ac908d07c35775b30ff215556"}
Dec 03 14:16:43 crc kubenswrapper[4751]: I1203 14:16:43.084664 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jntcp" event={"ID":"b17b5e87-a2c1-446f-ac2f-52e7f6b608c0","Type":"ContainerStarted","Data":"b19858cec76c494811fe74ffbba731d5a3552ec75f7c7f65c2275ea0a1d82287"}
Dec 03 14:16:43 crc kubenswrapper[4751]: I1203 14:16:43.087783 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g592f" event={"ID":"3004e0b4-ba13-417b-88ef-439481ef93f4","Type":"ContainerStarted","Data":"0c0b95b7954f47cbce7853213f96159cdbf157bb3be26611e4771f4d0ba36854"}
Dec 03 14:16:43 crc kubenswrapper[4751]: I1203 14:16:43.090641 4751 generic.go:334] "Generic (PLEG): container finished" podID="b3328e22-4ad1-4815-b477-b015fc4dcf27" containerID="a7165d57db379a3deeea10e5b30558a5516336878b851d99b2063501ebdecc94" exitCode=0
Dec 03 14:16:43 crc kubenswrapper[4751]: I1203 14:16:43.090693 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lskvp" event={"ID":"b3328e22-4ad1-4815-b477-b015fc4dcf27","Type":"ContainerDied","Data":"a7165d57db379a3deeea10e5b30558a5516336878b851d99b2063501ebdecc94"}
Dec 03 14:16:43 crc kubenswrapper[4751]: I1203 14:16:43.095424 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4brfr" event={"ID":"0878495c-85e0-469c-bee0-a4f6ce70d873","Type":"ContainerStarted","Data":"0aa5cff644d3e7537e9a4348683a7127f6836df79f8641d2cc2e4b7b58bd03f0"}
Dec 03 14:16:43 crc kubenswrapper[4751]: I1203 14:16:43.097274 4751 generic.go:334] "Generic (PLEG): container finished" podID="bddd0747-ddb2-40ec-8912-6941942460bc" containerID="4f912707255eb54b518fb0e81905e4dfff0a67c48b7b8a1bf835126d2465f545" exitCode=0
Dec 03 14:16:43 crc kubenswrapper[4751]: I1203 14:16:43.097311 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qrskc" event={"ID":"bddd0747-ddb2-40ec-8912-6941942460bc","Type":"ContainerDied","Data":"4f912707255eb54b518fb0e81905e4dfff0a67c48b7b8a1bf835126d2465f545"}
Dec 03 14:16:43 crc kubenswrapper[4751]: I1203 14:16:43.108982 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jntcp" podStartSLOduration=2.284587267 podStartE2EDuration="55.108962019s" podCreationTimestamp="2025-12-03 14:15:48 +0000 UTC" firstStartedPulling="2025-12-03 14:15:49.675168877 +0000 UTC m=+156.663524094" lastFinishedPulling="2025-12-03 14:16:42.499543629 +0000 UTC m=+209.487898846" observedRunningTime="2025-12-03 14:16:43.106201251 +0000 UTC m=+210.094556478" watchObservedRunningTime="2025-12-03 14:16:43.108962019 +0000 UTC m=+210.097317246"
Dec 03 14:16:43 crc kubenswrapper[4751]: I1203 14:16:43.122574 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-g592f" podStartSLOduration=3.318081152 podStartE2EDuration="54.122557333s" podCreationTimestamp="2025-12-03 14:15:49 +0000 UTC" firstStartedPulling="2025-12-03 14:15:51.699837783 +0000 UTC m=+158.688192990" lastFinishedPulling="2025-12-03 14:16:42.504313954 +0000 UTC m=+209.492669171" observedRunningTime="2025-12-03 14:16:43.121110372 +0000 UTC m=+210.109465589" watchObservedRunningTime="2025-12-03 14:16:43.122557333 +0000 UTC m=+210.110912550"
Dec 03 14:16:43 crc kubenswrapper[4751]: I1203 14:16:43.154675 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4brfr" podStartSLOduration=3.115744394 podStartE2EDuration="56.154659719s" podCreationTimestamp="2025-12-03 14:15:47 +0000 UTC" firstStartedPulling="2025-12-03 14:15:49.680794707 +0000 UTC m=+156.669149924" lastFinishedPulling="2025-12-03 14:16:42.719710032 +0000 UTC m=+209.708065249" observedRunningTime="2025-12-03 14:16:43.153464755 +0000 UTC m=+210.141819982" watchObservedRunningTime="2025-12-03 14:16:43.154659719 +0000 UTC m=+210.143014936"
Dec 03 14:16:44 crc kubenswrapper[4751]: I1203 14:16:44.103624 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lskvp" event={"ID":"b3328e22-4ad1-4815-b477-b015fc4dcf27","Type":"ContainerStarted","Data":"436715558f2d0b8530968160c90b614d18e6e462e186cd4aa8baac3423cd960b"}
Dec 03 14:16:44 crc kubenswrapper[4751]: I1203 14:16:44.105831 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qrskc" event={"ID":"bddd0747-ddb2-40ec-8912-6941942460bc","Type":"ContainerStarted","Data":"5b5d1a794184aa731d7b6a062356a46c59986cd1bcb8cd5ade8d8ea5b9b8797c"}
Dec 03 14:16:44 crc kubenswrapper[4751]: I1203 14:16:44.138927 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lskvp" podStartSLOduration=3.184211795 podStartE2EDuration="53.13891196s" podCreationTimestamp="2025-12-03 14:15:51 +0000 UTC" firstStartedPulling="2025-12-03 14:15:53.723375287 +0000 UTC m=+160.711730504" lastFinishedPulling="2025-12-03 14:16:43.678075452 +0000 UTC m=+210.666430669" observedRunningTime="2025-12-03 14:16:44.136579014 +0000 UTC m=+211.124934241" watchObservedRunningTime="2025-12-03 14:16:44.13891196 +0000 UTC m=+211.127267177"
Dec 03 14:16:44 crc kubenswrapper[4751]: I1203 14:16:44.159306 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qrskc" podStartSLOduration=3.225106567 podStartE2EDuration="54.159287105s" podCreationTimestamp="2025-12-03 14:15:50 +0000 UTC" firstStartedPulling="2025-12-03 14:15:52.712356264 +0000 UTC m=+159.700711491" lastFinishedPulling="2025-12-03 14:16:43.646536822 +0000 UTC m=+210.634892029" observedRunningTime="2025-12-03 14:16:44.158209614 +0000 UTC m=+211.146564831" watchObservedRunningTime="2025-12-03 14:16:44.159287105 +0000 UTC m=+211.147642322"
Dec 03 14:16:48 crc kubenswrapper[4751]: I1203 14:16:48.055302 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4brfr"
Dec 03 14:16:48 crc kubenswrapper[4751]: I1203 14:16:48.055877 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4brfr"
Dec 03 14:16:48 crc kubenswrapper[4751]: I1203 14:16:48.093056 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4brfr"
Dec 03 14:16:48 crc kubenswrapper[4751]: I1203 14:16:48.172213 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4brfr"
Dec 03 14:16:48 crc kubenswrapper[4751]: I1203 14:16:48.467852 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jntcp"
Dec 03 14:16:48 crc kubenswrapper[4751]: I1203 14:16:48.468160 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jntcp"
Dec 03 14:16:48 crc kubenswrapper[4751]: I1203 14:16:48.504068 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-jntcp" Dec 03 14:16:48 crc kubenswrapper[4751]: I1203 14:16:48.703997 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qqbc4" Dec 03 14:16:49 crc kubenswrapper[4751]: I1203 14:16:49.168439 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jntcp" Dec 03 14:16:50 crc kubenswrapper[4751]: I1203 14:16:50.053385 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-g592f" Dec 03 14:16:50 crc kubenswrapper[4751]: I1203 14:16:50.054198 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-g592f" Dec 03 14:16:50 crc kubenswrapper[4751]: I1203 14:16:50.056465 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jntcp"] Dec 03 14:16:50 crc kubenswrapper[4751]: I1203 14:16:50.095355 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-g592f" Dec 03 14:16:50 crc kubenswrapper[4751]: I1203 14:16:50.167495 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-g592f" Dec 03 14:16:50 crc kubenswrapper[4751]: I1203 14:16:50.457219 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qrskc" Dec 03 14:16:50 crc kubenswrapper[4751]: I1203 14:16:50.457271 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qrskc" Dec 03 14:16:50 crc kubenswrapper[4751]: I1203 14:16:50.496708 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qrskc" Dec 03 14:16:50 crc kubenswrapper[4751]: I1203 14:16:50.655913 
4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qqbc4"] Dec 03 14:16:50 crc kubenswrapper[4751]: I1203 14:16:50.656151 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qqbc4" podUID="0e46a326-4633-4c4b-aac9-700b969ef961" containerName="registry-server" containerID="cri-o://592b01200999177b82c61823380ccbfc9ea34d7b236ab81f5ecd0417467b344b" gracePeriod=2 Dec 03 14:16:51 crc kubenswrapper[4751]: I1203 14:16:51.147565 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jntcp" podUID="b17b5e87-a2c1-446f-ac2f-52e7f6b608c0" containerName="registry-server" containerID="cri-o://b19858cec76c494811fe74ffbba731d5a3552ec75f7c7f65c2275ea0a1d82287" gracePeriod=2 Dec 03 14:16:51 crc kubenswrapper[4751]: I1203 14:16:51.185631 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qrskc" Dec 03 14:16:51 crc kubenswrapper[4751]: I1203 14:16:51.878532 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lskvp" Dec 03 14:16:51 crc kubenswrapper[4751]: I1203 14:16:51.878757 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lskvp" Dec 03 14:16:51 crc kubenswrapper[4751]: I1203 14:16:51.933177 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lskvp" Dec 03 14:16:52 crc kubenswrapper[4751]: I1203 14:16:52.158895 4751 generic.go:334] "Generic (PLEG): container finished" podID="0e46a326-4633-4c4b-aac9-700b969ef961" containerID="592b01200999177b82c61823380ccbfc9ea34d7b236ab81f5ecd0417467b344b" exitCode=0 Dec 03 14:16:52 crc kubenswrapper[4751]: I1203 14:16:52.159153 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-qqbc4" event={"ID":"0e46a326-4633-4c4b-aac9-700b969ef961","Type":"ContainerDied","Data":"592b01200999177b82c61823380ccbfc9ea34d7b236ab81f5ecd0417467b344b"} Dec 03 14:16:52 crc kubenswrapper[4751]: I1203 14:16:52.217890 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lskvp" Dec 03 14:16:52 crc kubenswrapper[4751]: I1203 14:16:52.389445 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jntcp" Dec 03 14:16:52 crc kubenswrapper[4751]: I1203 14:16:52.498276 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcvlv\" (UniqueName: \"kubernetes.io/projected/b17b5e87-a2c1-446f-ac2f-52e7f6b608c0-kube-api-access-pcvlv\") pod \"b17b5e87-a2c1-446f-ac2f-52e7f6b608c0\" (UID: \"b17b5e87-a2c1-446f-ac2f-52e7f6b608c0\") " Dec 03 14:16:52 crc kubenswrapper[4751]: I1203 14:16:52.498368 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b17b5e87-a2c1-446f-ac2f-52e7f6b608c0-catalog-content\") pod \"b17b5e87-a2c1-446f-ac2f-52e7f6b608c0\" (UID: \"b17b5e87-a2c1-446f-ac2f-52e7f6b608c0\") " Dec 03 14:16:52 crc kubenswrapper[4751]: I1203 14:16:52.498420 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b17b5e87-a2c1-446f-ac2f-52e7f6b608c0-utilities\") pod \"b17b5e87-a2c1-446f-ac2f-52e7f6b608c0\" (UID: \"b17b5e87-a2c1-446f-ac2f-52e7f6b608c0\") " Dec 03 14:16:52 crc kubenswrapper[4751]: I1203 14:16:52.499204 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b17b5e87-a2c1-446f-ac2f-52e7f6b608c0-utilities" (OuterVolumeSpecName: "utilities") pod "b17b5e87-a2c1-446f-ac2f-52e7f6b608c0" (UID: "b17b5e87-a2c1-446f-ac2f-52e7f6b608c0"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:16:52 crc kubenswrapper[4751]: I1203 14:16:52.505585 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b17b5e87-a2c1-446f-ac2f-52e7f6b608c0-kube-api-access-pcvlv" (OuterVolumeSpecName: "kube-api-access-pcvlv") pod "b17b5e87-a2c1-446f-ac2f-52e7f6b608c0" (UID: "b17b5e87-a2c1-446f-ac2f-52e7f6b608c0"). InnerVolumeSpecName "kube-api-access-pcvlv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:16:52 crc kubenswrapper[4751]: I1203 14:16:52.600871 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcvlv\" (UniqueName: \"kubernetes.io/projected/b17b5e87-a2c1-446f-ac2f-52e7f6b608c0-kube-api-access-pcvlv\") on node \"crc\" DevicePath \"\"" Dec 03 14:16:52 crc kubenswrapper[4751]: I1203 14:16:52.600945 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b17b5e87-a2c1-446f-ac2f-52e7f6b608c0-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:16:53 crc kubenswrapper[4751]: I1203 14:16:53.048144 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b17b5e87-a2c1-446f-ac2f-52e7f6b608c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b17b5e87-a2c1-446f-ac2f-52e7f6b608c0" (UID: "b17b5e87-a2c1-446f-ac2f-52e7f6b608c0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:16:53 crc kubenswrapper[4751]: I1203 14:16:53.067041 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qrskc"] Dec 03 14:16:53 crc kubenswrapper[4751]: I1203 14:16:53.107781 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b17b5e87-a2c1-446f-ac2f-52e7f6b608c0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:16:53 crc kubenswrapper[4751]: I1203 14:16:53.171123 4751 generic.go:334] "Generic (PLEG): container finished" podID="b17b5e87-a2c1-446f-ac2f-52e7f6b608c0" containerID="b19858cec76c494811fe74ffbba731d5a3552ec75f7c7f65c2275ea0a1d82287" exitCode=0 Dec 03 14:16:53 crc kubenswrapper[4751]: I1203 14:16:53.171349 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qrskc" podUID="bddd0747-ddb2-40ec-8912-6941942460bc" containerName="registry-server" containerID="cri-o://5b5d1a794184aa731d7b6a062356a46c59986cd1bcb8cd5ade8d8ea5b9b8797c" gracePeriod=2 Dec 03 14:16:53 crc kubenswrapper[4751]: I1203 14:16:53.171521 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jntcp" Dec 03 14:16:53 crc kubenswrapper[4751]: I1203 14:16:53.171518 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jntcp" event={"ID":"b17b5e87-a2c1-446f-ac2f-52e7f6b608c0","Type":"ContainerDied","Data":"b19858cec76c494811fe74ffbba731d5a3552ec75f7c7f65c2275ea0a1d82287"} Dec 03 14:16:53 crc kubenswrapper[4751]: I1203 14:16:53.171583 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jntcp" event={"ID":"b17b5e87-a2c1-446f-ac2f-52e7f6b608c0","Type":"ContainerDied","Data":"770c6be5652b14896860b43c989e9f28b12f1b668df6da7b91b2d2997b8162de"} Dec 03 14:16:53 crc kubenswrapper[4751]: I1203 14:16:53.171611 4751 scope.go:117] "RemoveContainer" containerID="b19858cec76c494811fe74ffbba731d5a3552ec75f7c7f65c2275ea0a1d82287" Dec 03 14:16:53 crc kubenswrapper[4751]: I1203 14:16:53.195159 4751 scope.go:117] "RemoveContainer" containerID="ebfb078774a8f4198d9659431288dd96c867b7325ec3aff4f4a41cfcd253d52b" Dec 03 14:16:53 crc kubenswrapper[4751]: I1203 14:16:53.214107 4751 scope.go:117] "RemoveContainer" containerID="1c3dd1aecf771d3ebbea8d1fdb8ebaefb4152bf809e7e0147856183757f7341f" Dec 03 14:16:53 crc kubenswrapper[4751]: I1203 14:16:53.225404 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jntcp"] Dec 03 14:16:53 crc kubenswrapper[4751]: I1203 14:16:53.229014 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jntcp"] Dec 03 14:16:53 crc kubenswrapper[4751]: I1203 14:16:53.246722 4751 scope.go:117] "RemoveContainer" containerID="b19858cec76c494811fe74ffbba731d5a3552ec75f7c7f65c2275ea0a1d82287" Dec 03 14:16:53 crc kubenswrapper[4751]: E1203 14:16:53.247142 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b19858cec76c494811fe74ffbba731d5a3552ec75f7c7f65c2275ea0a1d82287\": container with ID starting with b19858cec76c494811fe74ffbba731d5a3552ec75f7c7f65c2275ea0a1d82287 not found: ID does not exist" containerID="b19858cec76c494811fe74ffbba731d5a3552ec75f7c7f65c2275ea0a1d82287" Dec 03 14:16:53 crc kubenswrapper[4751]: I1203 14:16:53.247175 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b19858cec76c494811fe74ffbba731d5a3552ec75f7c7f65c2275ea0a1d82287"} err="failed to get container status \"b19858cec76c494811fe74ffbba731d5a3552ec75f7c7f65c2275ea0a1d82287\": rpc error: code = NotFound desc = could not find container \"b19858cec76c494811fe74ffbba731d5a3552ec75f7c7f65c2275ea0a1d82287\": container with ID starting with b19858cec76c494811fe74ffbba731d5a3552ec75f7c7f65c2275ea0a1d82287 not found: ID does not exist" Dec 03 14:16:53 crc kubenswrapper[4751]: I1203 14:16:53.247216 4751 scope.go:117] "RemoveContainer" containerID="ebfb078774a8f4198d9659431288dd96c867b7325ec3aff4f4a41cfcd253d52b" Dec 03 14:16:53 crc kubenswrapper[4751]: E1203 14:16:53.247541 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebfb078774a8f4198d9659431288dd96c867b7325ec3aff4f4a41cfcd253d52b\": container with ID starting with ebfb078774a8f4198d9659431288dd96c867b7325ec3aff4f4a41cfcd253d52b not found: ID does not exist" containerID="ebfb078774a8f4198d9659431288dd96c867b7325ec3aff4f4a41cfcd253d52b" Dec 03 14:16:53 crc kubenswrapper[4751]: I1203 14:16:53.247576 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebfb078774a8f4198d9659431288dd96c867b7325ec3aff4f4a41cfcd253d52b"} err="failed to get container status \"ebfb078774a8f4198d9659431288dd96c867b7325ec3aff4f4a41cfcd253d52b\": rpc error: code = NotFound desc = could not find container \"ebfb078774a8f4198d9659431288dd96c867b7325ec3aff4f4a41cfcd253d52b\": container with ID 
starting with ebfb078774a8f4198d9659431288dd96c867b7325ec3aff4f4a41cfcd253d52b not found: ID does not exist" Dec 03 14:16:53 crc kubenswrapper[4751]: I1203 14:16:53.247598 4751 scope.go:117] "RemoveContainer" containerID="1c3dd1aecf771d3ebbea8d1fdb8ebaefb4152bf809e7e0147856183757f7341f" Dec 03 14:16:53 crc kubenswrapper[4751]: E1203 14:16:53.247831 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c3dd1aecf771d3ebbea8d1fdb8ebaefb4152bf809e7e0147856183757f7341f\": container with ID starting with 1c3dd1aecf771d3ebbea8d1fdb8ebaefb4152bf809e7e0147856183757f7341f not found: ID does not exist" containerID="1c3dd1aecf771d3ebbea8d1fdb8ebaefb4152bf809e7e0147856183757f7341f" Dec 03 14:16:53 crc kubenswrapper[4751]: I1203 14:16:53.247856 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c3dd1aecf771d3ebbea8d1fdb8ebaefb4152bf809e7e0147856183757f7341f"} err="failed to get container status \"1c3dd1aecf771d3ebbea8d1fdb8ebaefb4152bf809e7e0147856183757f7341f\": rpc error: code = NotFound desc = could not find container \"1c3dd1aecf771d3ebbea8d1fdb8ebaefb4152bf809e7e0147856183757f7341f\": container with ID starting with 1c3dd1aecf771d3ebbea8d1fdb8ebaefb4152bf809e7e0147856183757f7341f not found: ID does not exist" Dec 03 14:16:53 crc kubenswrapper[4751]: I1203 14:16:53.315575 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qqbc4" Dec 03 14:16:53 crc kubenswrapper[4751]: I1203 14:16:53.320553 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b17b5e87-a2c1-446f-ac2f-52e7f6b608c0" path="/var/lib/kubelet/pods/b17b5e87-a2c1-446f-ac2f-52e7f6b608c0/volumes" Dec 03 14:16:53 crc kubenswrapper[4751]: I1203 14:16:53.412611 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e46a326-4633-4c4b-aac9-700b969ef961-utilities\") pod \"0e46a326-4633-4c4b-aac9-700b969ef961\" (UID: \"0e46a326-4633-4c4b-aac9-700b969ef961\") " Dec 03 14:16:53 crc kubenswrapper[4751]: I1203 14:16:53.412737 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e46a326-4633-4c4b-aac9-700b969ef961-catalog-content\") pod \"0e46a326-4633-4c4b-aac9-700b969ef961\" (UID: \"0e46a326-4633-4c4b-aac9-700b969ef961\") " Dec 03 14:16:53 crc kubenswrapper[4751]: I1203 14:16:53.412774 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs6sf\" (UniqueName: \"kubernetes.io/projected/0e46a326-4633-4c4b-aac9-700b969ef961-kube-api-access-xs6sf\") pod \"0e46a326-4633-4c4b-aac9-700b969ef961\" (UID: \"0e46a326-4633-4c4b-aac9-700b969ef961\") " Dec 03 14:16:53 crc kubenswrapper[4751]: I1203 14:16:53.414121 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e46a326-4633-4c4b-aac9-700b969ef961-utilities" (OuterVolumeSpecName: "utilities") pod "0e46a326-4633-4c4b-aac9-700b969ef961" (UID: "0e46a326-4633-4c4b-aac9-700b969ef961"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:16:53 crc kubenswrapper[4751]: I1203 14:16:53.418189 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e46a326-4633-4c4b-aac9-700b969ef961-kube-api-access-xs6sf" (OuterVolumeSpecName: "kube-api-access-xs6sf") pod "0e46a326-4633-4c4b-aac9-700b969ef961" (UID: "0e46a326-4633-4c4b-aac9-700b969ef961"). InnerVolumeSpecName "kube-api-access-xs6sf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:16:53 crc kubenswrapper[4751]: I1203 14:16:53.476411 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e46a326-4633-4c4b-aac9-700b969ef961-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e46a326-4633-4c4b-aac9-700b969ef961" (UID: "0e46a326-4633-4c4b-aac9-700b969ef961"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:16:53 crc kubenswrapper[4751]: I1203 14:16:53.514465 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e46a326-4633-4c4b-aac9-700b969ef961-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:16:53 crc kubenswrapper[4751]: I1203 14:16:53.514754 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs6sf\" (UniqueName: \"kubernetes.io/projected/0e46a326-4633-4c4b-aac9-700b969ef961-kube-api-access-xs6sf\") on node \"crc\" DevicePath \"\"" Dec 03 14:16:53 crc kubenswrapper[4751]: I1203 14:16:53.514771 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e46a326-4633-4c4b-aac9-700b969ef961-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:16:54 crc kubenswrapper[4751]: I1203 14:16:54.180168 4751 generic.go:334] "Generic (PLEG): container finished" podID="bddd0747-ddb2-40ec-8912-6941942460bc" 
containerID="5b5d1a794184aa731d7b6a062356a46c59986cd1bcb8cd5ade8d8ea5b9b8797c" exitCode=0 Dec 03 14:16:54 crc kubenswrapper[4751]: I1203 14:16:54.180215 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qrskc" event={"ID":"bddd0747-ddb2-40ec-8912-6941942460bc","Type":"ContainerDied","Data":"5b5d1a794184aa731d7b6a062356a46c59986cd1bcb8cd5ade8d8ea5b9b8797c"} Dec 03 14:16:54 crc kubenswrapper[4751]: I1203 14:16:54.182610 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qqbc4" event={"ID":"0e46a326-4633-4c4b-aac9-700b969ef961","Type":"ContainerDied","Data":"ecfbecd04f4be53aede60a3b82b3cac2062d27163bfc4c4460fb329323de5fb0"} Dec 03 14:16:54 crc kubenswrapper[4751]: I1203 14:16:54.182649 4751 scope.go:117] "RemoveContainer" containerID="592b01200999177b82c61823380ccbfc9ea34d7b236ab81f5ecd0417467b344b" Dec 03 14:16:54 crc kubenswrapper[4751]: I1203 14:16:54.182652 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qqbc4" Dec 03 14:16:54 crc kubenswrapper[4751]: I1203 14:16:54.221213 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qqbc4"] Dec 03 14:16:54 crc kubenswrapper[4751]: I1203 14:16:54.224151 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qqbc4"] Dec 03 14:16:54 crc kubenswrapper[4751]: I1203 14:16:54.273466 4751 scope.go:117] "RemoveContainer" containerID="2b90a3e8d0a954db4446523e5b98e26610b4e9979fb1ef0fe2626ce1fcf09e31" Dec 03 14:16:54 crc kubenswrapper[4751]: I1203 14:16:54.297395 4751 scope.go:117] "RemoveContainer" containerID="5143b34b8a2914f7369992f1349d25227c33d52373093041214c1e7198ba2b6f" Dec 03 14:16:54 crc kubenswrapper[4751]: I1203 14:16:54.866670 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qrskc" Dec 03 14:16:54 crc kubenswrapper[4751]: I1203 14:16:54.933554 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bt5h\" (UniqueName: \"kubernetes.io/projected/bddd0747-ddb2-40ec-8912-6941942460bc-kube-api-access-4bt5h\") pod \"bddd0747-ddb2-40ec-8912-6941942460bc\" (UID: \"bddd0747-ddb2-40ec-8912-6941942460bc\") " Dec 03 14:16:54 crc kubenswrapper[4751]: I1203 14:16:54.933672 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bddd0747-ddb2-40ec-8912-6941942460bc-catalog-content\") pod \"bddd0747-ddb2-40ec-8912-6941942460bc\" (UID: \"bddd0747-ddb2-40ec-8912-6941942460bc\") " Dec 03 14:16:54 crc kubenswrapper[4751]: I1203 14:16:54.933699 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bddd0747-ddb2-40ec-8912-6941942460bc-utilities\") pod \"bddd0747-ddb2-40ec-8912-6941942460bc\" (UID: \"bddd0747-ddb2-40ec-8912-6941942460bc\") " Dec 03 14:16:54 crc kubenswrapper[4751]: I1203 14:16:54.934673 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bddd0747-ddb2-40ec-8912-6941942460bc-utilities" (OuterVolumeSpecName: "utilities") pod "bddd0747-ddb2-40ec-8912-6941942460bc" (UID: "bddd0747-ddb2-40ec-8912-6941942460bc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:16:54 crc kubenswrapper[4751]: I1203 14:16:54.940452 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bddd0747-ddb2-40ec-8912-6941942460bc-kube-api-access-4bt5h" (OuterVolumeSpecName: "kube-api-access-4bt5h") pod "bddd0747-ddb2-40ec-8912-6941942460bc" (UID: "bddd0747-ddb2-40ec-8912-6941942460bc"). InnerVolumeSpecName "kube-api-access-4bt5h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:16:54 crc kubenswrapper[4751]: I1203 14:16:54.963839 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bddd0747-ddb2-40ec-8912-6941942460bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bddd0747-ddb2-40ec-8912-6941942460bc" (UID: "bddd0747-ddb2-40ec-8912-6941942460bc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:16:55 crc kubenswrapper[4751]: I1203 14:16:55.035470 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bt5h\" (UniqueName: \"kubernetes.io/projected/bddd0747-ddb2-40ec-8912-6941942460bc-kube-api-access-4bt5h\") on node \"crc\" DevicePath \"\"" Dec 03 14:16:55 crc kubenswrapper[4751]: I1203 14:16:55.035506 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bddd0747-ddb2-40ec-8912-6941942460bc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:16:55 crc kubenswrapper[4751]: I1203 14:16:55.035516 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bddd0747-ddb2-40ec-8912-6941942460bc-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:16:55 crc kubenswrapper[4751]: I1203 14:16:55.191585 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qrskc" event={"ID":"bddd0747-ddb2-40ec-8912-6941942460bc","Type":"ContainerDied","Data":"76dde61231b543f2da26493e73588b522f5e9f49444224ef8e94503f7c5d4119"} Dec 03 14:16:55 crc kubenswrapper[4751]: I1203 14:16:55.191642 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qrskc" Dec 03 14:16:55 crc kubenswrapper[4751]: I1203 14:16:55.191647 4751 scope.go:117] "RemoveContainer" containerID="5b5d1a794184aa731d7b6a062356a46c59986cd1bcb8cd5ade8d8ea5b9b8797c" Dec 03 14:16:55 crc kubenswrapper[4751]: I1203 14:16:55.209322 4751 scope.go:117] "RemoveContainer" containerID="4f912707255eb54b518fb0e81905e4dfff0a67c48b7b8a1bf835126d2465f545" Dec 03 14:16:55 crc kubenswrapper[4751]: I1203 14:16:55.227516 4751 scope.go:117] "RemoveContainer" containerID="7b5ca8fe3de26b8a3d3bb6dc37734183a4406d5d4c30c0f1be2eecd5f4e9a5d1" Dec 03 14:16:55 crc kubenswrapper[4751]: I1203 14:16:55.229506 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qrskc"] Dec 03 14:16:55 crc kubenswrapper[4751]: I1203 14:16:55.232698 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qrskc"] Dec 03 14:16:55 crc kubenswrapper[4751]: I1203 14:16:55.322258 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e46a326-4633-4c4b-aac9-700b969ef961" path="/var/lib/kubelet/pods/0e46a326-4633-4c4b-aac9-700b969ef961/volumes" Dec 03 14:16:55 crc kubenswrapper[4751]: I1203 14:16:55.323941 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bddd0747-ddb2-40ec-8912-6941942460bc" path="/var/lib/kubelet/pods/bddd0747-ddb2-40ec-8912-6941942460bc/volumes" Dec 03 14:16:55 crc kubenswrapper[4751]: I1203 14:16:55.457860 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lskvp"] Dec 03 14:16:55 crc kubenswrapper[4751]: I1203 14:16:55.458352 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lskvp" podUID="b3328e22-4ad1-4815-b477-b015fc4dcf27" containerName="registry-server" containerID="cri-o://436715558f2d0b8530968160c90b614d18e6e462e186cd4aa8baac3423cd960b" gracePeriod=2 
Dec 03 14:16:56 crc kubenswrapper[4751]: I1203 14:16:56.207151 4751 generic.go:334] "Generic (PLEG): container finished" podID="b3328e22-4ad1-4815-b477-b015fc4dcf27" containerID="436715558f2d0b8530968160c90b614d18e6e462e186cd4aa8baac3423cd960b" exitCode=0
Dec 03 14:16:56 crc kubenswrapper[4751]: I1203 14:16:56.207270 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lskvp" event={"ID":"b3328e22-4ad1-4815-b477-b015fc4dcf27","Type":"ContainerDied","Data":"436715558f2d0b8530968160c90b614d18e6e462e186cd4aa8baac3423cd960b"}
Dec 03 14:16:56 crc kubenswrapper[4751]: I1203 14:16:56.331386 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lskvp"
Dec 03 14:16:56 crc kubenswrapper[4751]: I1203 14:16:56.451722 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shdfn\" (UniqueName: \"kubernetes.io/projected/b3328e22-4ad1-4815-b477-b015fc4dcf27-kube-api-access-shdfn\") pod \"b3328e22-4ad1-4815-b477-b015fc4dcf27\" (UID: \"b3328e22-4ad1-4815-b477-b015fc4dcf27\") "
Dec 03 14:16:56 crc kubenswrapper[4751]: I1203 14:16:56.451783 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3328e22-4ad1-4815-b477-b015fc4dcf27-utilities\") pod \"b3328e22-4ad1-4815-b477-b015fc4dcf27\" (UID: \"b3328e22-4ad1-4815-b477-b015fc4dcf27\") "
Dec 03 14:16:56 crc kubenswrapper[4751]: I1203 14:16:56.451877 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3328e22-4ad1-4815-b477-b015fc4dcf27-catalog-content\") pod \"b3328e22-4ad1-4815-b477-b015fc4dcf27\" (UID: \"b3328e22-4ad1-4815-b477-b015fc4dcf27\") "
Dec 03 14:16:56 crc kubenswrapper[4751]: I1203 14:16:56.452883 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3328e22-4ad1-4815-b477-b015fc4dcf27-utilities" (OuterVolumeSpecName: "utilities") pod "b3328e22-4ad1-4815-b477-b015fc4dcf27" (UID: "b3328e22-4ad1-4815-b477-b015fc4dcf27"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 14:16:56 crc kubenswrapper[4751]: I1203 14:16:56.461204 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3328e22-4ad1-4815-b477-b015fc4dcf27-kube-api-access-shdfn" (OuterVolumeSpecName: "kube-api-access-shdfn") pod "b3328e22-4ad1-4815-b477-b015fc4dcf27" (UID: "b3328e22-4ad1-4815-b477-b015fc4dcf27"). InnerVolumeSpecName "kube-api-access-shdfn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 14:16:56 crc kubenswrapper[4751]: I1203 14:16:56.553649 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shdfn\" (UniqueName: \"kubernetes.io/projected/b3328e22-4ad1-4815-b477-b015fc4dcf27-kube-api-access-shdfn\") on node \"crc\" DevicePath \"\""
Dec 03 14:16:56 crc kubenswrapper[4751]: I1203 14:16:56.554345 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3328e22-4ad1-4815-b477-b015fc4dcf27-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 14:16:56 crc kubenswrapper[4751]: I1203 14:16:56.585999 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3328e22-4ad1-4815-b477-b015fc4dcf27-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3328e22-4ad1-4815-b477-b015fc4dcf27" (UID: "b3328e22-4ad1-4815-b477-b015fc4dcf27"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 14:16:56 crc kubenswrapper[4751]: I1203 14:16:56.655906 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3328e22-4ad1-4815-b477-b015fc4dcf27-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 14:16:57 crc kubenswrapper[4751]: I1203 14:16:57.217709 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lskvp" event={"ID":"b3328e22-4ad1-4815-b477-b015fc4dcf27","Type":"ContainerDied","Data":"27b59928f94d9cb2a9a43438b54439dc305d03ceebfb0d40b595dbe49540cfed"}
Dec 03 14:16:57 crc kubenswrapper[4751]: I1203 14:16:57.218203 4751 scope.go:117] "RemoveContainer" containerID="436715558f2d0b8530968160c90b614d18e6e462e186cd4aa8baac3423cd960b"
Dec 03 14:16:57 crc kubenswrapper[4751]: I1203 14:16:57.217755 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lskvp"
Dec 03 14:16:57 crc kubenswrapper[4751]: I1203 14:16:57.241950 4751 scope.go:117] "RemoveContainer" containerID="a7165d57db379a3deeea10e5b30558a5516336878b851d99b2063501ebdecc94"
Dec 03 14:16:57 crc kubenswrapper[4751]: I1203 14:16:57.258462 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lskvp"]
Dec 03 14:16:57 crc kubenswrapper[4751]: I1203 14:16:57.259398 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lskvp"]
Dec 03 14:16:57 crc kubenswrapper[4751]: I1203 14:16:57.278304 4751 scope.go:117] "RemoveContainer" containerID="001a2ff547e688b3191e9c60fc2b6fae094ac608a5de278ed037761bbdd29d13"
Dec 03 14:16:57 crc kubenswrapper[4751]: I1203 14:16:57.323112 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3328e22-4ad1-4815-b477-b015fc4dcf27" path="/var/lib/kubelet/pods/b3328e22-4ad1-4815-b477-b015fc4dcf27/volumes"
Dec 03 14:16:58 crc kubenswrapper[4751]: I1203 14:16:58.819987 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zcq45"]
Dec 03 14:17:05 crc kubenswrapper[4751]: I1203 14:17:05.820283 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 14:17:05 crc kubenswrapper[4751]: I1203 14:17:05.820852 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 14:17:05 crc kubenswrapper[4751]: I1203 14:17:05.821272 4751 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-djf67"
Dec 03 14:17:05 crc kubenswrapper[4751]: I1203 14:17:05.821919 4751 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d"} pod="openshift-machine-config-operator/machine-config-daemon-djf67" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 03 14:17:05 crc kubenswrapper[4751]: I1203 14:17:05.822008 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" containerID="cri-o://0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d" gracePeriod=600
Dec 03 14:17:06 crc kubenswrapper[4751]: I1203 14:17:06.280147 4751 generic.go:334] "Generic (PLEG): container finished" podID="385620eb-744d-423e-b02b-1274f3075689" containerID="0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d" exitCode=0
Dec 03 14:17:06 crc kubenswrapper[4751]: I1203 14:17:06.280208 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerDied","Data":"0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d"}
Dec 03 14:17:06 crc kubenswrapper[4751]: I1203 14:17:06.280699 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerStarted","Data":"2297df34a8e0def51a3b8b80a4b2f09fb12dbe5d21891df62cf314ccbd2348b9"}
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.660690 4751 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Dec 03 14:17:10 crc kubenswrapper[4751]: E1203 14:17:10.661452 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b17b5e87-a2c1-446f-ac2f-52e7f6b608c0" containerName="extract-content"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.661472 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="b17b5e87-a2c1-446f-ac2f-52e7f6b608c0" containerName="extract-content"
Dec 03 14:17:10 crc kubenswrapper[4751]: E1203 14:17:10.661486 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e46a326-4633-4c4b-aac9-700b969ef961" containerName="extract-utilities"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.661495 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e46a326-4633-4c4b-aac9-700b969ef961" containerName="extract-utilities"
Dec 03 14:17:10 crc kubenswrapper[4751]: E1203 14:17:10.661506 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bddd0747-ddb2-40ec-8912-6941942460bc" containerName="extract-utilities"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.661513 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="bddd0747-ddb2-40ec-8912-6941942460bc" containerName="extract-utilities"
Dec 03 14:17:10 crc kubenswrapper[4751]: E1203 14:17:10.661527 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b17b5e87-a2c1-446f-ac2f-52e7f6b608c0" containerName="extract-utilities"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.661535 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="b17b5e87-a2c1-446f-ac2f-52e7f6b608c0" containerName="extract-utilities"
Dec 03 14:17:10 crc kubenswrapper[4751]: E1203 14:17:10.661547 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3328e22-4ad1-4815-b477-b015fc4dcf27" containerName="extract-utilities"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.661555 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3328e22-4ad1-4815-b477-b015fc4dcf27" containerName="extract-utilities"
Dec 03 14:17:10 crc kubenswrapper[4751]: E1203 14:17:10.661565 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e46a326-4633-4c4b-aac9-700b969ef961" containerName="extract-content"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.661572 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e46a326-4633-4c4b-aac9-700b969ef961" containerName="extract-content"
Dec 03 14:17:10 crc kubenswrapper[4751]: E1203 14:17:10.661580 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e46a326-4633-4c4b-aac9-700b969ef961" containerName="registry-server"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.661587 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e46a326-4633-4c4b-aac9-700b969ef961" containerName="registry-server"
Dec 03 14:17:10 crc kubenswrapper[4751]: E1203 14:17:10.661598 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b17b5e87-a2c1-446f-ac2f-52e7f6b608c0" containerName="registry-server"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.661605 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="b17b5e87-a2c1-446f-ac2f-52e7f6b608c0" containerName="registry-server"
Dec 03 14:17:10 crc kubenswrapper[4751]: E1203 14:17:10.661615 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3328e22-4ad1-4815-b477-b015fc4dcf27" containerName="registry-server"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.661622 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3328e22-4ad1-4815-b477-b015fc4dcf27" containerName="registry-server"
Dec 03 14:17:10 crc kubenswrapper[4751]: E1203 14:17:10.661634 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bddd0747-ddb2-40ec-8912-6941942460bc" containerName="registry-server"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.661641 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="bddd0747-ddb2-40ec-8912-6941942460bc" containerName="registry-server"
Dec 03 14:17:10 crc kubenswrapper[4751]: E1203 14:17:10.661653 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3328e22-4ad1-4815-b477-b015fc4dcf27" containerName="extract-content"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.661661 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3328e22-4ad1-4815-b477-b015fc4dcf27" containerName="extract-content"
Dec 03 14:17:10 crc kubenswrapper[4751]: E1203 14:17:10.661706 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bddd0747-ddb2-40ec-8912-6941942460bc" containerName="extract-content"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.661716 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="bddd0747-ddb2-40ec-8912-6941942460bc" containerName="extract-content"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.661823 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3328e22-4ad1-4815-b477-b015fc4dcf27" containerName="registry-server"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.661841 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="bddd0747-ddb2-40ec-8912-6941942460bc" containerName="registry-server"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.661849 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e46a326-4633-4c4b-aac9-700b969ef961" containerName="registry-server"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.661861 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="b17b5e87-a2c1-446f-ac2f-52e7f6b608c0" containerName="registry-server"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.662313 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.663157 4751 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.663447 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2" gracePeriod=15
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.663507 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30" gracePeriod=15
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.663520 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38" gracePeriod=15
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.663467 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459" gracePeriod=15
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.663505 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0" gracePeriod=15
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.663864 4751 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Dec 03 14:17:10 crc kubenswrapper[4751]: E1203 14:17:10.663982 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.663996 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Dec 03 14:17:10 crc kubenswrapper[4751]: E1203 14:17:10.664006 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.664013 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 03 14:17:10 crc kubenswrapper[4751]: E1203 14:17:10.664020 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.664026 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Dec 03 14:17:10 crc kubenswrapper[4751]: E1203 14:17:10.664034 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.664039 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 03 14:17:10 crc kubenswrapper[4751]: E1203 14:17:10.664051 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.664057 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Dec 03 14:17:10 crc kubenswrapper[4751]: E1203 14:17:10.664064 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.664070 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Dec 03 14:17:10 crc kubenswrapper[4751]: E1203 14:17:10.664081 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.664087 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.664160 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.664168 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.664176 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.664183 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.664190 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.664199 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.739438 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.739500 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.739530 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.739567 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.739690 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.739734 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.739782 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.739816 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.841435 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.841803 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.841836 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.841860 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.841941 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.841983 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.842010 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.842046 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.842126 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.841598 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.842193 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.842223 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.842251 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.842281 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.842312 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 14:17:10 crc kubenswrapper[4751]: I1203 14:17:10.842381 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 14:17:11 crc kubenswrapper[4751]: I1203 14:17:11.322804 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Dec 03 14:17:11 crc kubenswrapper[4751]: I1203 14:17:11.325129 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Dec 03 14:17:11 crc kubenswrapper[4751]: I1203 14:17:11.325721 4751 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0" exitCode=0
Dec 03 14:17:11 crc kubenswrapper[4751]: I1203 14:17:11.325760 4751 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2" exitCode=0
Dec 03 14:17:11 crc kubenswrapper[4751]: I1203 14:17:11.325770 4751 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459" exitCode=0
Dec 03 14:17:11 crc kubenswrapper[4751]: I1203 14:17:11.325780 4751 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38" exitCode=2
Dec 03 14:17:11 crc kubenswrapper[4751]: I1203 14:17:11.325844 4751 scope.go:117] "RemoveContainer" containerID="c160ce71eb4bf9b824cb087e235c49606f02ef09cf931262a731eb54eec7e5a3"
Dec 03 14:17:11 crc kubenswrapper[4751]: I1203 14:17:11.329512 4751 generic.go:334] "Generic (PLEG): container finished" podID="54a05958-b823-4796-992b-caed2f6e8f2e" containerID="1f500512155ede48d6956451ab6010ec2559907278a7fea1ab0a624979bbdbd7" exitCode=0
Dec 03 14:17:11 crc kubenswrapper[4751]: I1203 14:17:11.329545 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"54a05958-b823-4796-992b-caed2f6e8f2e","Type":"ContainerDied","Data":"1f500512155ede48d6956451ab6010ec2559907278a7fea1ab0a624979bbdbd7"}
Dec 03 14:17:11 crc kubenswrapper[4751]: I1203 14:17:11.330151 4751 status_manager.go:851] "Failed to get status for pod" podUID="54a05958-b823-4796-992b-caed2f6e8f2e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 14:17:12 crc kubenswrapper[4751]: I1203 14:17:12.340395 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Dec 03 14:17:12 crc kubenswrapper[4751]: I1203 14:17:12.588682 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 14:17:12 crc kubenswrapper[4751]: I1203 14:17:12.589453 4751 status_manager.go:851] "Failed to get status for pod" podUID="54a05958-b823-4796-992b-caed2f6e8f2e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 14:17:12 crc kubenswrapper[4751]: I1203 14:17:12.666299 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54a05958-b823-4796-992b-caed2f6e8f2e-kubelet-dir\") pod \"54a05958-b823-4796-992b-caed2f6e8f2e\" (UID: \"54a05958-b823-4796-992b-caed2f6e8f2e\") "
Dec 03 14:17:12 crc kubenswrapper[4751]: I1203 14:17:12.666414 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54a05958-b823-4796-992b-caed2f6e8f2e-kube-api-access\") pod \"54a05958-b823-4796-992b-caed2f6e8f2e\" (UID: \"54a05958-b823-4796-992b-caed2f6e8f2e\") "
Dec 03 14:17:12 crc kubenswrapper[4751]: I1203 14:17:12.666449 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/54a05958-b823-4796-992b-caed2f6e8f2e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "54a05958-b823-4796-992b-caed2f6e8f2e" (UID: "54a05958-b823-4796-992b-caed2f6e8f2e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 14:17:12 crc kubenswrapper[4751]: I1203 14:17:12.666468 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/54a05958-b823-4796-992b-caed2f6e8f2e-var-lock\") pod \"54a05958-b823-4796-992b-caed2f6e8f2e\" (UID: \"54a05958-b823-4796-992b-caed2f6e8f2e\") "
Dec 03 14:17:12 crc kubenswrapper[4751]: I1203 14:17:12.666511 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/54a05958-b823-4796-992b-caed2f6e8f2e-var-lock" (OuterVolumeSpecName: "var-lock") pod "54a05958-b823-4796-992b-caed2f6e8f2e" (UID: "54a05958-b823-4796-992b-caed2f6e8f2e"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 14:17:12 crc kubenswrapper[4751]: I1203 14:17:12.667082 4751 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/54a05958-b823-4796-992b-caed2f6e8f2e-var-lock\") on node \"crc\" DevicePath \"\""
Dec 03 14:17:12 crc kubenswrapper[4751]: I1203 14:17:12.667110 4751 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54a05958-b823-4796-992b-caed2f6e8f2e-kubelet-dir\") on node \"crc\" DevicePath \"\""
Dec 03 14:17:12 crc kubenswrapper[4751]: I1203 14:17:12.675408 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54a05958-b823-4796-992b-caed2f6e8f2e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "54a05958-b823-4796-992b-caed2f6e8f2e" (UID: "54a05958-b823-4796-992b-caed2f6e8f2e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 14:17:12 crc kubenswrapper[4751]: I1203 14:17:12.768267 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54a05958-b823-4796-992b-caed2f6e8f2e-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.035781 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.037035 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.037543 4751 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.037768 4751 status_manager.go:851] "Failed to get status for pod" podUID="54a05958-b823-4796-992b-caed2f6e8f2e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused"
Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.071442 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.071545 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for
volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.071562 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.071576 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.071612 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.071638 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.072046 4751 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.072064 4751 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.072073 4751 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.321494 4751 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.321866 4751 status_manager.go:851] "Failed to get status for pod" podUID="54a05958-b823-4796-992b-caed2f6e8f2e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.325249 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.364439 4751 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30" exitCode=0 Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.364582 4751 scope.go:117] "RemoveContainer" containerID="bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.364743 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.366010 4751 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.366382 4751 status_manager.go:851] "Failed to get status for pod" podUID="54a05958-b823-4796-992b-caed2f6e8f2e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.368852 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"54a05958-b823-4796-992b-caed2f6e8f2e","Type":"ContainerDied","Data":"5625afa410b39ce6c68c6ea7c4cda7a14b12550dd7f0a021f3e9b336230c6af5"} Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.368899 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5625afa410b39ce6c68c6ea7c4cda7a14b12550dd7f0a021f3e9b336230c6af5" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.368958 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.369877 4751 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.370198 4751 status_manager.go:851] "Failed to get status for pod" podUID="54a05958-b823-4796-992b-caed2f6e8f2e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 14:17:13 crc kubenswrapper[4751]: E1203 14:17:13.370368 4751 kuberuntime_gc.go:389] "Failed to remove container log dead symlink" err="remove /var/log/containers/kube-apiserver-crc_openshift-kube-apiserver_kube-apiserver-check-endpoints-bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0.log: no such file or directory" path="/var/log/containers/kube-apiserver-crc_openshift-kube-apiserver_kube-apiserver-check-endpoints-bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0.log" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.375290 4751 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.376012 4751 status_manager.go:851] "Failed to get status for pod" podUID="54a05958-b823-4796-992b-caed2f6e8f2e" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.387729 4751 scope.go:117] "RemoveContainer" containerID="d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.404699 4751 scope.go:117] "RemoveContainer" containerID="471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.424957 4751 scope.go:117] "RemoveContainer" containerID="0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.441635 4751 scope.go:117] "RemoveContainer" containerID="bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.464971 4751 scope.go:117] "RemoveContainer" containerID="9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.483834 4751 scope.go:117] "RemoveContainer" containerID="bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0" Dec 03 14:17:13 crc kubenswrapper[4751]: E1203 14:17:13.484518 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\": container with ID starting with bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0 not found: ID does not exist" containerID="bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.484579 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0"} err="failed to get container status 
\"bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\": rpc error: code = NotFound desc = could not find container \"bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0\": container with ID starting with bb81af86440abbf1582a83215077729dc944dba9aac385128d091289dfbd6bd0 not found: ID does not exist" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.484695 4751 scope.go:117] "RemoveContainer" containerID="d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2" Dec 03 14:17:13 crc kubenswrapper[4751]: E1203 14:17:13.485163 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\": container with ID starting with d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2 not found: ID does not exist" containerID="d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.485228 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2"} err="failed to get container status \"d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\": rpc error: code = NotFound desc = could not find container \"d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2\": container with ID starting with d7ac77e241e7f3041c246954e1df76562e17ad9afb82a164004ccb948e99faa2 not found: ID does not exist" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.485270 4751 scope.go:117] "RemoveContainer" containerID="471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459" Dec 03 14:17:13 crc kubenswrapper[4751]: E1203 14:17:13.485629 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\": container with ID starting with 471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459 not found: ID does not exist" containerID="471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.485776 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459"} err="failed to get container status \"471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\": rpc error: code = NotFound desc = could not find container \"471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459\": container with ID starting with 471f848e3fb6cf16ed11aaa3f06f358f541b5ffa57e9236491efd60b7be40459 not found: ID does not exist" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.485805 4751 scope.go:117] "RemoveContainer" containerID="0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38" Dec 03 14:17:13 crc kubenswrapper[4751]: E1203 14:17:13.486124 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\": container with ID starting with 0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38 not found: ID does not exist" containerID="0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.486164 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38"} err="failed to get container status \"0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\": rpc error: code = NotFound desc = could not find container \"0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38\": container with ID 
starting with 0d11bbc30f7d5a965fc27d9069bfa279363847a19fee54a39f6ad00183a10a38 not found: ID does not exist" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.486190 4751 scope.go:117] "RemoveContainer" containerID="bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30" Dec 03 14:17:13 crc kubenswrapper[4751]: E1203 14:17:13.486481 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\": container with ID starting with bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30 not found: ID does not exist" containerID="bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.486510 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30"} err="failed to get container status \"bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\": rpc error: code = NotFound desc = could not find container \"bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30\": container with ID starting with bc3dc707ccc79e932dd74ff73ffec8120da9881adea5f5dd04c6375587105c30 not found: ID does not exist" Dec 03 14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.486527 4751 scope.go:117] "RemoveContainer" containerID="9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f" Dec 03 14:17:13 crc kubenswrapper[4751]: E1203 14:17:13.486864 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\": container with ID starting with 9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f not found: ID does not exist" containerID="9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f" Dec 03 
14:17:13 crc kubenswrapper[4751]: I1203 14:17:13.486912 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f"} err="failed to get container status \"9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\": rpc error: code = NotFound desc = could not find container \"9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f\": container with ID starting with 9c167c5065810c80da4a102711c6e6a96c33a06558c2e7d3477b67676c250c2f not found: ID does not exist" Dec 03 14:17:15 crc kubenswrapper[4751]: E1203 14:17:15.690791 4751 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.110:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 14:17:15 crc kubenswrapper[4751]: I1203 14:17:15.691875 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 14:17:15 crc kubenswrapper[4751]: W1203 14:17:15.712690 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-ca2a88cdc3846c30cdb0434883f450ea6c7deb22361f02b0415ed42f68f03b9c WatchSource:0}: Error finding container ca2a88cdc3846c30cdb0434883f450ea6c7deb22361f02b0415ed42f68f03b9c: Status 404 returned error can't find the container with id ca2a88cdc3846c30cdb0434883f450ea6c7deb22361f02b0415ed42f68f03b9c Dec 03 14:17:15 crc kubenswrapper[4751]: E1203 14:17:15.718694 4751 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.110:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187dba4547acec12 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 14:17:15.717127186 +0000 UTC m=+242.705482413,LastTimestamp:2025-12-03 14:17:15.717127186 +0000 UTC m=+242.705482413,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 14:17:16 crc kubenswrapper[4751]: I1203 14:17:16.390450 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"77d124d518cecdfb40a83bbec6688e64a24862cac5c97e2daec9417367e88820"} Dec 03 14:17:16 crc kubenswrapper[4751]: I1203 14:17:16.390784 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"ca2a88cdc3846c30cdb0434883f450ea6c7deb22361f02b0415ed42f68f03b9c"} Dec 03 14:17:16 crc kubenswrapper[4751]: I1203 14:17:16.391393 4751 status_manager.go:851] "Failed to get status for pod" podUID="54a05958-b823-4796-992b-caed2f6e8f2e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 14:17:16 crc kubenswrapper[4751]: E1203 14:17:16.391423 4751 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.110:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 14:17:18 crc kubenswrapper[4751]: E1203 14:17:18.058718 4751 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.110:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187dba4547acec12 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 14:17:15.717127186 +0000 UTC m=+242.705482413,LastTimestamp:2025-12-03 14:17:15.717127186 +0000 UTC m=+242.705482413,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 14:17:19 crc kubenswrapper[4751]: E1203 14:17:19.145211 4751 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 14:17:19 crc kubenswrapper[4751]: E1203 14:17:19.146784 4751 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 14:17:19 crc kubenswrapper[4751]: E1203 14:17:19.152201 4751 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 14:17:19 crc kubenswrapper[4751]: E1203 14:17:19.152848 4751 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 14:17:19 crc kubenswrapper[4751]: E1203 14:17:19.153412 4751 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 14:17:19 crc 
kubenswrapper[4751]: I1203 14:17:19.153478 4751 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 03 14:17:19 crc kubenswrapper[4751]: E1203 14:17:19.153912 4751 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="200ms" Dec 03 14:17:19 crc kubenswrapper[4751]: E1203 14:17:19.355485 4751 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="400ms" Dec 03 14:17:19 crc kubenswrapper[4751]: E1203 14:17:19.756478 4751 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="800ms" Dec 03 14:17:20 crc kubenswrapper[4751]: E1203 14:17:20.557491 4751 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="1.6s" Dec 03 14:17:22 crc kubenswrapper[4751]: E1203 14:17:22.161024 4751 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="3.2s" Dec 03 14:17:23 crc kubenswrapper[4751]: I1203 14:17:23.318621 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:17:23 crc kubenswrapper[4751]: I1203 14:17:23.318652 4751 status_manager.go:851] "Failed to get status for pod" podUID="54a05958-b823-4796-992b-caed2f6e8f2e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 14:17:23 crc kubenswrapper[4751]: I1203 14:17:23.319720 4751 status_manager.go:851] "Failed to get status for pod" podUID="54a05958-b823-4796-992b-caed2f6e8f2e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 14:17:23 crc kubenswrapper[4751]: I1203 14:17:23.343809 4751 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="088623b3-b683-467a-88db-985af50fb182" Dec 03 14:17:23 crc kubenswrapper[4751]: I1203 14:17:23.343845 4751 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="088623b3-b683-467a-88db-985af50fb182" Dec 03 14:17:23 crc kubenswrapper[4751]: E1203 14:17:23.344425 4751 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:17:23 crc kubenswrapper[4751]: I1203 14:17:23.345130 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:17:23 crc kubenswrapper[4751]: I1203 14:17:23.848891 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7c4241ac08be77ad1f84cf92c85dc9e8abdfa573a739dff63755ba42acea76eb"} Dec 03 14:17:23 crc kubenswrapper[4751]: I1203 14:17:23.855568 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" podUID="dd4311c3-0b8c-4ad2-8b36-1bc543c188d3" containerName="oauth-openshift" containerID="cri-o://ba7ff5d0ec2deb05e75392ed9e9bc9d8532cc403301ced860ecc0e505e4101ed" gracePeriod=15 Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.176068 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.177100 4751 status_manager.go:851] "Failed to get status for pod" podUID="dd4311c3-0b8c-4ad2-8b36-1bc543c188d3" pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-zcq45\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.177595 4751 status_manager.go:851] "Failed to get status for pod" podUID="54a05958-b823-4796-992b-caed2f6e8f2e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.276914 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-router-certs\") pod \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.277018 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-cliconfig\") pod \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.277057 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-ocp-branding-template\") pod \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.277141 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-template-login\") pod \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.277181 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-audit-dir\") pod \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.277216 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-audit-policies\") pod 
\"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.277274 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-service-ca\") pod \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.277299 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3" (UID: "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.277313 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-template-provider-selection\") pod \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.277462 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-session\") pod \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.277505 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-idp-0-file-data\") pod 
\"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.277527 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kr7rd\" (UniqueName: \"kubernetes.io/projected/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-kube-api-access-kr7rd\") pod \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.277597 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-serving-cert\") pod \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.277618 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-trusted-ca-bundle\") pod \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.277655 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-template-error\") pod \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\" (UID: \"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3\") " Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.277971 4751 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.278189 4751 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3" (UID: "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.278569 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3" (UID: "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.278593 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3" (UID: "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.279503 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3" (UID: "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.282743 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3" (UID: "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.282836 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3" (UID: "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.282881 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-kube-api-access-kr7rd" (OuterVolumeSpecName: "kube-api-access-kr7rd") pod "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3" (UID: "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3"). InnerVolumeSpecName "kube-api-access-kr7rd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.283381 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3" (UID: "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.283415 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3" (UID: "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.283626 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3" (UID: "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.283755 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3" (UID: "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.283818 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3" (UID: "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.284049 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3" (UID: "dd4311c3-0b8c-4ad2-8b36-1bc543c188d3"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.378500 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.378536 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.378549 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.378564 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kr7rd\" (UniqueName: \"kubernetes.io/projected/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-kube-api-access-kr7rd\") on node \"crc\" DevicePath \"\"" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.378575 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.378585 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.378595 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.378610 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.378621 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.378631 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.378642 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 
14:17:24.378652 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.378662 4751 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.860056 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.860143 4751 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a" exitCode=1 Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.860200 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a"} Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.861050 4751 scope.go:117] "RemoveContainer" containerID="68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.861254 4751 status_manager.go:851] "Failed to get status for pod" podUID="54a05958-b823-4796-992b-caed2f6e8f2e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.861672 4751 
status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.862104 4751 status_manager.go:851] "Failed to get status for pod" podUID="dd4311c3-0b8c-4ad2-8b36-1bc543c188d3" pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-zcq45\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.862636 4751 generic.go:334] "Generic (PLEG): container finished" podID="dd4311c3-0b8c-4ad2-8b36-1bc543c188d3" containerID="ba7ff5d0ec2deb05e75392ed9e9bc9d8532cc403301ced860ecc0e505e4101ed" exitCode=0 Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.862695 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.862725 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" event={"ID":"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3","Type":"ContainerDied","Data":"ba7ff5d0ec2deb05e75392ed9e9bc9d8532cc403301ced860ecc0e505e4101ed"} Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.862767 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" event={"ID":"dd4311c3-0b8c-4ad2-8b36-1bc543c188d3","Type":"ContainerDied","Data":"9646b47a11ac942ef798f097d6c8ea0f91ad706a71db0c1cc0b89a51271300a2"} Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.862793 4751 scope.go:117] "RemoveContainer" containerID="ba7ff5d0ec2deb05e75392ed9e9bc9d8532cc403301ced860ecc0e505e4101ed" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.865052 4751 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="0f99a0a62f28ee7d95211777d0d34b1e07ecfc0f1a34555f382248a908960fc6" exitCode=0 Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.865063 4751 status_manager.go:851] "Failed to get status for pod" podUID="54a05958-b823-4796-992b-caed2f6e8f2e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.865104 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"0f99a0a62f28ee7d95211777d0d34b1e07ecfc0f1a34555f382248a908960fc6"} Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.866129 4751 kubelet.go:1909] "Trying to delete pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="088623b3-b683-467a-88db-985af50fb182" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.866166 4751 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="088623b3-b683-467a-88db-985af50fb182" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.866646 4751 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 14:17:24 crc kubenswrapper[4751]: E1203 14:17:24.866990 4751 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.867172 4751 status_manager.go:851] "Failed to get status for pod" podUID="dd4311c3-0b8c-4ad2-8b36-1bc543c188d3" pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-zcq45\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.867851 4751 status_manager.go:851] "Failed to get status for pod" podUID="54a05958-b823-4796-992b-caed2f6e8f2e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.868649 4751 status_manager.go:851] "Failed to get status for pod" 
podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.868992 4751 status_manager.go:851] "Failed to get status for pod" podUID="dd4311c3-0b8c-4ad2-8b36-1bc543c188d3" pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-zcq45\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.887601 4751 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.888409 4751 status_manager.go:851] "Failed to get status for pod" podUID="dd4311c3-0b8c-4ad2-8b36-1bc543c188d3" pod="openshift-authentication/oauth-openshift-558db77b4-zcq45" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-zcq45\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.888930 4751 status_manager.go:851] "Failed to get status for pod" podUID="54a05958-b823-4796-992b-caed2f6e8f2e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 
14:17:24.898932 4751 scope.go:117] "RemoveContainer" containerID="ba7ff5d0ec2deb05e75392ed9e9bc9d8532cc403301ced860ecc0e505e4101ed" Dec 03 14:17:24 crc kubenswrapper[4751]: E1203 14:17:24.900059 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba7ff5d0ec2deb05e75392ed9e9bc9d8532cc403301ced860ecc0e505e4101ed\": container with ID starting with ba7ff5d0ec2deb05e75392ed9e9bc9d8532cc403301ced860ecc0e505e4101ed not found: ID does not exist" containerID="ba7ff5d0ec2deb05e75392ed9e9bc9d8532cc403301ced860ecc0e505e4101ed" Dec 03 14:17:24 crc kubenswrapper[4751]: I1203 14:17:24.900098 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba7ff5d0ec2deb05e75392ed9e9bc9d8532cc403301ced860ecc0e505e4101ed"} err="failed to get container status \"ba7ff5d0ec2deb05e75392ed9e9bc9d8532cc403301ced860ecc0e505e4101ed\": rpc error: code = NotFound desc = could not find container \"ba7ff5d0ec2deb05e75392ed9e9bc9d8532cc403301ced860ecc0e505e4101ed\": container with ID starting with ba7ff5d0ec2deb05e75392ed9e9bc9d8532cc403301ced860ecc0e505e4101ed not found: ID does not exist" Dec 03 14:17:25 crc kubenswrapper[4751]: I1203 14:17:25.872274 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8597eed5cf9db297307f6583ca709de98017458f7b5dc1b1ef9b18867ac63400"} Dec 03 14:17:25 crc kubenswrapper[4751]: I1203 14:17:25.872629 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2e11f93138d2c0948c3648e562de7e30088cf504c37603a728accd20134d6f3c"} Dec 03 14:17:25 crc kubenswrapper[4751]: I1203 14:17:25.872649 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bfc48d5e0a13ced1e6a4939a1d2aa7b93a03f6cd3bbca0aa9bec088fe5e05c7c"} Dec 03 14:17:25 crc kubenswrapper[4751]: I1203 14:17:25.872662 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3097cea7f4a99344e5096c9864c067b0f14056fa9b9fde314e45eec8f33b5958"} Dec 03 14:17:25 crc kubenswrapper[4751]: I1203 14:17:25.876293 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 03 14:17:25 crc kubenswrapper[4751]: I1203 14:17:25.876380 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b2c72f448233567212d225809e4e369559a32cc82f42a0597f815c654c956fc6"} Dec 03 14:17:26 crc kubenswrapper[4751]: I1203 14:17:26.122080 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:17:26 crc kubenswrapper[4751]: I1203 14:17:26.122387 4751 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 03 14:17:26 crc kubenswrapper[4751]: I1203 14:17:26.122549 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 
192.168.126.11:10257: connect: connection refused" Dec 03 14:17:26 crc kubenswrapper[4751]: I1203 14:17:26.885375 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"eba7d181834fe2a9eb1917057eb19b55d316f5e5009f4847cb37e29b23c9f722"} Dec 03 14:17:26 crc kubenswrapper[4751]: I1203 14:17:26.886224 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:17:26 crc kubenswrapper[4751]: I1203 14:17:26.886489 4751 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="088623b3-b683-467a-88db-985af50fb182" Dec 03 14:17:26 crc kubenswrapper[4751]: I1203 14:17:26.886608 4751 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="088623b3-b683-467a-88db-985af50fb182" Dec 03 14:17:28 crc kubenswrapper[4751]: I1203 14:17:28.346178 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:17:28 crc kubenswrapper[4751]: I1203 14:17:28.346529 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:17:28 crc kubenswrapper[4751]: I1203 14:17:28.351652 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:17:29 crc kubenswrapper[4751]: I1203 14:17:29.646294 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:17:31 crc kubenswrapper[4751]: I1203 14:17:31.895601 4751 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:17:31 crc kubenswrapper[4751]: I1203 14:17:31.913097 4751 kubelet.go:1909] "Trying to 
delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="088623b3-b683-467a-88db-985af50fb182" Dec 03 14:17:31 crc kubenswrapper[4751]: I1203 14:17:31.913136 4751 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="088623b3-b683-467a-88db-985af50fb182" Dec 03 14:17:31 crc kubenswrapper[4751]: I1203 14:17:31.916540 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:17:32 crc kubenswrapper[4751]: I1203 14:17:32.917492 4751 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="088623b3-b683-467a-88db-985af50fb182" Dec 03 14:17:32 crc kubenswrapper[4751]: I1203 14:17:32.917521 4751 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="088623b3-b683-467a-88db-985af50fb182" Dec 03 14:17:33 crc kubenswrapper[4751]: I1203 14:17:33.338567 4751 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="c8752f35-e726-4cd9-8e01-154cb49cc9fc" Dec 03 14:17:36 crc kubenswrapper[4751]: I1203 14:17:36.122116 4751 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 03 14:17:36 crc kubenswrapper[4751]: I1203 14:17:36.122479 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 03 
14:17:41 crc kubenswrapper[4751]: I1203 14:17:41.620319 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 03 14:17:41 crc kubenswrapper[4751]: I1203 14:17:41.989176 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 14:17:42 crc kubenswrapper[4751]: I1203 14:17:42.241479 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 03 14:17:42 crc kubenswrapper[4751]: I1203 14:17:42.345860 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 03 14:17:42 crc kubenswrapper[4751]: I1203 14:17:42.737523 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 03 14:17:42 crc kubenswrapper[4751]: I1203 14:17:42.737522 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 03 14:17:42 crc kubenswrapper[4751]: I1203 14:17:42.841752 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 03 14:17:42 crc kubenswrapper[4751]: I1203 14:17:42.868041 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 14:17:42 crc kubenswrapper[4751]: I1203 14:17:42.973873 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 03 14:17:43 crc kubenswrapper[4751]: I1203 14:17:43.307449 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 03 14:17:43 crc kubenswrapper[4751]: I1203 14:17:43.320846 4751 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 03 14:17:43 crc kubenswrapper[4751]: I1203 14:17:43.337096 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 03 14:17:43 crc kubenswrapper[4751]: I1203 14:17:43.341166 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 03 14:17:43 crc kubenswrapper[4751]: I1203 14:17:43.420460 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 03 14:17:43 crc kubenswrapper[4751]: I1203 14:17:43.791575 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 03 14:17:43 crc kubenswrapper[4751]: I1203 14:17:43.825738 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 03 14:17:43 crc kubenswrapper[4751]: I1203 14:17:43.930146 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 03 14:17:43 crc kubenswrapper[4751]: I1203 14:17:43.940353 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 03 14:17:43 crc kubenswrapper[4751]: I1203 14:17:43.983522 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 03 14:17:44 crc kubenswrapper[4751]: I1203 14:17:44.132073 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 03 14:17:44 crc kubenswrapper[4751]: I1203 14:17:44.387216 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 03 14:17:44 crc 
kubenswrapper[4751]: I1203 14:17:44.574221 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 03 14:17:44 crc kubenswrapper[4751]: I1203 14:17:44.575754 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 03 14:17:44 crc kubenswrapper[4751]: I1203 14:17:44.739259 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 03 14:17:44 crc kubenswrapper[4751]: I1203 14:17:44.817785 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 03 14:17:44 crc kubenswrapper[4751]: I1203 14:17:44.874242 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 03 14:17:44 crc kubenswrapper[4751]: I1203 14:17:44.963733 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 03 14:17:44 crc kubenswrapper[4751]: I1203 14:17:44.970697 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 03 14:17:45 crc kubenswrapper[4751]: I1203 14:17:45.040672 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 03 14:17:45 crc kubenswrapper[4751]: I1203 14:17:45.044293 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 03 14:17:45 crc kubenswrapper[4751]: I1203 14:17:45.301555 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 03 14:17:45 crc kubenswrapper[4751]: I1203 14:17:45.304535 4751 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 14:17:45 crc kubenswrapper[4751]: I1203 14:17:45.338316 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 03 14:17:45 crc kubenswrapper[4751]: I1203 14:17:45.350870 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 03 14:17:45 crc kubenswrapper[4751]: I1203 14:17:45.354773 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 14:17:45 crc kubenswrapper[4751]: I1203 14:17:45.384935 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 14:17:45 crc kubenswrapper[4751]: I1203 14:17:45.468951 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 03 14:17:45 crc kubenswrapper[4751]: I1203 14:17:45.510405 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 03 14:17:45 crc kubenswrapper[4751]: I1203 14:17:45.570574 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 03 14:17:45 crc kubenswrapper[4751]: I1203 14:17:45.571933 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 03 14:17:45 crc kubenswrapper[4751]: I1203 14:17:45.641271 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 03 14:17:45 crc kubenswrapper[4751]: I1203 14:17:45.641659 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 03 14:17:45 crc kubenswrapper[4751]: I1203 14:17:45.743703 4751 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 03 14:17:45 crc kubenswrapper[4751]: I1203 14:17:45.745780 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 03 14:17:45 crc kubenswrapper[4751]: I1203 14:17:45.868984 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 14:17:45 crc kubenswrapper[4751]: I1203 14:17:45.871156 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 03 14:17:45 crc kubenswrapper[4751]: I1203 14:17:45.967503 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 03 14:17:45 crc kubenswrapper[4751]: I1203 14:17:45.977951 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 03 14:17:46 crc kubenswrapper[4751]: I1203 14:17:46.017536 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 03 14:17:46 crc kubenswrapper[4751]: I1203 14:17:46.018522 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 03 14:17:46 crc kubenswrapper[4751]: I1203 14:17:46.121921 4751 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 03 14:17:46 crc kubenswrapper[4751]: I1203 14:17:46.121969 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get 
\"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 03 14:17:46 crc kubenswrapper[4751]: I1203 14:17:46.122014 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:17:46 crc kubenswrapper[4751]: I1203 14:17:46.122515 4751 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"b2c72f448233567212d225809e4e369559a32cc82f42a0597f815c654c956fc6"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Dec 03 14:17:46 crc kubenswrapper[4751]: I1203 14:17:46.122633 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://b2c72f448233567212d225809e4e369559a32cc82f42a0597f815c654c956fc6" gracePeriod=30 Dec 03 14:17:46 crc kubenswrapper[4751]: I1203 14:17:46.210526 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 03 14:17:46 crc kubenswrapper[4751]: I1203 14:17:46.328536 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 03 14:17:46 crc kubenswrapper[4751]: I1203 14:17:46.461819 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 03 14:17:46 crc kubenswrapper[4751]: I1203 14:17:46.556498 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 03 14:17:46 crc kubenswrapper[4751]: 
I1203 14:17:46.569627 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 03 14:17:46 crc kubenswrapper[4751]: I1203 14:17:46.589009 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 03 14:17:46 crc kubenswrapper[4751]: I1203 14:17:46.616939 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 03 14:17:46 crc kubenswrapper[4751]: I1203 14:17:46.668631 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 03 14:17:46 crc kubenswrapper[4751]: I1203 14:17:46.725086 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 03 14:17:46 crc kubenswrapper[4751]: I1203 14:17:46.788233 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 03 14:17:46 crc kubenswrapper[4751]: I1203 14:17:46.800413 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 03 14:17:46 crc kubenswrapper[4751]: I1203 14:17:46.863549 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 03 14:17:46 crc kubenswrapper[4751]: I1203 14:17:46.901178 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 03 14:17:46 crc kubenswrapper[4751]: I1203 14:17:46.938371 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 03 14:17:46 crc kubenswrapper[4751]: I1203 14:17:46.982735 4751 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 03 14:17:46 crc kubenswrapper[4751]: I1203 14:17:46.985433 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 03 14:17:47 crc kubenswrapper[4751]: I1203 14:17:47.019834 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 14:17:47 crc kubenswrapper[4751]: I1203 14:17:47.039130 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 03 14:17:47 crc kubenswrapper[4751]: I1203 14:17:47.082877 4751 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 03 14:17:47 crc kubenswrapper[4751]: I1203 14:17:47.147382 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 03 14:17:47 crc kubenswrapper[4751]: I1203 14:17:47.182209 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 03 14:17:47 crc kubenswrapper[4751]: I1203 14:17:47.182760 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 03 14:17:47 crc kubenswrapper[4751]: I1203 14:17:47.217312 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 03 14:17:47 crc kubenswrapper[4751]: I1203 14:17:47.425948 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 03 14:17:47 crc kubenswrapper[4751]: I1203 14:17:47.435844 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 03 14:17:47 crc 
kubenswrapper[4751]: I1203 14:17:47.456669 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 03 14:17:47 crc kubenswrapper[4751]: I1203 14:17:47.702312 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 03 14:17:47 crc kubenswrapper[4751]: I1203 14:17:47.738630 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 03 14:17:47 crc kubenswrapper[4751]: I1203 14:17:47.851856 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 03 14:17:47 crc kubenswrapper[4751]: I1203 14:17:47.884671 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 03 14:17:47 crc kubenswrapper[4751]: I1203 14:17:47.906696 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 03 14:17:48 crc kubenswrapper[4751]: I1203 14:17:48.071572 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 03 14:17:48 crc kubenswrapper[4751]: I1203 14:17:48.170851 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 03 14:17:48 crc kubenswrapper[4751]: I1203 14:17:48.185848 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 03 14:17:48 crc kubenswrapper[4751]: I1203 14:17:48.328679 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 03 14:17:48 crc kubenswrapper[4751]: I1203 14:17:48.355594 4751 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 03 14:17:48 crc kubenswrapper[4751]: I1203 14:17:48.451987 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 03 14:17:48 crc kubenswrapper[4751]: I1203 14:17:48.701887 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 03 14:17:48 crc kubenswrapper[4751]: I1203 14:17:48.836491 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 03 14:17:48 crc kubenswrapper[4751]: I1203 14:17:48.850434 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 03 14:17:48 crc kubenswrapper[4751]: I1203 14:17:48.882031 4751 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 03 14:17:48 crc kubenswrapper[4751]: I1203 14:17:48.938058 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 03 14:17:48 crc kubenswrapper[4751]: I1203 14:17:48.963007 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 03 14:17:48 crc kubenswrapper[4751]: I1203 14:17:48.997201 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 03 14:17:49 crc kubenswrapper[4751]: I1203 14:17:49.050151 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 03 14:17:49 crc kubenswrapper[4751]: I1203 14:17:49.136808 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 03 14:17:49 crc kubenswrapper[4751]: I1203 14:17:49.194159 4751 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 03 14:17:49 crc kubenswrapper[4751]: I1203 14:17:49.197414 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 03 14:17:49 crc kubenswrapper[4751]: I1203 14:17:49.197593 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 14:17:49 crc kubenswrapper[4751]: I1203 14:17:49.288529 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 03 14:17:49 crc kubenswrapper[4751]: I1203 14:17:49.300760 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 03 14:17:49 crc kubenswrapper[4751]: I1203 14:17:49.354750 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 03 14:17:49 crc kubenswrapper[4751]: I1203 14:17:49.384529 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 03 14:17:49 crc kubenswrapper[4751]: I1203 14:17:49.440780 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 03 14:17:49 crc kubenswrapper[4751]: I1203 14:17:49.457164 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 03 14:17:49 crc kubenswrapper[4751]: I1203 14:17:49.594542 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 03 14:17:49 crc kubenswrapper[4751]: I1203 14:17:49.685502 4751 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 03 14:17:49 crc kubenswrapper[4751]: I1203 14:17:49.705852 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 03 14:17:49 crc kubenswrapper[4751]: I1203 14:17:49.744184 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 03 14:17:49 crc kubenswrapper[4751]: I1203 14:17:49.837391 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 03 14:17:49 crc kubenswrapper[4751]: I1203 14:17:49.838629 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 03 14:17:49 crc kubenswrapper[4751]: I1203 14:17:49.866584 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 03 14:17:49 crc kubenswrapper[4751]: I1203 14:17:49.901020 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 03 14:17:50 crc kubenswrapper[4751]: I1203 14:17:50.255286 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 03 14:17:50 crc kubenswrapper[4751]: I1203 14:17:50.270890 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 03 14:17:50 crc kubenswrapper[4751]: I1203 14:17:50.300851 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 03 14:17:50 crc kubenswrapper[4751]: I1203 14:17:50.446777 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 03 14:17:50 crc kubenswrapper[4751]: I1203 
14:17:50.454945 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 03 14:17:50 crc kubenswrapper[4751]: I1203 14:17:50.465795 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 03 14:17:50 crc kubenswrapper[4751]: I1203 14:17:50.476740 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 03 14:17:50 crc kubenswrapper[4751]: I1203 14:17:50.527091 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 03 14:17:50 crc kubenswrapper[4751]: I1203 14:17:50.636490 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 03 14:17:50 crc kubenswrapper[4751]: I1203 14:17:50.655151 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 03 14:17:50 crc kubenswrapper[4751]: I1203 14:17:50.694545 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 03 14:17:50 crc kubenswrapper[4751]: I1203 14:17:50.720713 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 03 14:17:50 crc kubenswrapper[4751]: I1203 14:17:50.827691 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 03 14:17:50 crc kubenswrapper[4751]: I1203 14:17:50.884635 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 03 14:17:50 crc kubenswrapper[4751]: I1203 14:17:50.903919 4751 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 03 14:17:50 crc kubenswrapper[4751]: I1203 14:17:50.929137 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 03 14:17:50 crc kubenswrapper[4751]: I1203 14:17:50.976058 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 03 14:17:51 crc kubenswrapper[4751]: I1203 14:17:51.007235 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 03 14:17:51 crc kubenswrapper[4751]: I1203 14:17:51.046743 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 03 14:17:51 crc kubenswrapper[4751]: I1203 14:17:51.060818 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 03 14:17:51 crc kubenswrapper[4751]: I1203 14:17:51.085116 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 03 14:17:51 crc kubenswrapper[4751]: I1203 14:17:51.139626 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 14:17:51 crc kubenswrapper[4751]: I1203 14:17:51.140052 4751 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 03 14:17:51 crc kubenswrapper[4751]: I1203 14:17:51.165697 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 03 14:17:51 crc kubenswrapper[4751]: I1203 14:17:51.460981 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 03 14:17:51 crc kubenswrapper[4751]: I1203 14:17:51.470618 4751 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 03 14:17:51 crc kubenswrapper[4751]: I1203 14:17:51.769824 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 03 14:17:51 crc kubenswrapper[4751]: I1203 14:17:51.772221 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 03 14:17:51 crc kubenswrapper[4751]: I1203 14:17:51.847064 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 03 14:17:51 crc kubenswrapper[4751]: I1203 14:17:51.863600 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 03 14:17:51 crc kubenswrapper[4751]: I1203 14:17:51.886615 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 03 14:17:51 crc kubenswrapper[4751]: I1203 14:17:51.980997 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 03 14:17:52 crc kubenswrapper[4751]: I1203 14:17:52.014242 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 03 14:17:52 crc kubenswrapper[4751]: I1203 14:17:52.050909 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 03 14:17:52 crc kubenswrapper[4751]: I1203 14:17:52.066483 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 03 14:17:52 crc kubenswrapper[4751]: I1203 14:17:52.181730 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 14:17:52 crc kubenswrapper[4751]: I1203 
14:17:52.241111 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 03 14:17:52 crc kubenswrapper[4751]: I1203 14:17:52.286068 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 03 14:17:52 crc kubenswrapper[4751]: I1203 14:17:52.299216 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 03 14:17:52 crc kubenswrapper[4751]: I1203 14:17:52.430692 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 03 14:17:52 crc kubenswrapper[4751]: I1203 14:17:52.447209 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 03 14:17:52 crc kubenswrapper[4751]: I1203 14:17:52.460850 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 03 14:17:52 crc kubenswrapper[4751]: I1203 14:17:52.477175 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 03 14:17:52 crc kubenswrapper[4751]: I1203 14:17:52.486744 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 03 14:17:52 crc kubenswrapper[4751]: I1203 14:17:52.584424 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 03 14:17:52 crc kubenswrapper[4751]: I1203 14:17:52.585709 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 03 14:17:52 crc kubenswrapper[4751]: I1203 14:17:52.602183 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" 
Dec 03 14:17:52 crc kubenswrapper[4751]: I1203 14:17:52.628792 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 03 14:17:52 crc kubenswrapper[4751]: I1203 14:17:52.706219 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 03 14:17:52 crc kubenswrapper[4751]: I1203 14:17:52.729240 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 03 14:17:52 crc kubenswrapper[4751]: I1203 14:17:52.784629 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 03 14:17:52 crc kubenswrapper[4751]: I1203 14:17:52.884434 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 03 14:17:52 crc kubenswrapper[4751]: I1203 14:17:52.937363 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 03 14:17:52 crc kubenswrapper[4751]: I1203 14:17:52.939322 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 03 14:17:53 crc kubenswrapper[4751]: I1203 14:17:53.095653 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 03 14:17:53 crc kubenswrapper[4751]: I1203 14:17:53.137989 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 03 14:17:53 crc kubenswrapper[4751]: I1203 14:17:53.156742 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 03 14:17:53 crc kubenswrapper[4751]: I1203 14:17:53.348046 4751 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 03 14:17:53 crc kubenswrapper[4751]: I1203 14:17:53.394348 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 03 14:17:53 crc kubenswrapper[4751]: I1203 14:17:53.442713 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 03 14:17:53 crc kubenswrapper[4751]: I1203 14:17:53.549917 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 03 14:17:53 crc kubenswrapper[4751]: I1203 14:17:53.632182 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 03 14:17:53 crc kubenswrapper[4751]: I1203 14:17:53.666451 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 03 14:17:53 crc kubenswrapper[4751]: I1203 14:17:53.793043 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 03 14:17:53 crc kubenswrapper[4751]: I1203 14:17:53.954724 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 03 14:17:53 crc kubenswrapper[4751]: I1203 14:17:53.967166 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 03 14:17:54 crc kubenswrapper[4751]: I1203 14:17:54.003720 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 03 14:17:54 crc kubenswrapper[4751]: I1203 14:17:54.043782 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 03 14:17:54 crc kubenswrapper[4751]: I1203 14:17:54.059936 4751 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"etcd-client" Dec 03 14:17:54 crc kubenswrapper[4751]: I1203 14:17:54.083236 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 03 14:17:54 crc kubenswrapper[4751]: I1203 14:17:54.083722 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 03 14:17:54 crc kubenswrapper[4751]: I1203 14:17:54.478924 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 14:17:54 crc kubenswrapper[4751]: I1203 14:17:54.479919 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 03 14:17:54 crc kubenswrapper[4751]: I1203 14:17:54.490927 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 03 14:17:54 crc kubenswrapper[4751]: I1203 14:17:54.578675 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 03 14:17:54 crc kubenswrapper[4751]: I1203 14:17:54.671989 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 03 14:17:54 crc kubenswrapper[4751]: I1203 14:17:54.750645 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 03 14:17:54 crc kubenswrapper[4751]: I1203 14:17:54.795998 4751 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 03 14:17:54 crc kubenswrapper[4751]: I1203 14:17:54.803740 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 03 14:17:54 crc 
kubenswrapper[4751]: I1203 14:17:54.864022 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 14:17:54 crc kubenswrapper[4751]: I1203 14:17:54.880904 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 03 14:17:54 crc kubenswrapper[4751]: I1203 14:17:54.893178 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 03 14:17:54 crc kubenswrapper[4751]: I1203 14:17:54.981361 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 03 14:17:55 crc kubenswrapper[4751]: I1203 14:17:55.088046 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 03 14:17:55 crc kubenswrapper[4751]: I1203 14:17:55.141979 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 03 14:17:55 crc kubenswrapper[4751]: I1203 14:17:55.277645 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 03 14:17:55 crc kubenswrapper[4751]: I1203 14:17:55.424780 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 03 14:17:55 crc kubenswrapper[4751]: I1203 14:17:55.493825 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 03 14:17:55 crc kubenswrapper[4751]: I1203 14:17:55.831478 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 03 14:17:55 crc kubenswrapper[4751]: I1203 14:17:55.860159 4751 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 03 14:17:55 crc kubenswrapper[4751]: I1203 14:17:55.875315 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 03 14:17:55 crc kubenswrapper[4751]: I1203 14:17:55.891296 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 03 14:17:55 crc kubenswrapper[4751]: I1203 14:17:55.894784 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 14:17:55 crc kubenswrapper[4751]: I1203 14:17:55.899349 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 03 14:17:55 crc kubenswrapper[4751]: I1203 14:17:55.956614 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.161366 4751 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.168058 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-zcq45"] Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.168146 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-cb86fb758-675vf"] Dec 03 14:17:56 crc kubenswrapper[4751]: E1203 14:17:56.168439 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd4311c3-0b8c-4ad2-8b36-1bc543c188d3" containerName="oauth-openshift" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.168467 4751 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="dd4311c3-0b8c-4ad2-8b36-1bc543c188d3" containerName="oauth-openshift" Dec 03 14:17:56 crc kubenswrapper[4751]: E1203 14:17:56.168480 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54a05958-b823-4796-992b-caed2f6e8f2e" containerName="installer" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.168491 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="54a05958-b823-4796-992b-caed2f6e8f2e" containerName="installer" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.168865 4751 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="088623b3-b683-467a-88db-985af50fb182" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.168927 4751 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="088623b3-b683-467a-88db-985af50fb182" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.168877 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="54a05958-b823-4796-992b-caed2f6e8f2e" containerName="installer" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.169027 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd4311c3-0b8c-4ad2-8b36-1bc543c188d3" containerName="oauth-openshift" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.169546 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.173739 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.175617 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.175852 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.180116 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.180190 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.180209 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.180484 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.180479 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.180572 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.180763 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 03 14:17:56 crc 
kubenswrapper[4751]: I1203 14:17:56.180850 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.180994 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.181013 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.186760 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.191449 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.192827 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.200643 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.240826 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=25.240804529 podStartE2EDuration="25.240804529s" podCreationTimestamp="2025-12-03 14:17:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:17:56.221659984 +0000 UTC m=+283.210015211" watchObservedRunningTime="2025-12-03 14:17:56.240804529 +0000 UTC m=+283.229159746" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.306517 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-v4-0-config-system-session\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.306881 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-v4-0-config-user-template-error\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.307017 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.307182 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-audit-dir\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.307304 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-v4-0-config-system-service-ca\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.307443 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-v4-0-config-system-router-certs\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.307523 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.307601 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2gp9\" (UniqueName: \"kubernetes.io/projected/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-kube-api-access-d2gp9\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.307696 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-v4-0-config-user-template-login\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: 
\"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.307781 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.307933 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.308047 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.308135 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-audit-policies\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.308224 4751 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.325647 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.347558 4751 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.409837 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.409889 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-audit-policies\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.409912 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: 
\"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.409937 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-v4-0-config-system-session\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.409967 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-v4-0-config-user-template-error\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.409984 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.410005 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-audit-dir\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.410025 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-v4-0-config-system-service-ca\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.410046 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-v4-0-config-system-router-certs\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.410063 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.410081 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2gp9\" (UniqueName: \"kubernetes.io/projected/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-kube-api-access-d2gp9\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.410099 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-v4-0-config-user-template-login\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " 
pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.410131 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.410155 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.410198 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-audit-dir\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.410843 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-audit-policies\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.410939 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-v4-0-config-system-service-ca\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.411298 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.411475 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.415225 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.416025 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.416410 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.416851 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.417614 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-v4-0-config-system-session\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.417635 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.417600 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-v4-0-config-system-router-certs\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " 
pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.417728 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-v4-0-config-user-template-error\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.419208 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-v4-0-config-user-template-login\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.427469 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2gp9\" (UniqueName: \"kubernetes.io/projected/11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1-kube-api-access-d2gp9\") pod \"oauth-openshift-cb86fb758-675vf\" (UID: \"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1\") " pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.430781 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.472828 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.499270 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.522974 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.637355 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.673321 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.712498 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-cb86fb758-675vf"] Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.736140 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.750032 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.842063 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.847612 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.850832 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.919362 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 03 14:17:56 crc kubenswrapper[4751]: I1203 14:17:56.966028 4751 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 14:17:57 crc kubenswrapper[4751]: I1203 14:17:57.062021 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" event={"ID":"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1","Type":"ContainerStarted","Data":"b63ee7304b4c95407a76f32f78a05d085a83c5f0c4c26ccc29361033e3fe6a48"} Dec 03 14:17:57 crc kubenswrapper[4751]: I1203 14:17:57.062064 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" event={"ID":"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1","Type":"ContainerStarted","Data":"44cf53d2721bd7ad0241f1cbcc4ec3c0d199d1ab611ab3bc4ed862dfd8509033"} Dec 03 14:17:57 crc kubenswrapper[4751]: I1203 14:17:57.084756 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" podStartSLOduration=59.084740518 podStartE2EDuration="59.084740518s" podCreationTimestamp="2025-12-03 14:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:17:57.084152551 +0000 UTC m=+284.072507768" watchObservedRunningTime="2025-12-03 14:17:57.084740518 +0000 UTC m=+284.073095725" Dec 03 14:17:57 crc kubenswrapper[4751]: I1203 14:17:57.089943 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 03 14:17:57 crc kubenswrapper[4751]: I1203 14:17:57.266624 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 03 14:17:57 crc kubenswrapper[4751]: I1203 14:17:57.328258 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd4311c3-0b8c-4ad2-8b36-1bc543c188d3" 
path="/var/lib/kubelet/pods/dd4311c3-0b8c-4ad2-8b36-1bc543c188d3/volumes" Dec 03 14:17:57 crc kubenswrapper[4751]: I1203 14:17:57.448961 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 03 14:17:57 crc kubenswrapper[4751]: I1203 14:17:57.476886 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 03 14:17:57 crc kubenswrapper[4751]: I1203 14:17:57.722402 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 03 14:17:58 crc kubenswrapper[4751]: I1203 14:17:58.068626 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-cb86fb758-675vf_11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1/oauth-openshift/0.log" Dec 03 14:17:58 crc kubenswrapper[4751]: I1203 14:17:58.068671 4751 generic.go:334] "Generic (PLEG): container finished" podID="11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1" containerID="b63ee7304b4c95407a76f32f78a05d085a83c5f0c4c26ccc29361033e3fe6a48" exitCode=255 Dec 03 14:17:58 crc kubenswrapper[4751]: I1203 14:17:58.068695 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" event={"ID":"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1","Type":"ContainerDied","Data":"b63ee7304b4c95407a76f32f78a05d085a83c5f0c4c26ccc29361033e3fe6a48"} Dec 03 14:17:58 crc kubenswrapper[4751]: I1203 14:17:58.069101 4751 scope.go:117] "RemoveContainer" containerID="b63ee7304b4c95407a76f32f78a05d085a83c5f0c4c26ccc29361033e3fe6a48" Dec 03 14:17:58 crc kubenswrapper[4751]: I1203 14:17:58.151852 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 03 14:17:58 crc kubenswrapper[4751]: I1203 14:17:58.174947 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 03 14:17:58 crc 
kubenswrapper[4751]: I1203 14:17:58.457280 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 03 14:17:58 crc kubenswrapper[4751]: I1203 14:17:58.685639 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 03 14:17:58 crc kubenswrapper[4751]: I1203 14:17:58.829098 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 14:17:59 crc kubenswrapper[4751]: I1203 14:17:59.075640 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-cb86fb758-675vf_11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1/oauth-openshift/0.log" Dec 03 14:17:59 crc kubenswrapper[4751]: I1203 14:17:59.075696 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" event={"ID":"11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1","Type":"ContainerStarted","Data":"d2f54e793653748c604858ae4627e51d9fd7e9193198b771b32e76c1e65d3505"} Dec 03 14:17:59 crc kubenswrapper[4751]: I1203 14:17:59.076008 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:17:59 crc kubenswrapper[4751]: I1203 14:17:59.102966 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" Dec 03 14:18:00 crc kubenswrapper[4751]: I1203 14:18:00.067791 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 03 14:18:05 crc kubenswrapper[4751]: I1203 14:18:05.820947 4751 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 14:18:05 crc kubenswrapper[4751]: I1203 14:18:05.821587 4751 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://77d124d518cecdfb40a83bbec6688e64a24862cac5c97e2daec9417367e88820" gracePeriod=5 Dec 03 14:18:11 crc kubenswrapper[4751]: I1203 14:18:11.159075 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 03 14:18:11 crc kubenswrapper[4751]: I1203 14:18:11.159868 4751 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="77d124d518cecdfb40a83bbec6688e64a24862cac5c97e2daec9417367e88820" exitCode=137 Dec 03 14:18:11 crc kubenswrapper[4751]: I1203 14:18:11.404443 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 03 14:18:11 crc kubenswrapper[4751]: I1203 14:18:11.404535 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 14:18:11 crc kubenswrapper[4751]: I1203 14:18:11.507462 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 14:18:11 crc kubenswrapper[4751]: I1203 14:18:11.507542 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 14:18:11 crc kubenswrapper[4751]: I1203 14:18:11.507588 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 14:18:11 crc kubenswrapper[4751]: I1203 14:18:11.507642 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 14:18:11 crc kubenswrapper[4751]: I1203 14:18:11.507659 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:18:11 crc kubenswrapper[4751]: I1203 14:18:11.507709 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 14:18:11 crc kubenswrapper[4751]: I1203 14:18:11.507731 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:18:11 crc kubenswrapper[4751]: I1203 14:18:11.507762 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:18:11 crc kubenswrapper[4751]: I1203 14:18:11.507864 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:18:11 crc kubenswrapper[4751]: I1203 14:18:11.508147 4751 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 03 14:18:11 crc kubenswrapper[4751]: I1203 14:18:11.508176 4751 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 03 14:18:11 crc kubenswrapper[4751]: I1203 14:18:11.508193 4751 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 03 14:18:11 crc kubenswrapper[4751]: I1203 14:18:11.508209 4751 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 03 14:18:11 crc kubenswrapper[4751]: I1203 14:18:11.533533 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:18:11 crc kubenswrapper[4751]: I1203 14:18:11.609339 4751 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 03 14:18:12 crc kubenswrapper[4751]: I1203 14:18:12.166671 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 03 14:18:12 crc kubenswrapper[4751]: I1203 14:18:12.167004 4751 scope.go:117] "RemoveContainer" containerID="77d124d518cecdfb40a83bbec6688e64a24862cac5c97e2daec9417367e88820" Dec 03 14:18:12 crc kubenswrapper[4751]: I1203 14:18:12.167043 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 14:18:13 crc kubenswrapper[4751]: I1203 14:18:13.319648 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 03 14:18:15 crc kubenswrapper[4751]: I1203 14:18:15.905281 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 03 14:18:17 crc kubenswrapper[4751]: I1203 14:18:17.195102 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 03 14:18:17 crc kubenswrapper[4751]: I1203 14:18:17.197278 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 03 14:18:17 crc kubenswrapper[4751]: I1203 14:18:17.197350 4751 generic.go:334] "Generic (PLEG): 
container finished" podID="f614b9022728cf315e60c057852e563e" containerID="b2c72f448233567212d225809e4e369559a32cc82f42a0597f815c654c956fc6" exitCode=137 Dec 03 14:18:17 crc kubenswrapper[4751]: I1203 14:18:17.197385 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"b2c72f448233567212d225809e4e369559a32cc82f42a0597f815c654c956fc6"} Dec 03 14:18:17 crc kubenswrapper[4751]: I1203 14:18:17.197415 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"77d49b793a0bbcc88ec0d68b879c40f84c85f3443b31de826bffb957126e3419"} Dec 03 14:18:17 crc kubenswrapper[4751]: I1203 14:18:17.197437 4751 scope.go:117] "RemoveContainer" containerID="68bddf32d417f3fc396f40680f25a670ba34919626b25e7ecba02abfa449e29a" Dec 03 14:18:18 crc kubenswrapper[4751]: I1203 14:18:18.203854 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 03 14:18:18 crc kubenswrapper[4751]: I1203 14:18:18.207589 4751 generic.go:334] "Generic (PLEG): container finished" podID="1e78e3f2-fa8c-4c8d-8dce-f087e67acf02" containerID="1a06f5749c9cd888a697aabdd9031e2bdd7b07c914015e34c8f3ca2faef21b16" exitCode=0 Dec 03 14:18:18 crc kubenswrapper[4751]: I1203 14:18:18.207647 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4c725" event={"ID":"1e78e3f2-fa8c-4c8d-8dce-f087e67acf02","Type":"ContainerDied","Data":"1a06f5749c9cd888a697aabdd9031e2bdd7b07c914015e34c8f3ca2faef21b16"} Dec 03 14:18:18 crc kubenswrapper[4751]: I1203 14:18:18.208147 4751 scope.go:117] "RemoveContainer" 
containerID="1a06f5749c9cd888a697aabdd9031e2bdd7b07c914015e34c8f3ca2faef21b16" Dec 03 14:18:19 crc kubenswrapper[4751]: I1203 14:18:19.214224 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4c725" event={"ID":"1e78e3f2-fa8c-4c8d-8dce-f087e67acf02","Type":"ContainerStarted","Data":"ff72ca347a6bdd6dca4c8bf3015a982ee374ece74b229bf189f2317c39c2a6a4"} Dec 03 14:18:19 crc kubenswrapper[4751]: I1203 14:18:19.214843 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4c725" Dec 03 14:18:19 crc kubenswrapper[4751]: I1203 14:18:19.217188 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-4c725" Dec 03 14:18:19 crc kubenswrapper[4751]: I1203 14:18:19.646256 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:18:26 crc kubenswrapper[4751]: I1203 14:18:26.122512 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:18:26 crc kubenswrapper[4751]: I1203 14:18:26.126314 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:18:26 crc kubenswrapper[4751]: I1203 14:18:26.250044 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:18:58 crc kubenswrapper[4751]: I1203 14:18:58.402671 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pqqnv"] Dec 03 14:18:58 crc kubenswrapper[4751]: I1203 14:18:58.403448 4751 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pqqnv" podUID="3e9f6fd3-3c79-48a8-a551-128f73f63dd7" containerName="route-controller-manager" containerID="cri-o://e0d52ed0c81222d9781a490c31e31b1cb55dd692764a3706c2eb8d1ad0fe1272" gracePeriod=30 Dec 03 14:18:58 crc kubenswrapper[4751]: I1203 14:18:58.408304 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fwp52"] Dec 03 14:18:58 crc kubenswrapper[4751]: I1203 14:18:58.408540 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-fwp52" podUID="eb5bca62-c4b9-4e79-a752-96185f22b757" containerName="controller-manager" containerID="cri-o://544b8f3f81d4877cabd5ff3867da444a505bb6b92b6c579e573e978edcc8ab24" gracePeriod=30 Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.279927 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fwp52" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.283058 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pqqnv" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.399705 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e9f6fd3-3c79-48a8-a551-128f73f63dd7-config\") pod \"3e9f6fd3-3c79-48a8-a551-128f73f63dd7\" (UID: \"3e9f6fd3-3c79-48a8-a551-128f73f63dd7\") " Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.399755 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hcbf\" (UniqueName: \"kubernetes.io/projected/eb5bca62-c4b9-4e79-a752-96185f22b757-kube-api-access-9hcbf\") pod \"eb5bca62-c4b9-4e79-a752-96185f22b757\" (UID: \"eb5bca62-c4b9-4e79-a752-96185f22b757\") " Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.399797 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb5bca62-c4b9-4e79-a752-96185f22b757-config\") pod \"eb5bca62-c4b9-4e79-a752-96185f22b757\" (UID: \"eb5bca62-c4b9-4e79-a752-96185f22b757\") " Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.399839 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb5bca62-c4b9-4e79-a752-96185f22b757-serving-cert\") pod \"eb5bca62-c4b9-4e79-a752-96185f22b757\" (UID: \"eb5bca62-c4b9-4e79-a752-96185f22b757\") " Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.399858 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5lsn\" (UniqueName: \"kubernetes.io/projected/3e9f6fd3-3c79-48a8-a551-128f73f63dd7-kube-api-access-c5lsn\") pod \"3e9f6fd3-3c79-48a8-a551-128f73f63dd7\" (UID: \"3e9f6fd3-3c79-48a8-a551-128f73f63dd7\") " Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.399884 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e9f6fd3-3c79-48a8-a551-128f73f63dd7-serving-cert\") pod \"3e9f6fd3-3c79-48a8-a551-128f73f63dd7\" (UID: \"3e9f6fd3-3c79-48a8-a551-128f73f63dd7\") " Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.399925 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb5bca62-c4b9-4e79-a752-96185f22b757-client-ca\") pod \"eb5bca62-c4b9-4e79-a752-96185f22b757\" (UID: \"eb5bca62-c4b9-4e79-a752-96185f22b757\") " Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.399944 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3e9f6fd3-3c79-48a8-a551-128f73f63dd7-client-ca\") pod \"3e9f6fd3-3c79-48a8-a551-128f73f63dd7\" (UID: \"3e9f6fd3-3c79-48a8-a551-128f73f63dd7\") " Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.399965 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb5bca62-c4b9-4e79-a752-96185f22b757-proxy-ca-bundles\") pod \"eb5bca62-c4b9-4e79-a752-96185f22b757\" (UID: \"eb5bca62-c4b9-4e79-a752-96185f22b757\") " Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.400515 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e9f6fd3-3c79-48a8-a551-128f73f63dd7-client-ca" (OuterVolumeSpecName: "client-ca") pod "3e9f6fd3-3c79-48a8-a551-128f73f63dd7" (UID: "3e9f6fd3-3c79-48a8-a551-128f73f63dd7"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.400620 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb5bca62-c4b9-4e79-a752-96185f22b757-config" (OuterVolumeSpecName: "config") pod "eb5bca62-c4b9-4e79-a752-96185f22b757" (UID: "eb5bca62-c4b9-4e79-a752-96185f22b757"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.400653 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb5bca62-c4b9-4e79-a752-96185f22b757-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "eb5bca62-c4b9-4e79-a752-96185f22b757" (UID: "eb5bca62-c4b9-4e79-a752-96185f22b757"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.400661 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb5bca62-c4b9-4e79-a752-96185f22b757-client-ca" (OuterVolumeSpecName: "client-ca") pod "eb5bca62-c4b9-4e79-a752-96185f22b757" (UID: "eb5bca62-c4b9-4e79-a752-96185f22b757"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.400973 4751 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb5bca62-c4b9-4e79-a752-96185f22b757-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.400990 4751 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3e9f6fd3-3c79-48a8-a551-128f73f63dd7-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.400998 4751 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb5bca62-c4b9-4e79-a752-96185f22b757-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.401006 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb5bca62-c4b9-4e79-a752-96185f22b757-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.401116 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e9f6fd3-3c79-48a8-a551-128f73f63dd7-config" (OuterVolumeSpecName: "config") pod "3e9f6fd3-3c79-48a8-a551-128f73f63dd7" (UID: "3e9f6fd3-3c79-48a8-a551-128f73f63dd7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.405301 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb5bca62-c4b9-4e79-a752-96185f22b757-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "eb5bca62-c4b9-4e79-a752-96185f22b757" (UID: "eb5bca62-c4b9-4e79-a752-96185f22b757"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.405362 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e9f6fd3-3c79-48a8-a551-128f73f63dd7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3e9f6fd3-3c79-48a8-a551-128f73f63dd7" (UID: "3e9f6fd3-3c79-48a8-a551-128f73f63dd7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.405546 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb5bca62-c4b9-4e79-a752-96185f22b757-kube-api-access-9hcbf" (OuterVolumeSpecName: "kube-api-access-9hcbf") pod "eb5bca62-c4b9-4e79-a752-96185f22b757" (UID: "eb5bca62-c4b9-4e79-a752-96185f22b757"). InnerVolumeSpecName "kube-api-access-9hcbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.405845 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e9f6fd3-3c79-48a8-a551-128f73f63dd7-kube-api-access-c5lsn" (OuterVolumeSpecName: "kube-api-access-c5lsn") pod "3e9f6fd3-3c79-48a8-a551-128f73f63dd7" (UID: "3e9f6fd3-3c79-48a8-a551-128f73f63dd7"). InnerVolumeSpecName "kube-api-access-c5lsn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.420679 4751 generic.go:334] "Generic (PLEG): container finished" podID="eb5bca62-c4b9-4e79-a752-96185f22b757" containerID="544b8f3f81d4877cabd5ff3867da444a505bb6b92b6c579e573e978edcc8ab24" exitCode=0 Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.420758 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fwp52" event={"ID":"eb5bca62-c4b9-4e79-a752-96185f22b757","Type":"ContainerDied","Data":"544b8f3f81d4877cabd5ff3867da444a505bb6b92b6c579e573e978edcc8ab24"} Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.420789 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fwp52" event={"ID":"eb5bca62-c4b9-4e79-a752-96185f22b757","Type":"ContainerDied","Data":"34f1745ad1eba8e9f3cc55ce490ef1dd5671919d0e20f0341be439d40672dd22"} Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.420782 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fwp52" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.420824 4751 scope.go:117] "RemoveContainer" containerID="544b8f3f81d4877cabd5ff3867da444a505bb6b92b6c579e573e978edcc8ab24" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.424083 4751 generic.go:334] "Generic (PLEG): container finished" podID="3e9f6fd3-3c79-48a8-a551-128f73f63dd7" containerID="e0d52ed0c81222d9781a490c31e31b1cb55dd692764a3706c2eb8d1ad0fe1272" exitCode=0 Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.424117 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pqqnv" event={"ID":"3e9f6fd3-3c79-48a8-a551-128f73f63dd7","Type":"ContainerDied","Data":"e0d52ed0c81222d9781a490c31e31b1cb55dd692764a3706c2eb8d1ad0fe1272"} Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.424143 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pqqnv" event={"ID":"3e9f6fd3-3c79-48a8-a551-128f73f63dd7","Type":"ContainerDied","Data":"6e8fb136d482a795e7e602bed6bc4a0dced61168b27a6f8fba3b021bae94b64d"} Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.424166 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pqqnv" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.440538 4751 scope.go:117] "RemoveContainer" containerID="544b8f3f81d4877cabd5ff3867da444a505bb6b92b6c579e573e978edcc8ab24" Dec 03 14:18:59 crc kubenswrapper[4751]: E1203 14:18:59.445614 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"544b8f3f81d4877cabd5ff3867da444a505bb6b92b6c579e573e978edcc8ab24\": container with ID starting with 544b8f3f81d4877cabd5ff3867da444a505bb6b92b6c579e573e978edcc8ab24 not found: ID does not exist" containerID="544b8f3f81d4877cabd5ff3867da444a505bb6b92b6c579e573e978edcc8ab24" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.445654 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"544b8f3f81d4877cabd5ff3867da444a505bb6b92b6c579e573e978edcc8ab24"} err="failed to get container status \"544b8f3f81d4877cabd5ff3867da444a505bb6b92b6c579e573e978edcc8ab24\": rpc error: code = NotFound desc = could not find container \"544b8f3f81d4877cabd5ff3867da444a505bb6b92b6c579e573e978edcc8ab24\": container with ID starting with 544b8f3f81d4877cabd5ff3867da444a505bb6b92b6c579e573e978edcc8ab24 not found: ID does not exist" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.445679 4751 scope.go:117] "RemoveContainer" containerID="e0d52ed0c81222d9781a490c31e31b1cb55dd692764a3706c2eb8d1ad0fe1272" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.457692 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fwp52"] Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.464861 4751 scope.go:117] "RemoveContainer" containerID="e0d52ed0c81222d9781a490c31e31b1cb55dd692764a3706c2eb8d1ad0fe1272" Dec 03 14:18:59 crc kubenswrapper[4751]: E1203 14:18:59.465269 4751 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0d52ed0c81222d9781a490c31e31b1cb55dd692764a3706c2eb8d1ad0fe1272\": container with ID starting with e0d52ed0c81222d9781a490c31e31b1cb55dd692764a3706c2eb8d1ad0fe1272 not found: ID does not exist" containerID="e0d52ed0c81222d9781a490c31e31b1cb55dd692764a3706c2eb8d1ad0fe1272" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.465306 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0d52ed0c81222d9781a490c31e31b1cb55dd692764a3706c2eb8d1ad0fe1272"} err="failed to get container status \"e0d52ed0c81222d9781a490c31e31b1cb55dd692764a3706c2eb8d1ad0fe1272\": rpc error: code = NotFound desc = could not find container \"e0d52ed0c81222d9781a490c31e31b1cb55dd692764a3706c2eb8d1ad0fe1272\": container with ID starting with e0d52ed0c81222d9781a490c31e31b1cb55dd692764a3706c2eb8d1ad0fe1272 not found: ID does not exist" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.465374 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fwp52"] Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.468726 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pqqnv"] Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.471847 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pqqnv"] Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.501938 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e9f6fd3-3c79-48a8-a551-128f73f63dd7-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.501967 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hcbf\" (UniqueName: 
\"kubernetes.io/projected/eb5bca62-c4b9-4e79-a752-96185f22b757-kube-api-access-9hcbf\") on node \"crc\" DevicePath \"\"" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.501979 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb5bca62-c4b9-4e79-a752-96185f22b757-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.501989 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5lsn\" (UniqueName: \"kubernetes.io/projected/3e9f6fd3-3c79-48a8-a551-128f73f63dd7-kube-api-access-c5lsn\") on node \"crc\" DevicePath \"\"" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.501998 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e9f6fd3-3c79-48a8-a551-128f73f63dd7-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.587132 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-76444f977-5h7x2"] Dec 03 14:18:59 crc kubenswrapper[4751]: E1203 14:18:59.587359 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb5bca62-c4b9-4e79-a752-96185f22b757" containerName="controller-manager" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.587372 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb5bca62-c4b9-4e79-a752-96185f22b757" containerName="controller-manager" Dec 03 14:18:59 crc kubenswrapper[4751]: E1203 14:18:59.587381 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e9f6fd3-3c79-48a8-a551-128f73f63dd7" containerName="route-controller-manager" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.587387 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e9f6fd3-3c79-48a8-a551-128f73f63dd7" containerName="route-controller-manager" Dec 03 14:18:59 crc kubenswrapper[4751]: E1203 14:18:59.587400 4751 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.587406 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.587509 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb5bca62-c4b9-4e79-a752-96185f22b757" containerName="controller-manager" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.587521 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e9f6fd3-3c79-48a8-a551-128f73f63dd7" containerName="route-controller-manager" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.587534 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.587863 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-76444f977-5h7x2" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.590236 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.590575 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.591017 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.591521 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.592842 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.593997 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-598f7f77f5-t6xgq"] Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.594744 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-598f7f77f5-t6xgq" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.598283 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.605476 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.605501 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.605745 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.605845 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.606380 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.607544 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.608247 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.609559 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-598f7f77f5-t6xgq"] Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.613945 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76444f977-5h7x2"] 
Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.690694 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-76444f977-5h7x2"] Dec 03 14:18:59 crc kubenswrapper[4751]: E1203 14:18:59.691094 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config kube-api-access-r9hpx proxy-ca-bundles serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-76444f977-5h7x2" podUID="f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.703863 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk9dg\" (UniqueName: \"kubernetes.io/projected/5ec17178-9313-4102-9f0e-4697b5403499-kube-api-access-pk9dg\") pod \"route-controller-manager-598f7f77f5-t6xgq\" (UID: \"5ec17178-9313-4102-9f0e-4697b5403499\") " pod="openshift-route-controller-manager/route-controller-manager-598f7f77f5-t6xgq" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.703910 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ec17178-9313-4102-9f0e-4697b5403499-serving-cert\") pod \"route-controller-manager-598f7f77f5-t6xgq\" (UID: \"5ec17178-9313-4102-9f0e-4697b5403499\") " pod="openshift-route-controller-manager/route-controller-manager-598f7f77f5-t6xgq" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.703943 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8-proxy-ca-bundles\") pod \"controller-manager-76444f977-5h7x2\" (UID: \"f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8\") " pod="openshift-controller-manager/controller-manager-76444f977-5h7x2" Dec 03 14:18:59 crc kubenswrapper[4751]: 
I1203 14:18:59.703969 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8-config\") pod \"controller-manager-76444f977-5h7x2\" (UID: \"f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8\") " pod="openshift-controller-manager/controller-manager-76444f977-5h7x2" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.703990 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9hpx\" (UniqueName: \"kubernetes.io/projected/f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8-kube-api-access-r9hpx\") pod \"controller-manager-76444f977-5h7x2\" (UID: \"f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8\") " pod="openshift-controller-manager/controller-manager-76444f977-5h7x2" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.704090 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8-serving-cert\") pod \"controller-manager-76444f977-5h7x2\" (UID: \"f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8\") " pod="openshift-controller-manager/controller-manager-76444f977-5h7x2" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.704110 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8-client-ca\") pod \"controller-manager-76444f977-5h7x2\" (UID: \"f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8\") " pod="openshift-controller-manager/controller-manager-76444f977-5h7x2" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.704133 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ec17178-9313-4102-9f0e-4697b5403499-client-ca\") pod 
\"route-controller-manager-598f7f77f5-t6xgq\" (UID: \"5ec17178-9313-4102-9f0e-4697b5403499\") " pod="openshift-route-controller-manager/route-controller-manager-598f7f77f5-t6xgq" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.704154 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ec17178-9313-4102-9f0e-4697b5403499-config\") pod \"route-controller-manager-598f7f77f5-t6xgq\" (UID: \"5ec17178-9313-4102-9f0e-4697b5403499\") " pod="openshift-route-controller-manager/route-controller-manager-598f7f77f5-t6xgq" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.805361 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8-proxy-ca-bundles\") pod \"controller-manager-76444f977-5h7x2\" (UID: \"f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8\") " pod="openshift-controller-manager/controller-manager-76444f977-5h7x2" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.805412 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8-config\") pod \"controller-manager-76444f977-5h7x2\" (UID: \"f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8\") " pod="openshift-controller-manager/controller-manager-76444f977-5h7x2" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.805434 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9hpx\" (UniqueName: \"kubernetes.io/projected/f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8-kube-api-access-r9hpx\") pod \"controller-manager-76444f977-5h7x2\" (UID: \"f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8\") " pod="openshift-controller-manager/controller-manager-76444f977-5h7x2" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.805472 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8-serving-cert\") pod \"controller-manager-76444f977-5h7x2\" (UID: \"f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8\") " pod="openshift-controller-manager/controller-manager-76444f977-5h7x2" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.805492 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8-client-ca\") pod \"controller-manager-76444f977-5h7x2\" (UID: \"f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8\") " pod="openshift-controller-manager/controller-manager-76444f977-5h7x2" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.805517 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ec17178-9313-4102-9f0e-4697b5403499-client-ca\") pod \"route-controller-manager-598f7f77f5-t6xgq\" (UID: \"5ec17178-9313-4102-9f0e-4697b5403499\") " pod="openshift-route-controller-manager/route-controller-manager-598f7f77f5-t6xgq" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.805538 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ec17178-9313-4102-9f0e-4697b5403499-config\") pod \"route-controller-manager-598f7f77f5-t6xgq\" (UID: \"5ec17178-9313-4102-9f0e-4697b5403499\") " pod="openshift-route-controller-manager/route-controller-manager-598f7f77f5-t6xgq" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.805560 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ec17178-9313-4102-9f0e-4697b5403499-serving-cert\") pod \"route-controller-manager-598f7f77f5-t6xgq\" (UID: \"5ec17178-9313-4102-9f0e-4697b5403499\") " 
pod="openshift-route-controller-manager/route-controller-manager-598f7f77f5-t6xgq" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.805576 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk9dg\" (UniqueName: \"kubernetes.io/projected/5ec17178-9313-4102-9f0e-4697b5403499-kube-api-access-pk9dg\") pod \"route-controller-manager-598f7f77f5-t6xgq\" (UID: \"5ec17178-9313-4102-9f0e-4697b5403499\") " pod="openshift-route-controller-manager/route-controller-manager-598f7f77f5-t6xgq" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.806590 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ec17178-9313-4102-9f0e-4697b5403499-client-ca\") pod \"route-controller-manager-598f7f77f5-t6xgq\" (UID: \"5ec17178-9313-4102-9f0e-4697b5403499\") " pod="openshift-route-controller-manager/route-controller-manager-598f7f77f5-t6xgq" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.807015 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ec17178-9313-4102-9f0e-4697b5403499-config\") pod \"route-controller-manager-598f7f77f5-t6xgq\" (UID: \"5ec17178-9313-4102-9f0e-4697b5403499\") " pod="openshift-route-controller-manager/route-controller-manager-598f7f77f5-t6xgq" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.807099 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8-client-ca\") pod \"controller-manager-76444f977-5h7x2\" (UID: \"f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8\") " pod="openshift-controller-manager/controller-manager-76444f977-5h7x2" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.807354 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8-config\") pod \"controller-manager-76444f977-5h7x2\" (UID: \"f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8\") " pod="openshift-controller-manager/controller-manager-76444f977-5h7x2" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.807936 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8-proxy-ca-bundles\") pod \"controller-manager-76444f977-5h7x2\" (UID: \"f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8\") " pod="openshift-controller-manager/controller-manager-76444f977-5h7x2" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.809583 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8-serving-cert\") pod \"controller-manager-76444f977-5h7x2\" (UID: \"f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8\") " pod="openshift-controller-manager/controller-manager-76444f977-5h7x2" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.810268 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ec17178-9313-4102-9f0e-4697b5403499-serving-cert\") pod \"route-controller-manager-598f7f77f5-t6xgq\" (UID: \"5ec17178-9313-4102-9f0e-4697b5403499\") " pod="openshift-route-controller-manager/route-controller-manager-598f7f77f5-t6xgq" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.863918 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9hpx\" (UniqueName: \"kubernetes.io/projected/f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8-kube-api-access-r9hpx\") pod \"controller-manager-76444f977-5h7x2\" (UID: \"f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8\") " pod="openshift-controller-manager/controller-manager-76444f977-5h7x2" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.864124 4751 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk9dg\" (UniqueName: \"kubernetes.io/projected/5ec17178-9313-4102-9f0e-4697b5403499-kube-api-access-pk9dg\") pod \"route-controller-manager-598f7f77f5-t6xgq\" (UID: \"5ec17178-9313-4102-9f0e-4697b5403499\") " pod="openshift-route-controller-manager/route-controller-manager-598f7f77f5-t6xgq" Dec 03 14:18:59 crc kubenswrapper[4751]: I1203 14:18:59.911432 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-598f7f77f5-t6xgq" Dec 03 14:19:00 crc kubenswrapper[4751]: I1203 14:19:00.104552 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-598f7f77f5-t6xgq"] Dec 03 14:19:00 crc kubenswrapper[4751]: I1203 14:19:00.430667 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-598f7f77f5-t6xgq" event={"ID":"5ec17178-9313-4102-9f0e-4697b5403499","Type":"ContainerStarted","Data":"5ce26468d03b7e686b81489c14ddf6f0ff5fe0323fe30e86196cc21b7a82055a"} Dec 03 14:19:00 crc kubenswrapper[4751]: I1203 14:19:00.430741 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-598f7f77f5-t6xgq" event={"ID":"5ec17178-9313-4102-9f0e-4697b5403499","Type":"ContainerStarted","Data":"36d071c2bb217cbe9e86158ec84b35538f6178268d6e3c9927adca449f493f49"} Dec 03 14:19:00 crc kubenswrapper[4751]: I1203 14:19:00.430766 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-598f7f77f5-t6xgq" Dec 03 14:19:00 crc kubenswrapper[4751]: I1203 14:19:00.430691 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-76444f977-5h7x2" Dec 03 14:19:00 crc kubenswrapper[4751]: I1203 14:19:00.438139 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76444f977-5h7x2" Dec 03 14:19:00 crc kubenswrapper[4751]: I1203 14:19:00.550377 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-598f7f77f5-t6xgq" Dec 03 14:19:00 crc kubenswrapper[4751]: I1203 14:19:00.567207 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-598f7f77f5-t6xgq" podStartSLOduration=1.567188397 podStartE2EDuration="1.567188397s" podCreationTimestamp="2025-12-03 14:18:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:19:00.451026718 +0000 UTC m=+347.439381955" watchObservedRunningTime="2025-12-03 14:19:00.567188397 +0000 UTC m=+347.555543614" Dec 03 14:19:00 crc kubenswrapper[4751]: I1203 14:19:00.612936 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8-serving-cert\") pod \"f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8\" (UID: \"f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8\") " Dec 03 14:19:00 crc kubenswrapper[4751]: I1203 14:19:00.613020 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9hpx\" (UniqueName: \"kubernetes.io/projected/f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8-kube-api-access-r9hpx\") pod \"f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8\" (UID: \"f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8\") " Dec 03 14:19:00 crc kubenswrapper[4751]: I1203 14:19:00.613065 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8-config\") pod \"f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8\" (UID: \"f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8\") " Dec 03 14:19:00 crc kubenswrapper[4751]: I1203 14:19:00.613161 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8-proxy-ca-bundles\") pod \"f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8\" (UID: \"f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8\") " Dec 03 14:19:00 crc kubenswrapper[4751]: I1203 14:19:00.613188 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8-client-ca\") pod \"f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8\" (UID: \"f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8\") " Dec 03 14:19:00 crc kubenswrapper[4751]: I1203 14:19:00.614840 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8-client-ca" (OuterVolumeSpecName: "client-ca") pod "f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8" (UID: "f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:19:00 crc kubenswrapper[4751]: I1203 14:19:00.615021 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8-config" (OuterVolumeSpecName: "config") pod "f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8" (UID: "f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:19:00 crc kubenswrapper[4751]: I1203 14:19:00.615236 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8" (UID: "f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:19:00 crc kubenswrapper[4751]: I1203 14:19:00.620536 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8-kube-api-access-r9hpx" (OuterVolumeSpecName: "kube-api-access-r9hpx") pod "f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8" (UID: "f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8"). InnerVolumeSpecName "kube-api-access-r9hpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:19:00 crc kubenswrapper[4751]: I1203 14:19:00.621487 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8" (UID: "f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:19:00 crc kubenswrapper[4751]: I1203 14:19:00.714460 4751 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:00 crc kubenswrapper[4751]: I1203 14:19:00.714502 4751 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:00 crc kubenswrapper[4751]: I1203 14:19:00.714513 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:00 crc kubenswrapper[4751]: I1203 14:19:00.714528 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9hpx\" (UniqueName: \"kubernetes.io/projected/f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8-kube-api-access-r9hpx\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:00 crc kubenswrapper[4751]: I1203 14:19:00.714543 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:01 crc kubenswrapper[4751]: I1203 14:19:01.320490 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e9f6fd3-3c79-48a8-a551-128f73f63dd7" path="/var/lib/kubelet/pods/3e9f6fd3-3c79-48a8-a551-128f73f63dd7/volumes" Dec 03 14:19:01 crc kubenswrapper[4751]: I1203 14:19:01.321113 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb5bca62-c4b9-4e79-a752-96185f22b757" path="/var/lib/kubelet/pods/eb5bca62-c4b9-4e79-a752-96185f22b757/volumes" Dec 03 14:19:01 crc kubenswrapper[4751]: I1203 14:19:01.435790 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-76444f977-5h7x2" Dec 03 14:19:01 crc kubenswrapper[4751]: I1203 14:19:01.462664 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6dd96d466b-d2w9w"] Dec 03 14:19:01 crc kubenswrapper[4751]: I1203 14:19:01.463523 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dd96d466b-d2w9w" Dec 03 14:19:01 crc kubenswrapper[4751]: I1203 14:19:01.465415 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 14:19:01 crc kubenswrapper[4751]: I1203 14:19:01.466340 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 14:19:01 crc kubenswrapper[4751]: I1203 14:19:01.466411 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 14:19:01 crc kubenswrapper[4751]: I1203 14:19:01.466515 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 14:19:01 crc kubenswrapper[4751]: I1203 14:19:01.466351 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 14:19:01 crc kubenswrapper[4751]: I1203 14:19:01.466630 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 14:19:01 crc kubenswrapper[4751]: I1203 14:19:01.468611 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-76444f977-5h7x2"] Dec 03 14:19:01 crc kubenswrapper[4751]: I1203 14:19:01.475766 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 14:19:01 crc 
kubenswrapper[4751]: I1203 14:19:01.482177 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-76444f977-5h7x2"] Dec 03 14:19:01 crc kubenswrapper[4751]: I1203 14:19:01.486108 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6dd96d466b-d2w9w"] Dec 03 14:19:01 crc kubenswrapper[4751]: I1203 14:19:01.623798 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xsj8\" (UniqueName: \"kubernetes.io/projected/3c4c4cf2-939f-425a-a7b6-892bd1c9fade-kube-api-access-7xsj8\") pod \"controller-manager-6dd96d466b-d2w9w\" (UID: \"3c4c4cf2-939f-425a-a7b6-892bd1c9fade\") " pod="openshift-controller-manager/controller-manager-6dd96d466b-d2w9w" Dec 03 14:19:01 crc kubenswrapper[4751]: I1203 14:19:01.623915 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c4c4cf2-939f-425a-a7b6-892bd1c9fade-client-ca\") pod \"controller-manager-6dd96d466b-d2w9w\" (UID: \"3c4c4cf2-939f-425a-a7b6-892bd1c9fade\") " pod="openshift-controller-manager/controller-manager-6dd96d466b-d2w9w" Dec 03 14:19:01 crc kubenswrapper[4751]: I1203 14:19:01.623999 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c4c4cf2-939f-425a-a7b6-892bd1c9fade-serving-cert\") pod \"controller-manager-6dd96d466b-d2w9w\" (UID: \"3c4c4cf2-939f-425a-a7b6-892bd1c9fade\") " pod="openshift-controller-manager/controller-manager-6dd96d466b-d2w9w" Dec 03 14:19:01 crc kubenswrapper[4751]: I1203 14:19:01.624690 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c4c4cf2-939f-425a-a7b6-892bd1c9fade-proxy-ca-bundles\") pod 
\"controller-manager-6dd96d466b-d2w9w\" (UID: \"3c4c4cf2-939f-425a-a7b6-892bd1c9fade\") " pod="openshift-controller-manager/controller-manager-6dd96d466b-d2w9w" Dec 03 14:19:01 crc kubenswrapper[4751]: I1203 14:19:01.624835 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c4c4cf2-939f-425a-a7b6-892bd1c9fade-config\") pod \"controller-manager-6dd96d466b-d2w9w\" (UID: \"3c4c4cf2-939f-425a-a7b6-892bd1c9fade\") " pod="openshift-controller-manager/controller-manager-6dd96d466b-d2w9w" Dec 03 14:19:01 crc kubenswrapper[4751]: I1203 14:19:01.726460 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c4c4cf2-939f-425a-a7b6-892bd1c9fade-client-ca\") pod \"controller-manager-6dd96d466b-d2w9w\" (UID: \"3c4c4cf2-939f-425a-a7b6-892bd1c9fade\") " pod="openshift-controller-manager/controller-manager-6dd96d466b-d2w9w" Dec 03 14:19:01 crc kubenswrapper[4751]: I1203 14:19:01.726538 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c4c4cf2-939f-425a-a7b6-892bd1c9fade-serving-cert\") pod \"controller-manager-6dd96d466b-d2w9w\" (UID: \"3c4c4cf2-939f-425a-a7b6-892bd1c9fade\") " pod="openshift-controller-manager/controller-manager-6dd96d466b-d2w9w" Dec 03 14:19:01 crc kubenswrapper[4751]: I1203 14:19:01.726583 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c4c4cf2-939f-425a-a7b6-892bd1c9fade-proxy-ca-bundles\") pod \"controller-manager-6dd96d466b-d2w9w\" (UID: \"3c4c4cf2-939f-425a-a7b6-892bd1c9fade\") " pod="openshift-controller-manager/controller-manager-6dd96d466b-d2w9w" Dec 03 14:19:01 crc kubenswrapper[4751]: I1203 14:19:01.726618 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/3c4c4cf2-939f-425a-a7b6-892bd1c9fade-config\") pod \"controller-manager-6dd96d466b-d2w9w\" (UID: \"3c4c4cf2-939f-425a-a7b6-892bd1c9fade\") " pod="openshift-controller-manager/controller-manager-6dd96d466b-d2w9w" Dec 03 14:19:01 crc kubenswrapper[4751]: I1203 14:19:01.726653 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xsj8\" (UniqueName: \"kubernetes.io/projected/3c4c4cf2-939f-425a-a7b6-892bd1c9fade-kube-api-access-7xsj8\") pod \"controller-manager-6dd96d466b-d2w9w\" (UID: \"3c4c4cf2-939f-425a-a7b6-892bd1c9fade\") " pod="openshift-controller-manager/controller-manager-6dd96d466b-d2w9w" Dec 03 14:19:01 crc kubenswrapper[4751]: I1203 14:19:01.728143 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c4c4cf2-939f-425a-a7b6-892bd1c9fade-proxy-ca-bundles\") pod \"controller-manager-6dd96d466b-d2w9w\" (UID: \"3c4c4cf2-939f-425a-a7b6-892bd1c9fade\") " pod="openshift-controller-manager/controller-manager-6dd96d466b-d2w9w" Dec 03 14:19:01 crc kubenswrapper[4751]: I1203 14:19:01.728316 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c4c4cf2-939f-425a-a7b6-892bd1c9fade-config\") pod \"controller-manager-6dd96d466b-d2w9w\" (UID: \"3c4c4cf2-939f-425a-a7b6-892bd1c9fade\") " pod="openshift-controller-manager/controller-manager-6dd96d466b-d2w9w" Dec 03 14:19:01 crc kubenswrapper[4751]: I1203 14:19:01.728512 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c4c4cf2-939f-425a-a7b6-892bd1c9fade-client-ca\") pod \"controller-manager-6dd96d466b-d2w9w\" (UID: \"3c4c4cf2-939f-425a-a7b6-892bd1c9fade\") " pod="openshift-controller-manager/controller-manager-6dd96d466b-d2w9w" Dec 03 14:19:01 crc kubenswrapper[4751]: I1203 14:19:01.731118 4751 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c4c4cf2-939f-425a-a7b6-892bd1c9fade-serving-cert\") pod \"controller-manager-6dd96d466b-d2w9w\" (UID: \"3c4c4cf2-939f-425a-a7b6-892bd1c9fade\") " pod="openshift-controller-manager/controller-manager-6dd96d466b-d2w9w" Dec 03 14:19:01 crc kubenswrapper[4751]: I1203 14:19:01.743116 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xsj8\" (UniqueName: \"kubernetes.io/projected/3c4c4cf2-939f-425a-a7b6-892bd1c9fade-kube-api-access-7xsj8\") pod \"controller-manager-6dd96d466b-d2w9w\" (UID: \"3c4c4cf2-939f-425a-a7b6-892bd1c9fade\") " pod="openshift-controller-manager/controller-manager-6dd96d466b-d2w9w" Dec 03 14:19:01 crc kubenswrapper[4751]: I1203 14:19:01.778284 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dd96d466b-d2w9w" Dec 03 14:19:01 crc kubenswrapper[4751]: I1203 14:19:01.949587 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6dd96d466b-d2w9w"] Dec 03 14:19:01 crc kubenswrapper[4751]: W1203 14:19:01.959516 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c4c4cf2_939f_425a_a7b6_892bd1c9fade.slice/crio-d6613c64adda78873f07fabdaf7f1c2b08c0101d3b725c092f7703e8b2fb0645 WatchSource:0}: Error finding container d6613c64adda78873f07fabdaf7f1c2b08c0101d3b725c092f7703e8b2fb0645: Status 404 returned error can't find the container with id d6613c64adda78873f07fabdaf7f1c2b08c0101d3b725c092f7703e8b2fb0645 Dec 03 14:19:02 crc kubenswrapper[4751]: I1203 14:19:02.441071 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dd96d466b-d2w9w" 
event={"ID":"3c4c4cf2-939f-425a-a7b6-892bd1c9fade","Type":"ContainerStarted","Data":"1e5db3017457b95dc44ff0da0d7de666b6694b556ad04e4141d4339b06c88900"} Dec 03 14:19:02 crc kubenswrapper[4751]: I1203 14:19:02.441111 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dd96d466b-d2w9w" event={"ID":"3c4c4cf2-939f-425a-a7b6-892bd1c9fade","Type":"ContainerStarted","Data":"d6613c64adda78873f07fabdaf7f1c2b08c0101d3b725c092f7703e8b2fb0645"} Dec 03 14:19:02 crc kubenswrapper[4751]: I1203 14:19:02.482257 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6dd96d466b-d2w9w" podStartSLOduration=3.482239124 podStartE2EDuration="3.482239124s" podCreationTimestamp="2025-12-03 14:18:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:19:02.480404873 +0000 UTC m=+349.468760100" watchObservedRunningTime="2025-12-03 14:19:02.482239124 +0000 UTC m=+349.470594341" Dec 03 14:19:03 crc kubenswrapper[4751]: I1203 14:19:03.323155 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8" path="/var/lib/kubelet/pods/f0e66d32-ed8b-4f9e-8dea-bbdbacd5a3b8/volumes" Dec 03 14:19:03 crc kubenswrapper[4751]: I1203 14:19:03.446585 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6dd96d466b-d2w9w" Dec 03 14:19:03 crc kubenswrapper[4751]: I1203 14:19:03.451518 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6dd96d466b-d2w9w" Dec 03 14:19:18 crc kubenswrapper[4751]: I1203 14:19:18.354973 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6dd96d466b-d2w9w"] Dec 03 14:19:18 crc kubenswrapper[4751]: I1203 
14:19:18.355888 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6dd96d466b-d2w9w" podUID="3c4c4cf2-939f-425a-a7b6-892bd1c9fade" containerName="controller-manager" containerID="cri-o://1e5db3017457b95dc44ff0da0d7de666b6694b556ad04e4141d4339b06c88900" gracePeriod=30 Dec 03 14:19:18 crc kubenswrapper[4751]: I1203 14:19:18.529508 4751 generic.go:334] "Generic (PLEG): container finished" podID="3c4c4cf2-939f-425a-a7b6-892bd1c9fade" containerID="1e5db3017457b95dc44ff0da0d7de666b6694b556ad04e4141d4339b06c88900" exitCode=0 Dec 03 14:19:18 crc kubenswrapper[4751]: I1203 14:19:18.529547 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dd96d466b-d2w9w" event={"ID":"3c4c4cf2-939f-425a-a7b6-892bd1c9fade","Type":"ContainerDied","Data":"1e5db3017457b95dc44ff0da0d7de666b6694b556ad04e4141d4339b06c88900"} Dec 03 14:19:18 crc kubenswrapper[4751]: I1203 14:19:18.908602 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6dd96d466b-d2w9w" Dec 03 14:19:19 crc kubenswrapper[4751]: I1203 14:19:19.032721 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c4c4cf2-939f-425a-a7b6-892bd1c9fade-client-ca\") pod \"3c4c4cf2-939f-425a-a7b6-892bd1c9fade\" (UID: \"3c4c4cf2-939f-425a-a7b6-892bd1c9fade\") " Dec 03 14:19:19 crc kubenswrapper[4751]: I1203 14:19:19.032773 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c4c4cf2-939f-425a-a7b6-892bd1c9fade-proxy-ca-bundles\") pod \"3c4c4cf2-939f-425a-a7b6-892bd1c9fade\" (UID: \"3c4c4cf2-939f-425a-a7b6-892bd1c9fade\") " Dec 03 14:19:19 crc kubenswrapper[4751]: I1203 14:19:19.032790 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c4c4cf2-939f-425a-a7b6-892bd1c9fade-config\") pod \"3c4c4cf2-939f-425a-a7b6-892bd1c9fade\" (UID: \"3c4c4cf2-939f-425a-a7b6-892bd1c9fade\") " Dec 03 14:19:19 crc kubenswrapper[4751]: I1203 14:19:19.032821 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c4c4cf2-939f-425a-a7b6-892bd1c9fade-serving-cert\") pod \"3c4c4cf2-939f-425a-a7b6-892bd1c9fade\" (UID: \"3c4c4cf2-939f-425a-a7b6-892bd1c9fade\") " Dec 03 14:19:19 crc kubenswrapper[4751]: I1203 14:19:19.032855 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xsj8\" (UniqueName: \"kubernetes.io/projected/3c4c4cf2-939f-425a-a7b6-892bd1c9fade-kube-api-access-7xsj8\") pod \"3c4c4cf2-939f-425a-a7b6-892bd1c9fade\" (UID: \"3c4c4cf2-939f-425a-a7b6-892bd1c9fade\") " Dec 03 14:19:19 crc kubenswrapper[4751]: I1203 14:19:19.034227 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/3c4c4cf2-939f-425a-a7b6-892bd1c9fade-client-ca" (OuterVolumeSpecName: "client-ca") pod "3c4c4cf2-939f-425a-a7b6-892bd1c9fade" (UID: "3c4c4cf2-939f-425a-a7b6-892bd1c9fade"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:19:19 crc kubenswrapper[4751]: I1203 14:19:19.034217 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c4c4cf2-939f-425a-a7b6-892bd1c9fade-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3c4c4cf2-939f-425a-a7b6-892bd1c9fade" (UID: "3c4c4cf2-939f-425a-a7b6-892bd1c9fade"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:19:19 crc kubenswrapper[4751]: I1203 14:19:19.034309 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c4c4cf2-939f-425a-a7b6-892bd1c9fade-config" (OuterVolumeSpecName: "config") pod "3c4c4cf2-939f-425a-a7b6-892bd1c9fade" (UID: "3c4c4cf2-939f-425a-a7b6-892bd1c9fade"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:19:19 crc kubenswrapper[4751]: I1203 14:19:19.038492 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c4c4cf2-939f-425a-a7b6-892bd1c9fade-kube-api-access-7xsj8" (OuterVolumeSpecName: "kube-api-access-7xsj8") pod "3c4c4cf2-939f-425a-a7b6-892bd1c9fade" (UID: "3c4c4cf2-939f-425a-a7b6-892bd1c9fade"). InnerVolumeSpecName "kube-api-access-7xsj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:19:19 crc kubenswrapper[4751]: I1203 14:19:19.040475 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c4c4cf2-939f-425a-a7b6-892bd1c9fade-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3c4c4cf2-939f-425a-a7b6-892bd1c9fade" (UID: "3c4c4cf2-939f-425a-a7b6-892bd1c9fade"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:19:19 crc kubenswrapper[4751]: I1203 14:19:19.134706 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c4c4cf2-939f-425a-a7b6-892bd1c9fade-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:19 crc kubenswrapper[4751]: I1203 14:19:19.134741 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xsj8\" (UniqueName: \"kubernetes.io/projected/3c4c4cf2-939f-425a-a7b6-892bd1c9fade-kube-api-access-7xsj8\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:19 crc kubenswrapper[4751]: I1203 14:19:19.134754 4751 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c4c4cf2-939f-425a-a7b6-892bd1c9fade-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:19 crc kubenswrapper[4751]: I1203 14:19:19.134763 4751 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c4c4cf2-939f-425a-a7b6-892bd1c9fade-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:19 crc kubenswrapper[4751]: I1203 14:19:19.134772 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c4c4cf2-939f-425a-a7b6-892bd1c9fade-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:19 crc kubenswrapper[4751]: I1203 14:19:19.536934 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dd96d466b-d2w9w" event={"ID":"3c4c4cf2-939f-425a-a7b6-892bd1c9fade","Type":"ContainerDied","Data":"d6613c64adda78873f07fabdaf7f1c2b08c0101d3b725c092f7703e8b2fb0645"} Dec 03 14:19:19 crc kubenswrapper[4751]: I1203 14:19:19.536998 4751 scope.go:117] "RemoveContainer" containerID="1e5db3017457b95dc44ff0da0d7de666b6694b556ad04e4141d4339b06c88900" Dec 03 14:19:19 crc kubenswrapper[4751]: I1203 14:19:19.537037 4751 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dd96d466b-d2w9w" Dec 03 14:19:19 crc kubenswrapper[4751]: I1203 14:19:19.564237 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6dd96d466b-d2w9w"] Dec 03 14:19:19 crc kubenswrapper[4751]: I1203 14:19:19.568434 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6dd96d466b-d2w9w"] Dec 03 14:19:20 crc kubenswrapper[4751]: I1203 14:19:20.345839 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-76444f977-mgk4d"] Dec 03 14:19:20 crc kubenswrapper[4751]: E1203 14:19:20.346271 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c4c4cf2-939f-425a-a7b6-892bd1c9fade" containerName="controller-manager" Dec 03 14:19:20 crc kubenswrapper[4751]: I1203 14:19:20.346282 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c4c4cf2-939f-425a-a7b6-892bd1c9fade" containerName="controller-manager" Dec 03 14:19:20 crc kubenswrapper[4751]: I1203 14:19:20.346386 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c4c4cf2-939f-425a-a7b6-892bd1c9fade" containerName="controller-manager" Dec 03 14:19:20 crc kubenswrapper[4751]: I1203 14:19:20.346712 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-76444f977-mgk4d" Dec 03 14:19:20 crc kubenswrapper[4751]: I1203 14:19:20.349116 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 14:19:20 crc kubenswrapper[4751]: I1203 14:19:20.350443 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 14:19:20 crc kubenswrapper[4751]: I1203 14:19:20.350729 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 14:19:20 crc kubenswrapper[4751]: I1203 14:19:20.350771 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 14:19:20 crc kubenswrapper[4751]: I1203 14:19:20.351098 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 14:19:20 crc kubenswrapper[4751]: I1203 14:19:20.354660 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 14:19:20 crc kubenswrapper[4751]: I1203 14:19:20.361433 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 14:19:20 crc kubenswrapper[4751]: I1203 14:19:20.380066 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76444f977-mgk4d"] Dec 03 14:19:20 crc kubenswrapper[4751]: I1203 14:19:20.451492 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44f65b20-49c0-49cf-8b14-fea827c5a3d9-config\") pod \"controller-manager-76444f977-mgk4d\" (UID: \"44f65b20-49c0-49cf-8b14-fea827c5a3d9\") " 
pod="openshift-controller-manager/controller-manager-76444f977-mgk4d" Dec 03 14:19:20 crc kubenswrapper[4751]: I1203 14:19:20.451655 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44f65b20-49c0-49cf-8b14-fea827c5a3d9-client-ca\") pod \"controller-manager-76444f977-mgk4d\" (UID: \"44f65b20-49c0-49cf-8b14-fea827c5a3d9\") " pod="openshift-controller-manager/controller-manager-76444f977-mgk4d" Dec 03 14:19:20 crc kubenswrapper[4751]: I1203 14:19:20.451723 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44f65b20-49c0-49cf-8b14-fea827c5a3d9-serving-cert\") pod \"controller-manager-76444f977-mgk4d\" (UID: \"44f65b20-49c0-49cf-8b14-fea827c5a3d9\") " pod="openshift-controller-manager/controller-manager-76444f977-mgk4d" Dec 03 14:19:20 crc kubenswrapper[4751]: I1203 14:19:20.451767 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/44f65b20-49c0-49cf-8b14-fea827c5a3d9-proxy-ca-bundles\") pod \"controller-manager-76444f977-mgk4d\" (UID: \"44f65b20-49c0-49cf-8b14-fea827c5a3d9\") " pod="openshift-controller-manager/controller-manager-76444f977-mgk4d" Dec 03 14:19:20 crc kubenswrapper[4751]: I1203 14:19:20.451814 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qdb5\" (UniqueName: \"kubernetes.io/projected/44f65b20-49c0-49cf-8b14-fea827c5a3d9-kube-api-access-6qdb5\") pod \"controller-manager-76444f977-mgk4d\" (UID: \"44f65b20-49c0-49cf-8b14-fea827c5a3d9\") " pod="openshift-controller-manager/controller-manager-76444f977-mgk4d" Dec 03 14:19:20 crc kubenswrapper[4751]: I1203 14:19:20.553644 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/44f65b20-49c0-49cf-8b14-fea827c5a3d9-client-ca\") pod \"controller-manager-76444f977-mgk4d\" (UID: \"44f65b20-49c0-49cf-8b14-fea827c5a3d9\") " pod="openshift-controller-manager/controller-manager-76444f977-mgk4d" Dec 03 14:19:20 crc kubenswrapper[4751]: I1203 14:19:20.553750 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44f65b20-49c0-49cf-8b14-fea827c5a3d9-serving-cert\") pod \"controller-manager-76444f977-mgk4d\" (UID: \"44f65b20-49c0-49cf-8b14-fea827c5a3d9\") " pod="openshift-controller-manager/controller-manager-76444f977-mgk4d" Dec 03 14:19:20 crc kubenswrapper[4751]: I1203 14:19:20.553790 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/44f65b20-49c0-49cf-8b14-fea827c5a3d9-proxy-ca-bundles\") pod \"controller-manager-76444f977-mgk4d\" (UID: \"44f65b20-49c0-49cf-8b14-fea827c5a3d9\") " pod="openshift-controller-manager/controller-manager-76444f977-mgk4d" Dec 03 14:19:20 crc kubenswrapper[4751]: I1203 14:19:20.553835 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qdb5\" (UniqueName: \"kubernetes.io/projected/44f65b20-49c0-49cf-8b14-fea827c5a3d9-kube-api-access-6qdb5\") pod \"controller-manager-76444f977-mgk4d\" (UID: \"44f65b20-49c0-49cf-8b14-fea827c5a3d9\") " pod="openshift-controller-manager/controller-manager-76444f977-mgk4d" Dec 03 14:19:20 crc kubenswrapper[4751]: I1203 14:19:20.553922 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44f65b20-49c0-49cf-8b14-fea827c5a3d9-config\") pod \"controller-manager-76444f977-mgk4d\" (UID: \"44f65b20-49c0-49cf-8b14-fea827c5a3d9\") " pod="openshift-controller-manager/controller-manager-76444f977-mgk4d" Dec 03 14:19:20 crc kubenswrapper[4751]: I1203 14:19:20.555155 4751 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44f65b20-49c0-49cf-8b14-fea827c5a3d9-client-ca\") pod \"controller-manager-76444f977-mgk4d\" (UID: \"44f65b20-49c0-49cf-8b14-fea827c5a3d9\") " pod="openshift-controller-manager/controller-manager-76444f977-mgk4d" Dec 03 14:19:20 crc kubenswrapper[4751]: I1203 14:19:20.556607 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/44f65b20-49c0-49cf-8b14-fea827c5a3d9-proxy-ca-bundles\") pod \"controller-manager-76444f977-mgk4d\" (UID: \"44f65b20-49c0-49cf-8b14-fea827c5a3d9\") " pod="openshift-controller-manager/controller-manager-76444f977-mgk4d" Dec 03 14:19:20 crc kubenswrapper[4751]: I1203 14:19:20.557176 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44f65b20-49c0-49cf-8b14-fea827c5a3d9-config\") pod \"controller-manager-76444f977-mgk4d\" (UID: \"44f65b20-49c0-49cf-8b14-fea827c5a3d9\") " pod="openshift-controller-manager/controller-manager-76444f977-mgk4d" Dec 03 14:19:20 crc kubenswrapper[4751]: I1203 14:19:20.559319 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44f65b20-49c0-49cf-8b14-fea827c5a3d9-serving-cert\") pod \"controller-manager-76444f977-mgk4d\" (UID: \"44f65b20-49c0-49cf-8b14-fea827c5a3d9\") " pod="openshift-controller-manager/controller-manager-76444f977-mgk4d" Dec 03 14:19:20 crc kubenswrapper[4751]: I1203 14:19:20.572849 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qdb5\" (UniqueName: \"kubernetes.io/projected/44f65b20-49c0-49cf-8b14-fea827c5a3d9-kube-api-access-6qdb5\") pod \"controller-manager-76444f977-mgk4d\" (UID: \"44f65b20-49c0-49cf-8b14-fea827c5a3d9\") " pod="openshift-controller-manager/controller-manager-76444f977-mgk4d" Dec 03 14:19:20 crc 
kubenswrapper[4751]: I1203 14:19:20.673722 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76444f977-mgk4d" Dec 03 14:19:20 crc kubenswrapper[4751]: I1203 14:19:20.932867 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76444f977-mgk4d"] Dec 03 14:19:20 crc kubenswrapper[4751]: W1203 14:19:20.944420 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44f65b20_49c0_49cf_8b14_fea827c5a3d9.slice/crio-4620aa40f8db6747accdcd7a27dcf75e7b29470d48f489774eaf581a59c7970e WatchSource:0}: Error finding container 4620aa40f8db6747accdcd7a27dcf75e7b29470d48f489774eaf581a59c7970e: Status 404 returned error can't find the container with id 4620aa40f8db6747accdcd7a27dcf75e7b29470d48f489774eaf581a59c7970e Dec 03 14:19:21 crc kubenswrapper[4751]: I1203 14:19:21.320024 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c4c4cf2-939f-425a-a7b6-892bd1c9fade" path="/var/lib/kubelet/pods/3c4c4cf2-939f-425a-a7b6-892bd1c9fade/volumes" Dec 03 14:19:21 crc kubenswrapper[4751]: I1203 14:19:21.549127 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76444f977-mgk4d" event={"ID":"44f65b20-49c0-49cf-8b14-fea827c5a3d9","Type":"ContainerStarted","Data":"973eaec64cafd8fa7f19a95b7d0d45343e074b99fcadcd11f49b0eae23922c61"} Dec 03 14:19:21 crc kubenswrapper[4751]: I1203 14:19:21.549466 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76444f977-mgk4d" event={"ID":"44f65b20-49c0-49cf-8b14-fea827c5a3d9","Type":"ContainerStarted","Data":"4620aa40f8db6747accdcd7a27dcf75e7b29470d48f489774eaf581a59c7970e"} Dec 03 14:19:21 crc kubenswrapper[4751]: I1203 14:19:21.549827 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-76444f977-mgk4d" Dec 03 14:19:21 crc kubenswrapper[4751]: I1203 14:19:21.554035 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-76444f977-mgk4d" Dec 03 14:19:21 crc kubenswrapper[4751]: I1203 14:19:21.564728 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-76444f977-mgk4d" podStartSLOduration=3.564708796 podStartE2EDuration="3.564708796s" podCreationTimestamp="2025-12-03 14:19:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:19:21.563871113 +0000 UTC m=+368.552226330" watchObservedRunningTime="2025-12-03 14:19:21.564708796 +0000 UTC m=+368.553064013" Dec 03 14:19:35 crc kubenswrapper[4751]: I1203 14:19:35.820061 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:19:35 crc kubenswrapper[4751]: I1203 14:19:35.820650 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:19:40 crc kubenswrapper[4751]: I1203 14:19:40.932322 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-b5n6q"] Dec 03 14:19:40 crc kubenswrapper[4751]: I1203 14:19:40.933873 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-b5n6q" Dec 03 14:19:40 crc kubenswrapper[4751]: I1203 14:19:40.957567 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-b5n6q"] Dec 03 14:19:41 crc kubenswrapper[4751]: I1203 14:19:41.028968 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9968046d-e1f9-4644-811b-45e9638d2ed4-registry-tls\") pod \"image-registry-66df7c8f76-b5n6q\" (UID: \"9968046d-e1f9-4644-811b-45e9638d2ed4\") " pod="openshift-image-registry/image-registry-66df7c8f76-b5n6q" Dec 03 14:19:41 crc kubenswrapper[4751]: I1203 14:19:41.029023 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9968046d-e1f9-4644-811b-45e9638d2ed4-registry-certificates\") pod \"image-registry-66df7c8f76-b5n6q\" (UID: \"9968046d-e1f9-4644-811b-45e9638d2ed4\") " pod="openshift-image-registry/image-registry-66df7c8f76-b5n6q" Dec 03 14:19:41 crc kubenswrapper[4751]: I1203 14:19:41.029039 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9968046d-e1f9-4644-811b-45e9638d2ed4-bound-sa-token\") pod \"image-registry-66df7c8f76-b5n6q\" (UID: \"9968046d-e1f9-4644-811b-45e9638d2ed4\") " pod="openshift-image-registry/image-registry-66df7c8f76-b5n6q" Dec 03 14:19:41 crc kubenswrapper[4751]: I1203 14:19:41.029071 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9968046d-e1f9-4644-811b-45e9638d2ed4-installation-pull-secrets\") pod \"image-registry-66df7c8f76-b5n6q\" (UID: \"9968046d-e1f9-4644-811b-45e9638d2ed4\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-b5n6q"
Dec 03 14:19:41 crc kubenswrapper[4751]: I1203 14:19:41.029200 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-b5n6q\" (UID: \"9968046d-e1f9-4644-811b-45e9638d2ed4\") " pod="openshift-image-registry/image-registry-66df7c8f76-b5n6q"
Dec 03 14:19:41 crc kubenswrapper[4751]: I1203 14:19:41.029337 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9968046d-e1f9-4644-811b-45e9638d2ed4-ca-trust-extracted\") pod \"image-registry-66df7c8f76-b5n6q\" (UID: \"9968046d-e1f9-4644-811b-45e9638d2ed4\") " pod="openshift-image-registry/image-registry-66df7c8f76-b5n6q"
Dec 03 14:19:41 crc kubenswrapper[4751]: I1203 14:19:41.029431 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9968046d-e1f9-4644-811b-45e9638d2ed4-trusted-ca\") pod \"image-registry-66df7c8f76-b5n6q\" (UID: \"9968046d-e1f9-4644-811b-45e9638d2ed4\") " pod="openshift-image-registry/image-registry-66df7c8f76-b5n6q"
Dec 03 14:19:41 crc kubenswrapper[4751]: I1203 14:19:41.029485 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6nvr\" (UniqueName: \"kubernetes.io/projected/9968046d-e1f9-4644-811b-45e9638d2ed4-kube-api-access-g6nvr\") pod \"image-registry-66df7c8f76-b5n6q\" (UID: \"9968046d-e1f9-4644-811b-45e9638d2ed4\") " pod="openshift-image-registry/image-registry-66df7c8f76-b5n6q"
Dec 03 14:19:41 crc kubenswrapper[4751]: I1203 14:19:41.052202 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-b5n6q\" (UID: \"9968046d-e1f9-4644-811b-45e9638d2ed4\") " pod="openshift-image-registry/image-registry-66df7c8f76-b5n6q"
Dec 03 14:19:41 crc kubenswrapper[4751]: I1203 14:19:41.130724 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9968046d-e1f9-4644-811b-45e9638d2ed4-ca-trust-extracted\") pod \"image-registry-66df7c8f76-b5n6q\" (UID: \"9968046d-e1f9-4644-811b-45e9638d2ed4\") " pod="openshift-image-registry/image-registry-66df7c8f76-b5n6q"
Dec 03 14:19:41 crc kubenswrapper[4751]: I1203 14:19:41.130780 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9968046d-e1f9-4644-811b-45e9638d2ed4-trusted-ca\") pod \"image-registry-66df7c8f76-b5n6q\" (UID: \"9968046d-e1f9-4644-811b-45e9638d2ed4\") " pod="openshift-image-registry/image-registry-66df7c8f76-b5n6q"
Dec 03 14:19:41 crc kubenswrapper[4751]: I1203 14:19:41.130814 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6nvr\" (UniqueName: \"kubernetes.io/projected/9968046d-e1f9-4644-811b-45e9638d2ed4-kube-api-access-g6nvr\") pod \"image-registry-66df7c8f76-b5n6q\" (UID: \"9968046d-e1f9-4644-811b-45e9638d2ed4\") " pod="openshift-image-registry/image-registry-66df7c8f76-b5n6q"
Dec 03 14:19:41 crc kubenswrapper[4751]: I1203 14:19:41.130855 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9968046d-e1f9-4644-811b-45e9638d2ed4-registry-tls\") pod \"image-registry-66df7c8f76-b5n6q\" (UID: \"9968046d-e1f9-4644-811b-45e9638d2ed4\") " pod="openshift-image-registry/image-registry-66df7c8f76-b5n6q"
Dec 03 14:19:41 crc kubenswrapper[4751]: I1203 14:19:41.130891 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9968046d-e1f9-4644-811b-45e9638d2ed4-registry-certificates\") pod \"image-registry-66df7c8f76-b5n6q\" (UID: \"9968046d-e1f9-4644-811b-45e9638d2ed4\") " pod="openshift-image-registry/image-registry-66df7c8f76-b5n6q"
Dec 03 14:19:41 crc kubenswrapper[4751]: I1203 14:19:41.130911 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9968046d-e1f9-4644-811b-45e9638d2ed4-bound-sa-token\") pod \"image-registry-66df7c8f76-b5n6q\" (UID: \"9968046d-e1f9-4644-811b-45e9638d2ed4\") " pod="openshift-image-registry/image-registry-66df7c8f76-b5n6q"
Dec 03 14:19:41 crc kubenswrapper[4751]: I1203 14:19:41.130937 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9968046d-e1f9-4644-811b-45e9638d2ed4-installation-pull-secrets\") pod \"image-registry-66df7c8f76-b5n6q\" (UID: \"9968046d-e1f9-4644-811b-45e9638d2ed4\") " pod="openshift-image-registry/image-registry-66df7c8f76-b5n6q"
Dec 03 14:19:41 crc kubenswrapper[4751]: I1203 14:19:41.131149 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9968046d-e1f9-4644-811b-45e9638d2ed4-ca-trust-extracted\") pod \"image-registry-66df7c8f76-b5n6q\" (UID: \"9968046d-e1f9-4644-811b-45e9638d2ed4\") " pod="openshift-image-registry/image-registry-66df7c8f76-b5n6q"
Dec 03 14:19:41 crc kubenswrapper[4751]: I1203 14:19:41.132898 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9968046d-e1f9-4644-811b-45e9638d2ed4-trusted-ca\") pod \"image-registry-66df7c8f76-b5n6q\" (UID: \"9968046d-e1f9-4644-811b-45e9638d2ed4\") " pod="openshift-image-registry/image-registry-66df7c8f76-b5n6q"
Dec 03 14:19:41 crc kubenswrapper[4751]: I1203 14:19:41.133106 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9968046d-e1f9-4644-811b-45e9638d2ed4-registry-certificates\") pod \"image-registry-66df7c8f76-b5n6q\" (UID: \"9968046d-e1f9-4644-811b-45e9638d2ed4\") " pod="openshift-image-registry/image-registry-66df7c8f76-b5n6q"
Dec 03 14:19:41 crc kubenswrapper[4751]: I1203 14:19:41.136881 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9968046d-e1f9-4644-811b-45e9638d2ed4-installation-pull-secrets\") pod \"image-registry-66df7c8f76-b5n6q\" (UID: \"9968046d-e1f9-4644-811b-45e9638d2ed4\") " pod="openshift-image-registry/image-registry-66df7c8f76-b5n6q"
Dec 03 14:19:41 crc kubenswrapper[4751]: I1203 14:19:41.139892 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9968046d-e1f9-4644-811b-45e9638d2ed4-registry-tls\") pod \"image-registry-66df7c8f76-b5n6q\" (UID: \"9968046d-e1f9-4644-811b-45e9638d2ed4\") " pod="openshift-image-registry/image-registry-66df7c8f76-b5n6q"
Dec 03 14:19:41 crc kubenswrapper[4751]: I1203 14:19:41.153717 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9968046d-e1f9-4644-811b-45e9638d2ed4-bound-sa-token\") pod \"image-registry-66df7c8f76-b5n6q\" (UID: \"9968046d-e1f9-4644-811b-45e9638d2ed4\") " pod="openshift-image-registry/image-registry-66df7c8f76-b5n6q"
Dec 03 14:19:41 crc kubenswrapper[4751]: I1203 14:19:41.154453 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6nvr\" (UniqueName: \"kubernetes.io/projected/9968046d-e1f9-4644-811b-45e9638d2ed4-kube-api-access-g6nvr\") pod \"image-registry-66df7c8f76-b5n6q\" (UID: \"9968046d-e1f9-4644-811b-45e9638d2ed4\") " pod="openshift-image-registry/image-registry-66df7c8f76-b5n6q"
Dec 03 14:19:41 crc kubenswrapper[4751]: I1203 14:19:41.249694 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-b5n6q"
Dec 03 14:19:41 crc kubenswrapper[4751]: I1203 14:19:41.630372 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-b5n6q"]
Dec 03 14:19:41 crc kubenswrapper[4751]: W1203 14:19:41.632544 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9968046d_e1f9_4644_811b_45e9638d2ed4.slice/crio-99a8cba74020dc162eee19e6f67524d3521b58a18fe6559d16d90cb0544ed980 WatchSource:0}: Error finding container 99a8cba74020dc162eee19e6f67524d3521b58a18fe6559d16d90cb0544ed980: Status 404 returned error can't find the container with id 99a8cba74020dc162eee19e6f67524d3521b58a18fe6559d16d90cb0544ed980
Dec 03 14:19:41 crc kubenswrapper[4751]: I1203 14:19:41.644094 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-b5n6q" event={"ID":"9968046d-e1f9-4644-811b-45e9638d2ed4","Type":"ContainerStarted","Data":"99a8cba74020dc162eee19e6f67524d3521b58a18fe6559d16d90cb0544ed980"}
Dec 03 14:19:42 crc kubenswrapper[4751]: I1203 14:19:42.652356 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-b5n6q" event={"ID":"9968046d-e1f9-4644-811b-45e9638d2ed4","Type":"ContainerStarted","Data":"445c2aa7fce5022b87e0e312f28fccf4be07e42bda30c532215869c4439b317f"}
Dec 03 14:19:42 crc kubenswrapper[4751]: I1203 14:19:42.652858 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-b5n6q"
Dec 03 14:19:42 crc kubenswrapper[4751]: I1203 14:19:42.675685 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-b5n6q" podStartSLOduration=2.675672243 podStartE2EDuration="2.675672243s" podCreationTimestamp="2025-12-03 14:19:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:19:42.670744847 +0000 UTC m=+389.659100054" watchObservedRunningTime="2025-12-03 14:19:42.675672243 +0000 UTC m=+389.664027460"
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.049568 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cclkx"]
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.049922 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cclkx" podUID="d5bfbffc-4818-4710-9c57-a2a4f298bfe2" containerName="registry-server" containerID="cri-o://d86c516636dfac9f02e5b7e391e4d7d3d166df0e702e4b67bd8fbf8bed4e699b" gracePeriod=30
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.058481 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4brfr"]
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.058761 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4brfr" podUID="0878495c-85e0-469c-bee0-a4f6ce70d873" containerName="registry-server" containerID="cri-o://0aa5cff644d3e7537e9a4348683a7127f6836df79f8641d2cc2e4b7b58bd03f0" gracePeriod=30
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.064895 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4c725"]
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.065129 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-4c725" podUID="1e78e3f2-fa8c-4c8d-8dce-f087e67acf02" containerName="marketplace-operator" containerID="cri-o://ff72ca347a6bdd6dca4c8bf3015a982ee374ece74b229bf189f2317c39c2a6a4" gracePeriod=30
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.071891 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g592f"]
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.072173 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-g592f" podUID="3004e0b4-ba13-417b-88ef-439481ef93f4" containerName="registry-server" containerID="cri-o://0c0b95b7954f47cbce7853213f96159cdbf157bb3be26611e4771f4d0ba36854" gracePeriod=30
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.084131 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sq9k5"]
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.085135 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-sq9k5"
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.098139 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-drt67"]
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.098569 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-drt67" podUID="d6b5fc30-18a3-4b4b-861f-4312c07eaa7b" containerName="registry-server" containerID="cri-o://44f230d8075d5006f3fede773b3971afa2a04cf1dba93f473d1e483a31aa3987" gracePeriod=30
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.101973 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sq9k5"]
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.259167 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/224b9e4a-5a71-4559-84b6-9599c2dfd321-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-sq9k5\" (UID: \"224b9e4a-5a71-4559-84b6-9599c2dfd321\") " pod="openshift-marketplace/marketplace-operator-79b997595-sq9k5"
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.259361 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ml5d\" (UniqueName: \"kubernetes.io/projected/224b9e4a-5a71-4559-84b6-9599c2dfd321-kube-api-access-4ml5d\") pod \"marketplace-operator-79b997595-sq9k5\" (UID: \"224b9e4a-5a71-4559-84b6-9599c2dfd321\") " pod="openshift-marketplace/marketplace-operator-79b997595-sq9k5"
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.259453 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/224b9e4a-5a71-4559-84b6-9599c2dfd321-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-sq9k5\" (UID: \"224b9e4a-5a71-4559-84b6-9599c2dfd321\") " pod="openshift-marketplace/marketplace-operator-79b997595-sq9k5"
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.361320 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/224b9e4a-5a71-4559-84b6-9599c2dfd321-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-sq9k5\" (UID: \"224b9e4a-5a71-4559-84b6-9599c2dfd321\") " pod="openshift-marketplace/marketplace-operator-79b997595-sq9k5"
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.361425 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/224b9e4a-5a71-4559-84b6-9599c2dfd321-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-sq9k5\" (UID: \"224b9e4a-5a71-4559-84b6-9599c2dfd321\") " pod="openshift-marketplace/marketplace-operator-79b997595-sq9k5"
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.361565 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ml5d\" (UniqueName: \"kubernetes.io/projected/224b9e4a-5a71-4559-84b6-9599c2dfd321-kube-api-access-4ml5d\") pod \"marketplace-operator-79b997595-sq9k5\" (UID: \"224b9e4a-5a71-4559-84b6-9599c2dfd321\") " pod="openshift-marketplace/marketplace-operator-79b997595-sq9k5"
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.362453 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/224b9e4a-5a71-4559-84b6-9599c2dfd321-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-sq9k5\" (UID: \"224b9e4a-5a71-4559-84b6-9599c2dfd321\") " pod="openshift-marketplace/marketplace-operator-79b997595-sq9k5"
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.367479 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/224b9e4a-5a71-4559-84b6-9599c2dfd321-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-sq9k5\" (UID: \"224b9e4a-5a71-4559-84b6-9599c2dfd321\") " pod="openshift-marketplace/marketplace-operator-79b997595-sq9k5"
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.377073 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ml5d\" (UniqueName: \"kubernetes.io/projected/224b9e4a-5a71-4559-84b6-9599c2dfd321-kube-api-access-4ml5d\") pod \"marketplace-operator-79b997595-sq9k5\" (UID: \"224b9e4a-5a71-4559-84b6-9599c2dfd321\") " pod="openshift-marketplace/marketplace-operator-79b997595-sq9k5"
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.408676 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-sq9k5"
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.536042 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cclkx"
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.658470 4751 generic.go:334] "Generic (PLEG): container finished" podID="d6b5fc30-18a3-4b4b-861f-4312c07eaa7b" containerID="44f230d8075d5006f3fede773b3971afa2a04cf1dba93f473d1e483a31aa3987" exitCode=0
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.658574 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-drt67" event={"ID":"d6b5fc30-18a3-4b4b-861f-4312c07eaa7b","Type":"ContainerDied","Data":"44f230d8075d5006f3fede773b3971afa2a04cf1dba93f473d1e483a31aa3987"}
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.660780 4751 generic.go:334] "Generic (PLEG): container finished" podID="3004e0b4-ba13-417b-88ef-439481ef93f4" containerID="0c0b95b7954f47cbce7853213f96159cdbf157bb3be26611e4771f4d0ba36854" exitCode=0
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.660803 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g592f" event={"ID":"3004e0b4-ba13-417b-88ef-439481ef93f4","Type":"ContainerDied","Data":"0c0b95b7954f47cbce7853213f96159cdbf157bb3be26611e4771f4d0ba36854"}
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.663239 4751 generic.go:334] "Generic (PLEG): container finished" podID="1e78e3f2-fa8c-4c8d-8dce-f087e67acf02" containerID="ff72ca347a6bdd6dca4c8bf3015a982ee374ece74b229bf189f2317c39c2a6a4" exitCode=0
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.663298 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4c725" event={"ID":"1e78e3f2-fa8c-4c8d-8dce-f087e67acf02","Type":"ContainerDied","Data":"ff72ca347a6bdd6dca4c8bf3015a982ee374ece74b229bf189f2317c39c2a6a4"}
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.663339 4751 scope.go:117] "RemoveContainer" containerID="1a06f5749c9cd888a697aabdd9031e2bdd7b07c914015e34c8f3ca2faef21b16"
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.664663 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sbdm\" (UniqueName: \"kubernetes.io/projected/d5bfbffc-4818-4710-9c57-a2a4f298bfe2-kube-api-access-2sbdm\") pod \"d5bfbffc-4818-4710-9c57-a2a4f298bfe2\" (UID: \"d5bfbffc-4818-4710-9c57-a2a4f298bfe2\") "
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.664762 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5bfbffc-4818-4710-9c57-a2a4f298bfe2-catalog-content\") pod \"d5bfbffc-4818-4710-9c57-a2a4f298bfe2\" (UID: \"d5bfbffc-4818-4710-9c57-a2a4f298bfe2\") "
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.664881 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5bfbffc-4818-4710-9c57-a2a4f298bfe2-utilities\") pod \"d5bfbffc-4818-4710-9c57-a2a4f298bfe2\" (UID: \"d5bfbffc-4818-4710-9c57-a2a4f298bfe2\") "
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.665656 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5bfbffc-4818-4710-9c57-a2a4f298bfe2-utilities" (OuterVolumeSpecName: "utilities") pod "d5bfbffc-4818-4710-9c57-a2a4f298bfe2" (UID: "d5bfbffc-4818-4710-9c57-a2a4f298bfe2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.667349 4751 generic.go:334] "Generic (PLEG): container finished" podID="0878495c-85e0-469c-bee0-a4f6ce70d873" containerID="0aa5cff644d3e7537e9a4348683a7127f6836df79f8641d2cc2e4b7b58bd03f0" exitCode=0
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.667414 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4brfr" event={"ID":"0878495c-85e0-469c-bee0-a4f6ce70d873","Type":"ContainerDied","Data":"0aa5cff644d3e7537e9a4348683a7127f6836df79f8641d2cc2e4b7b58bd03f0"}
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.669785 4751 generic.go:334] "Generic (PLEG): container finished" podID="d5bfbffc-4818-4710-9c57-a2a4f298bfe2" containerID="d86c516636dfac9f02e5b7e391e4d7d3d166df0e702e4b67bd8fbf8bed4e699b" exitCode=0
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.669814 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cclkx" event={"ID":"d5bfbffc-4818-4710-9c57-a2a4f298bfe2","Type":"ContainerDied","Data":"d86c516636dfac9f02e5b7e391e4d7d3d166df0e702e4b67bd8fbf8bed4e699b"}
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.669863 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cclkx"
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.669882 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cclkx" event={"ID":"d5bfbffc-4818-4710-9c57-a2a4f298bfe2","Type":"ContainerDied","Data":"91678aa47dfdc3d655abb9bab46b5368f515ddc01aa027910de2213d8b658f67"}
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.670768 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5bfbffc-4818-4710-9c57-a2a4f298bfe2-kube-api-access-2sbdm" (OuterVolumeSpecName: "kube-api-access-2sbdm") pod "d5bfbffc-4818-4710-9c57-a2a4f298bfe2" (UID: "d5bfbffc-4818-4710-9c57-a2a4f298bfe2"). InnerVolumeSpecName "kube-api-access-2sbdm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.718061 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5bfbffc-4818-4710-9c57-a2a4f298bfe2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d5bfbffc-4818-4710-9c57-a2a4f298bfe2" (UID: "d5bfbffc-4818-4710-9c57-a2a4f298bfe2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.766185 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5bfbffc-4818-4710-9c57-a2a4f298bfe2-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.766226 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sbdm\" (UniqueName: \"kubernetes.io/projected/d5bfbffc-4818-4710-9c57-a2a4f298bfe2-kube-api-access-2sbdm\") on node \"crc\" DevicePath \"\""
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.766240 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5bfbffc-4818-4710-9c57-a2a4f298bfe2-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.769728 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-drt67"
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.810721 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4brfr"
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.811533 4751 scope.go:117] "RemoveContainer" containerID="d86c516636dfac9f02e5b7e391e4d7d3d166df0e702e4b67bd8fbf8bed4e699b"
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.818513 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g592f"
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.820620 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4c725"
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.836374 4751 scope.go:117] "RemoveContainer" containerID="ac674d8edf879c22515ee6eea5966db7420d6dc767df72a351290d941ab273f4"
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.853311 4751 scope.go:117] "RemoveContainer" containerID="116bdd5cda6f9e3b2464a93161e5d7994ec200f11f66e9d59d3c2eb30e9f25ea"
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.866778 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6b5fc30-18a3-4b4b-861f-4312c07eaa7b-catalog-content\") pod \"d6b5fc30-18a3-4b4b-861f-4312c07eaa7b\" (UID: \"d6b5fc30-18a3-4b4b-861f-4312c07eaa7b\") "
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.866850 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb5p5\" (UniqueName: \"kubernetes.io/projected/d6b5fc30-18a3-4b4b-861f-4312c07eaa7b-kube-api-access-sb5p5\") pod \"d6b5fc30-18a3-4b4b-861f-4312c07eaa7b\" (UID: \"d6b5fc30-18a3-4b4b-861f-4312c07eaa7b\") "
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.868166 4751 scope.go:117] "RemoveContainer" containerID="d86c516636dfac9f02e5b7e391e4d7d3d166df0e702e4b67bd8fbf8bed4e699b"
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.869121 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6b5fc30-18a3-4b4b-861f-4312c07eaa7b-utilities\") pod \"d6b5fc30-18a3-4b4b-861f-4312c07eaa7b\" (UID: \"d6b5fc30-18a3-4b4b-861f-4312c07eaa7b\") "
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.869311 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6b5fc30-18a3-4b4b-861f-4312c07eaa7b-utilities" (OuterVolumeSpecName: "utilities") pod "d6b5fc30-18a3-4b4b-861f-4312c07eaa7b" (UID: "d6b5fc30-18a3-4b4b-861f-4312c07eaa7b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.870289 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6b5fc30-18a3-4b4b-861f-4312c07eaa7b-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 14:19:43 crc kubenswrapper[4751]: E1203 14:19:43.870418 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d86c516636dfac9f02e5b7e391e4d7d3d166df0e702e4b67bd8fbf8bed4e699b\": container with ID starting with d86c516636dfac9f02e5b7e391e4d7d3d166df0e702e4b67bd8fbf8bed4e699b not found: ID does not exist" containerID="d86c516636dfac9f02e5b7e391e4d7d3d166df0e702e4b67bd8fbf8bed4e699b"
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.870462 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d86c516636dfac9f02e5b7e391e4d7d3d166df0e702e4b67bd8fbf8bed4e699b"} err="failed to get container status \"d86c516636dfac9f02e5b7e391e4d7d3d166df0e702e4b67bd8fbf8bed4e699b\": rpc error: code = NotFound desc = could not find container \"d86c516636dfac9f02e5b7e391e4d7d3d166df0e702e4b67bd8fbf8bed4e699b\": container with ID starting with d86c516636dfac9f02e5b7e391e4d7d3d166df0e702e4b67bd8fbf8bed4e699b not found: ID does not exist"
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.870487 4751 scope.go:117] "RemoveContainer" containerID="ac674d8edf879c22515ee6eea5966db7420d6dc767df72a351290d941ab273f4"
Dec 03 14:19:43 crc kubenswrapper[4751]: E1203 14:19:43.871071 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac674d8edf879c22515ee6eea5966db7420d6dc767df72a351290d941ab273f4\": container with ID starting with ac674d8edf879c22515ee6eea5966db7420d6dc767df72a351290d941ab273f4 not found: ID does not exist" containerID="ac674d8edf879c22515ee6eea5966db7420d6dc767df72a351290d941ab273f4"
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.871102 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac674d8edf879c22515ee6eea5966db7420d6dc767df72a351290d941ab273f4"} err="failed to get container status \"ac674d8edf879c22515ee6eea5966db7420d6dc767df72a351290d941ab273f4\": rpc error: code = NotFound desc = could not find container \"ac674d8edf879c22515ee6eea5966db7420d6dc767df72a351290d941ab273f4\": container with ID starting with ac674d8edf879c22515ee6eea5966db7420d6dc767df72a351290d941ab273f4 not found: ID does not exist"
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.871127 4751 scope.go:117] "RemoveContainer" containerID="116bdd5cda6f9e3b2464a93161e5d7994ec200f11f66e9d59d3c2eb30e9f25ea"
Dec 03 14:19:43 crc kubenswrapper[4751]: E1203 14:19:43.873716 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"116bdd5cda6f9e3b2464a93161e5d7994ec200f11f66e9d59d3c2eb30e9f25ea\": container with ID starting with 116bdd5cda6f9e3b2464a93161e5d7994ec200f11f66e9d59d3c2eb30e9f25ea not found: ID does not exist" containerID="116bdd5cda6f9e3b2464a93161e5d7994ec200f11f66e9d59d3c2eb30e9f25ea"
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.873759 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"116bdd5cda6f9e3b2464a93161e5d7994ec200f11f66e9d59d3c2eb30e9f25ea"} err="failed to get container status \"116bdd5cda6f9e3b2464a93161e5d7994ec200f11f66e9d59d3c2eb30e9f25ea\": rpc error: code = NotFound desc = could not find container \"116bdd5cda6f9e3b2464a93161e5d7994ec200f11f66e9d59d3c2eb30e9f25ea\": container with ID starting with 116bdd5cda6f9e3b2464a93161e5d7994ec200f11f66e9d59d3c2eb30e9f25ea not found: ID does not exist"
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.877033 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6b5fc30-18a3-4b4b-861f-4312c07eaa7b-kube-api-access-sb5p5" (OuterVolumeSpecName: "kube-api-access-sb5p5") pod "d6b5fc30-18a3-4b4b-861f-4312c07eaa7b" (UID: "d6b5fc30-18a3-4b4b-861f-4312c07eaa7b"). InnerVolumeSpecName "kube-api-access-sb5p5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.970639 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3004e0b4-ba13-417b-88ef-439481ef93f4-catalog-content\") pod \"3004e0b4-ba13-417b-88ef-439481ef93f4\" (UID: \"3004e0b4-ba13-417b-88ef-439481ef93f4\") "
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.970725 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rf8d7\" (UniqueName: \"kubernetes.io/projected/1e78e3f2-fa8c-4c8d-8dce-f087e67acf02-kube-api-access-rf8d7\") pod \"1e78e3f2-fa8c-4c8d-8dce-f087e67acf02\" (UID: \"1e78e3f2-fa8c-4c8d-8dce-f087e67acf02\") "
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.970745 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqzkt\" (UniqueName: \"kubernetes.io/projected/3004e0b4-ba13-417b-88ef-439481ef93f4-kube-api-access-kqzkt\") pod \"3004e0b4-ba13-417b-88ef-439481ef93f4\" (UID: \"3004e0b4-ba13-417b-88ef-439481ef93f4\") "
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.970775 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e78e3f2-fa8c-4c8d-8dce-f087e67acf02-marketplace-trusted-ca\") pod \"1e78e3f2-fa8c-4c8d-8dce-f087e67acf02\" (UID: \"1e78e3f2-fa8c-4c8d-8dce-f087e67acf02\") "
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.971238 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e78e3f2-fa8c-4c8d-8dce-f087e67acf02-marketplace-operator-metrics\") pod \"1e78e3f2-fa8c-4c8d-8dce-f087e67acf02\" (UID: \"1e78e3f2-fa8c-4c8d-8dce-f087e67acf02\") "
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.971318 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3004e0b4-ba13-417b-88ef-439481ef93f4-utilities\") pod \"3004e0b4-ba13-417b-88ef-439481ef93f4\" (UID: \"3004e0b4-ba13-417b-88ef-439481ef93f4\") "
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.971359 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0878495c-85e0-469c-bee0-a4f6ce70d873-catalog-content\") pod \"0878495c-85e0-469c-bee0-a4f6ce70d873\" (UID: \"0878495c-85e0-469c-bee0-a4f6ce70d873\") "
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.971409 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmcn2\" (UniqueName: \"kubernetes.io/projected/0878495c-85e0-469c-bee0-a4f6ce70d873-kube-api-access-lmcn2\") pod \"0878495c-85e0-469c-bee0-a4f6ce70d873\" (UID: \"0878495c-85e0-469c-bee0-a4f6ce70d873\") "
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.971429 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0878495c-85e0-469c-bee0-a4f6ce70d873-utilities\") pod \"0878495c-85e0-469c-bee0-a4f6ce70d873\" (UID: \"0878495c-85e0-469c-bee0-a4f6ce70d873\") "
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.971698 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb5p5\" (UniqueName: \"kubernetes.io/projected/d6b5fc30-18a3-4b4b-861f-4312c07eaa7b-kube-api-access-sb5p5\") on node \"crc\" DevicePath \"\""
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.971748 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e78e3f2-fa8c-4c8d-8dce-f087e67acf02-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "1e78e3f2-fa8c-4c8d-8dce-f087e67acf02" (UID: "1e78e3f2-fa8c-4c8d-8dce-f087e67acf02"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.972103 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3004e0b4-ba13-417b-88ef-439481ef93f4-utilities" (OuterVolumeSpecName: "utilities") pod "3004e0b4-ba13-417b-88ef-439481ef93f4" (UID: "3004e0b4-ba13-417b-88ef-439481ef93f4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.972537 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0878495c-85e0-469c-bee0-a4f6ce70d873-utilities" (OuterVolumeSpecName: "utilities") pod "0878495c-85e0-469c-bee0-a4f6ce70d873" (UID: "0878495c-85e0-469c-bee0-a4f6ce70d873"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.974071 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e78e3f2-fa8c-4c8d-8dce-f087e67acf02-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "1e78e3f2-fa8c-4c8d-8dce-f087e67acf02" (UID: "1e78e3f2-fa8c-4c8d-8dce-f087e67acf02"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.975175 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sq9k5"]
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.975243 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0878495c-85e0-469c-bee0-a4f6ce70d873-kube-api-access-lmcn2" (OuterVolumeSpecName: "kube-api-access-lmcn2") pod "0878495c-85e0-469c-bee0-a4f6ce70d873" (UID: "0878495c-85e0-469c-bee0-a4f6ce70d873"). InnerVolumeSpecName "kube-api-access-lmcn2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.978159 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6b5fc30-18a3-4b4b-861f-4312c07eaa7b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d6b5fc30-18a3-4b4b-861f-4312c07eaa7b" (UID: "d6b5fc30-18a3-4b4b-861f-4312c07eaa7b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.978384 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3004e0b4-ba13-417b-88ef-439481ef93f4-kube-api-access-kqzkt" (OuterVolumeSpecName: "kube-api-access-kqzkt") pod "3004e0b4-ba13-417b-88ef-439481ef93f4" (UID: "3004e0b4-ba13-417b-88ef-439481ef93f4"). InnerVolumeSpecName "kube-api-access-kqzkt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 14:19:43 crc kubenswrapper[4751]: W1203 14:19:43.979497 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod224b9e4a_5a71_4559_84b6_9599c2dfd321.slice/crio-f1ae342bfb7c120f40ca44134a7fa39b94fedec57867ca2465c53bfc60995fae WatchSource:0}: Error finding container f1ae342bfb7c120f40ca44134a7fa39b94fedec57867ca2465c53bfc60995fae: Status 404 returned error can't find the container with id f1ae342bfb7c120f40ca44134a7fa39b94fedec57867ca2465c53bfc60995fae
Dec 03 14:19:43 crc kubenswrapper[4751]: I1203 14:19:43.979837 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e78e3f2-fa8c-4c8d-8dce-f087e67acf02-kube-api-access-rf8d7" (OuterVolumeSpecName: "kube-api-access-rf8d7") pod "1e78e3f2-fa8c-4c8d-8dce-f087e67acf02" (UID: "1e78e3f2-fa8c-4c8d-8dce-f087e67acf02"). InnerVolumeSpecName "kube-api-access-rf8d7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.017207 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3004e0b4-ba13-417b-88ef-439481ef93f4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3004e0b4-ba13-417b-88ef-439481ef93f4" (UID: "3004e0b4-ba13-417b-88ef-439481ef93f4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.051070 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0878495c-85e0-469c-bee0-a4f6ce70d873-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0878495c-85e0-469c-bee0-a4f6ce70d873" (UID: "0878495c-85e0-469c-bee0-a4f6ce70d873"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.063299 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cclkx"] Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.073341 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3004e0b4-ba13-417b-88ef-439481ef93f4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.073369 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rf8d7\" (UniqueName: \"kubernetes.io/projected/1e78e3f2-fa8c-4c8d-8dce-f087e67acf02-kube-api-access-rf8d7\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.073381 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqzkt\" (UniqueName: \"kubernetes.io/projected/3004e0b4-ba13-417b-88ef-439481ef93f4-kube-api-access-kqzkt\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.073397 4751 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e78e3f2-fa8c-4c8d-8dce-f087e67acf02-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.073407 4751 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e78e3f2-fa8c-4c8d-8dce-f087e67acf02-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.073416 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6b5fc30-18a3-4b4b-861f-4312c07eaa7b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.073425 4751 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3004e0b4-ba13-417b-88ef-439481ef93f4-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.073436 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0878495c-85e0-469c-bee0-a4f6ce70d873-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.073444 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmcn2\" (UniqueName: \"kubernetes.io/projected/0878495c-85e0-469c-bee0-a4f6ce70d873-kube-api-access-lmcn2\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.073452 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0878495c-85e0-469c-bee0-a4f6ce70d873-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.073882 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cclkx"] Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.676838 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g592f" event={"ID":"3004e0b4-ba13-417b-88ef-439481ef93f4","Type":"ContainerDied","Data":"603a4cc6178b2cde26d2a43f1654192467625813d3337fe9b8b634dcdb09d6a2"} Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.676900 4751 scope.go:117] "RemoveContainer" containerID="0c0b95b7954f47cbce7853213f96159cdbf157bb3be26611e4771f4d0ba36854" Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.676912 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g592f" Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.678444 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4c725" event={"ID":"1e78e3f2-fa8c-4c8d-8dce-f087e67acf02","Type":"ContainerDied","Data":"fcaa6f8356927f7bd4be6db2ead8568187c44d80bb758d00a066402a14843e2d"} Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.678459 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4c725" Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.680159 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sq9k5" event={"ID":"224b9e4a-5a71-4559-84b6-9599c2dfd321","Type":"ContainerStarted","Data":"89b0ffb4fb2f92ba351016a43987743dd4d1e0e984e67a261ca20ff26f7223df"} Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.680254 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-sq9k5" Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.680269 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sq9k5" event={"ID":"224b9e4a-5a71-4559-84b6-9599c2dfd321","Type":"ContainerStarted","Data":"f1ae342bfb7c120f40ca44134a7fa39b94fedec57867ca2465c53bfc60995fae"} Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.681888 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4brfr" event={"ID":"0878495c-85e0-469c-bee0-a4f6ce70d873","Type":"ContainerDied","Data":"1f5d0977df47123d85185c3245d15b76bf60f8b94a31f2caf48ea634a27e5eb3"} Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.682008 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4brfr" Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.684459 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-sq9k5" Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.688157 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-drt67" event={"ID":"d6b5fc30-18a3-4b4b-861f-4312c07eaa7b","Type":"ContainerDied","Data":"fcca06e807b744e351f5f2373d88344a6ca6ed36dcf9a5e77ff1a68f0ecbdff4"} Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.688259 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-drt67" Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.698291 4751 scope.go:117] "RemoveContainer" containerID="434bc487cb6c6417f8d34f6a784db7216ff45a145769439be76709a28e31e58f" Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.718824 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-sq9k5" podStartSLOduration=1.718797587 podStartE2EDuration="1.718797587s" podCreationTimestamp="2025-12-03 14:19:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:19:44.711141392 +0000 UTC m=+391.699496619" watchObservedRunningTime="2025-12-03 14:19:44.718797587 +0000 UTC m=+391.707152844" Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.730578 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g592f"] Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.735578 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-g592f"] Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.752763 4751 scope.go:117] 
"RemoveContainer" containerID="028de8f73db9fd7a425b4111b9751169d71e1834dbe8b307942569d8a882489a" Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.767569 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4c725"] Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.772968 4751 scope.go:117] "RemoveContainer" containerID="ff72ca347a6bdd6dca4c8bf3015a982ee374ece74b229bf189f2317c39c2a6a4" Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.783512 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4c725"] Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.794241 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-drt67"] Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.796425 4751 scope.go:117] "RemoveContainer" containerID="0aa5cff644d3e7537e9a4348683a7127f6836df79f8641d2cc2e4b7b58bd03f0" Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.802434 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-drt67"] Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.806408 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4brfr"] Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.810102 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4brfr"] Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.817256 4751 scope.go:117] "RemoveContainer" containerID="11a7139d121d1a44e21ec35a81095d27a1d4df0ac908d07c35775b30ff215556" Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.837370 4751 scope.go:117] "RemoveContainer" containerID="d93a8983f54e13cd7a0c70c9f936831a5af51e553c89b71af10c685faeb8e8c0" Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.852727 4751 scope.go:117] "RemoveContainer" 
containerID="44f230d8075d5006f3fede773b3971afa2a04cf1dba93f473d1e483a31aa3987" Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.865745 4751 scope.go:117] "RemoveContainer" containerID="a1174ab7b11b05e34b38f2990fbd355a83fb7bf9fb96f06a5f6e7b170ff15496" Dec 03 14:19:44 crc kubenswrapper[4751]: I1203 14:19:44.879701 4751 scope.go:117] "RemoveContainer" containerID="e48c0612cda77cbbebded1b5ab9b043cc4ccbed631268104f09f53054051cdbb" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.261856 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z8ldt"] Dec 03 14:19:45 crc kubenswrapper[4751]: E1203 14:19:45.262065 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5bfbffc-4818-4710-9c57-a2a4f298bfe2" containerName="extract-content" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.262077 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5bfbffc-4818-4710-9c57-a2a4f298bfe2" containerName="extract-content" Dec 03 14:19:45 crc kubenswrapper[4751]: E1203 14:19:45.262088 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3004e0b4-ba13-417b-88ef-439481ef93f4" containerName="extract-utilities" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.262095 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="3004e0b4-ba13-417b-88ef-439481ef93f4" containerName="extract-utilities" Dec 03 14:19:45 crc kubenswrapper[4751]: E1203 14:19:45.262103 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0878495c-85e0-469c-bee0-a4f6ce70d873" containerName="registry-server" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.262109 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="0878495c-85e0-469c-bee0-a4f6ce70d873" containerName="registry-server" Dec 03 14:19:45 crc kubenswrapper[4751]: E1203 14:19:45.262120 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3004e0b4-ba13-417b-88ef-439481ef93f4" 
containerName="registry-server" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.262125 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="3004e0b4-ba13-417b-88ef-439481ef93f4" containerName="registry-server" Dec 03 14:19:45 crc kubenswrapper[4751]: E1203 14:19:45.262135 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5bfbffc-4818-4710-9c57-a2a4f298bfe2" containerName="extract-utilities" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.262140 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5bfbffc-4818-4710-9c57-a2a4f298bfe2" containerName="extract-utilities" Dec 03 14:19:45 crc kubenswrapper[4751]: E1203 14:19:45.262148 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0878495c-85e0-469c-bee0-a4f6ce70d873" containerName="extract-utilities" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.262155 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="0878495c-85e0-469c-bee0-a4f6ce70d873" containerName="extract-utilities" Dec 03 14:19:45 crc kubenswrapper[4751]: E1203 14:19:45.262166 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e78e3f2-fa8c-4c8d-8dce-f087e67acf02" containerName="marketplace-operator" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.262172 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e78e3f2-fa8c-4c8d-8dce-f087e67acf02" containerName="marketplace-operator" Dec 03 14:19:45 crc kubenswrapper[4751]: E1203 14:19:45.262180 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6b5fc30-18a3-4b4b-861f-4312c07eaa7b" containerName="registry-server" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.262186 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6b5fc30-18a3-4b4b-861f-4312c07eaa7b" containerName="registry-server" Dec 03 14:19:45 crc kubenswrapper[4751]: E1203 14:19:45.262192 4751 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1e78e3f2-fa8c-4c8d-8dce-f087e67acf02" containerName="marketplace-operator" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.262198 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e78e3f2-fa8c-4c8d-8dce-f087e67acf02" containerName="marketplace-operator" Dec 03 14:19:45 crc kubenswrapper[4751]: E1203 14:19:45.262207 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0878495c-85e0-469c-bee0-a4f6ce70d873" containerName="extract-content" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.262212 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="0878495c-85e0-469c-bee0-a4f6ce70d873" containerName="extract-content" Dec 03 14:19:45 crc kubenswrapper[4751]: E1203 14:19:45.262219 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6b5fc30-18a3-4b4b-861f-4312c07eaa7b" containerName="extract-content" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.262225 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6b5fc30-18a3-4b4b-861f-4312c07eaa7b" containerName="extract-content" Dec 03 14:19:45 crc kubenswrapper[4751]: E1203 14:19:45.262233 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6b5fc30-18a3-4b4b-861f-4312c07eaa7b" containerName="extract-utilities" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.262238 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6b5fc30-18a3-4b4b-861f-4312c07eaa7b" containerName="extract-utilities" Dec 03 14:19:45 crc kubenswrapper[4751]: E1203 14:19:45.262246 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5bfbffc-4818-4710-9c57-a2a4f298bfe2" containerName="registry-server" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.262252 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5bfbffc-4818-4710-9c57-a2a4f298bfe2" containerName="registry-server" Dec 03 14:19:45 crc kubenswrapper[4751]: E1203 14:19:45.262258 4751 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3004e0b4-ba13-417b-88ef-439481ef93f4" containerName="extract-content" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.262263 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="3004e0b4-ba13-417b-88ef-439481ef93f4" containerName="extract-content" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.262372 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="0878495c-85e0-469c-bee0-a4f6ce70d873" containerName="registry-server" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.262387 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e78e3f2-fa8c-4c8d-8dce-f087e67acf02" containerName="marketplace-operator" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.262398 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6b5fc30-18a3-4b4b-861f-4312c07eaa7b" containerName="registry-server" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.262407 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="3004e0b4-ba13-417b-88ef-439481ef93f4" containerName="registry-server" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.262415 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5bfbffc-4818-4710-9c57-a2a4f298bfe2" containerName="registry-server" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.262565 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e78e3f2-fa8c-4c8d-8dce-f087e67acf02" containerName="marketplace-operator" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.263221 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z8ldt" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.270142 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.278249 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z8ldt"] Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.320046 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0878495c-85e0-469c-bee0-a4f6ce70d873" path="/var/lib/kubelet/pods/0878495c-85e0-469c-bee0-a4f6ce70d873/volumes" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.320673 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e78e3f2-fa8c-4c8d-8dce-f087e67acf02" path="/var/lib/kubelet/pods/1e78e3f2-fa8c-4c8d-8dce-f087e67acf02/volumes" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.321099 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3004e0b4-ba13-417b-88ef-439481ef93f4" path="/var/lib/kubelet/pods/3004e0b4-ba13-417b-88ef-439481ef93f4/volumes" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.322032 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5bfbffc-4818-4710-9c57-a2a4f298bfe2" path="/var/lib/kubelet/pods/d5bfbffc-4818-4710-9c57-a2a4f298bfe2/volumes" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.322567 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6b5fc30-18a3-4b4b-861f-4312c07eaa7b" path="/var/lib/kubelet/pods/d6b5fc30-18a3-4b4b-861f-4312c07eaa7b/volumes" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.392815 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56xcm\" (UniqueName: \"kubernetes.io/projected/d9f96a5f-adfc-467c-91e7-631517b599a2-kube-api-access-56xcm\") pod 
\"redhat-marketplace-z8ldt\" (UID: \"d9f96a5f-adfc-467c-91e7-631517b599a2\") " pod="openshift-marketplace/redhat-marketplace-z8ldt" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.393973 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9f96a5f-adfc-467c-91e7-631517b599a2-catalog-content\") pod \"redhat-marketplace-z8ldt\" (UID: \"d9f96a5f-adfc-467c-91e7-631517b599a2\") " pod="openshift-marketplace/redhat-marketplace-z8ldt" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.394072 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9f96a5f-adfc-467c-91e7-631517b599a2-utilities\") pod \"redhat-marketplace-z8ldt\" (UID: \"d9f96a5f-adfc-467c-91e7-631517b599a2\") " pod="openshift-marketplace/redhat-marketplace-z8ldt" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.459635 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bjcgn"] Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.460682 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bjcgn" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.463342 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.469757 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bjcgn"] Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.495730 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9f96a5f-adfc-467c-91e7-631517b599a2-utilities\") pod \"redhat-marketplace-z8ldt\" (UID: \"d9f96a5f-adfc-467c-91e7-631517b599a2\") " pod="openshift-marketplace/redhat-marketplace-z8ldt" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.495829 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56xcm\" (UniqueName: \"kubernetes.io/projected/d9f96a5f-adfc-467c-91e7-631517b599a2-kube-api-access-56xcm\") pod \"redhat-marketplace-z8ldt\" (UID: \"d9f96a5f-adfc-467c-91e7-631517b599a2\") " pod="openshift-marketplace/redhat-marketplace-z8ldt" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.495891 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9f96a5f-adfc-467c-91e7-631517b599a2-catalog-content\") pod \"redhat-marketplace-z8ldt\" (UID: \"d9f96a5f-adfc-467c-91e7-631517b599a2\") " pod="openshift-marketplace/redhat-marketplace-z8ldt" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.496269 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9f96a5f-adfc-467c-91e7-631517b599a2-utilities\") pod \"redhat-marketplace-z8ldt\" (UID: \"d9f96a5f-adfc-467c-91e7-631517b599a2\") " pod="openshift-marketplace/redhat-marketplace-z8ldt" Dec 
03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.496299 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9f96a5f-adfc-467c-91e7-631517b599a2-catalog-content\") pod \"redhat-marketplace-z8ldt\" (UID: \"d9f96a5f-adfc-467c-91e7-631517b599a2\") " pod="openshift-marketplace/redhat-marketplace-z8ldt" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.513842 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56xcm\" (UniqueName: \"kubernetes.io/projected/d9f96a5f-adfc-467c-91e7-631517b599a2-kube-api-access-56xcm\") pod \"redhat-marketplace-z8ldt\" (UID: \"d9f96a5f-adfc-467c-91e7-631517b599a2\") " pod="openshift-marketplace/redhat-marketplace-z8ldt" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.588902 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z8ldt" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.596406 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd5bafe9-858d-4112-ae58-8ad005161e3d-utilities\") pod \"redhat-operators-bjcgn\" (UID: \"fd5bafe9-858d-4112-ae58-8ad005161e3d\") " pod="openshift-marketplace/redhat-operators-bjcgn" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.596443 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f4v2\" (UniqueName: \"kubernetes.io/projected/fd5bafe9-858d-4112-ae58-8ad005161e3d-kube-api-access-2f4v2\") pod \"redhat-operators-bjcgn\" (UID: \"fd5bafe9-858d-4112-ae58-8ad005161e3d\") " pod="openshift-marketplace/redhat-operators-bjcgn" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.596466 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/fd5bafe9-858d-4112-ae58-8ad005161e3d-catalog-content\") pod \"redhat-operators-bjcgn\" (UID: \"fd5bafe9-858d-4112-ae58-8ad005161e3d\") " pod="openshift-marketplace/redhat-operators-bjcgn" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.697022 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f4v2\" (UniqueName: \"kubernetes.io/projected/fd5bafe9-858d-4112-ae58-8ad005161e3d-kube-api-access-2f4v2\") pod \"redhat-operators-bjcgn\" (UID: \"fd5bafe9-858d-4112-ae58-8ad005161e3d\") " pod="openshift-marketplace/redhat-operators-bjcgn" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.697270 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd5bafe9-858d-4112-ae58-8ad005161e3d-utilities\") pod \"redhat-operators-bjcgn\" (UID: \"fd5bafe9-858d-4112-ae58-8ad005161e3d\") " pod="openshift-marketplace/redhat-operators-bjcgn" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.697291 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd5bafe9-858d-4112-ae58-8ad005161e3d-catalog-content\") pod \"redhat-operators-bjcgn\" (UID: \"fd5bafe9-858d-4112-ae58-8ad005161e3d\") " pod="openshift-marketplace/redhat-operators-bjcgn" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.697696 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd5bafe9-858d-4112-ae58-8ad005161e3d-catalog-content\") pod \"redhat-operators-bjcgn\" (UID: \"fd5bafe9-858d-4112-ae58-8ad005161e3d\") " pod="openshift-marketplace/redhat-operators-bjcgn" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.697799 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/fd5bafe9-858d-4112-ae58-8ad005161e3d-utilities\") pod \"redhat-operators-bjcgn\" (UID: \"fd5bafe9-858d-4112-ae58-8ad005161e3d\") " pod="openshift-marketplace/redhat-operators-bjcgn" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.715450 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f4v2\" (UniqueName: \"kubernetes.io/projected/fd5bafe9-858d-4112-ae58-8ad005161e3d-kube-api-access-2f4v2\") pod \"redhat-operators-bjcgn\" (UID: \"fd5bafe9-858d-4112-ae58-8ad005161e3d\") " pod="openshift-marketplace/redhat-operators-bjcgn" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.785294 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bjcgn" Dec 03 14:19:45 crc kubenswrapper[4751]: I1203 14:19:45.985620 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z8ldt"] Dec 03 14:19:46 crc kubenswrapper[4751]: I1203 14:19:46.179672 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bjcgn"] Dec 03 14:19:46 crc kubenswrapper[4751]: W1203 14:19:46.184450 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd5bafe9_858d_4112_ae58_8ad005161e3d.slice/crio-efdbe47864758fbcde94dfe42611673f4c041bedd32adc4b2f1d45e4ef943bf9 WatchSource:0}: Error finding container efdbe47864758fbcde94dfe42611673f4c041bedd32adc4b2f1d45e4ef943bf9: Status 404 returned error can't find the container with id efdbe47864758fbcde94dfe42611673f4c041bedd32adc4b2f1d45e4ef943bf9 Dec 03 14:19:46 crc kubenswrapper[4751]: I1203 14:19:46.709084 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z8ldt" event={"ID":"d9f96a5f-adfc-467c-91e7-631517b599a2","Type":"ContainerStarted","Data":"37e484602a3508127bb33315e4ba56d520127b55c8c09ddb6c92fc96dd161693"} 
Dec 03 14:19:46 crc kubenswrapper[4751]: I1203 14:19:46.709484 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z8ldt" event={"ID":"d9f96a5f-adfc-467c-91e7-631517b599a2","Type":"ContainerStarted","Data":"a3a0ee27d4259a649a9abfe793b1ef92666b397d5647220cc0df4dbc310ee78f"} Dec 03 14:19:46 crc kubenswrapper[4751]: I1203 14:19:46.710149 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bjcgn" event={"ID":"fd5bafe9-858d-4112-ae58-8ad005161e3d","Type":"ContainerStarted","Data":"efdbe47864758fbcde94dfe42611673f4c041bedd32adc4b2f1d45e4ef943bf9"} Dec 03 14:19:47 crc kubenswrapper[4751]: I1203 14:19:47.662111 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p6v9f"] Dec 03 14:19:47 crc kubenswrapper[4751]: I1203 14:19:47.663152 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p6v9f" Dec 03 14:19:47 crc kubenswrapper[4751]: I1203 14:19:47.668249 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 03 14:19:47 crc kubenswrapper[4751]: I1203 14:19:47.674081 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p6v9f"] Dec 03 14:19:47 crc kubenswrapper[4751]: I1203 14:19:47.716846 4751 generic.go:334] "Generic (PLEG): container finished" podID="d9f96a5f-adfc-467c-91e7-631517b599a2" containerID="37e484602a3508127bb33315e4ba56d520127b55c8c09ddb6c92fc96dd161693" exitCode=0 Dec 03 14:19:47 crc kubenswrapper[4751]: I1203 14:19:47.717009 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z8ldt" event={"ID":"d9f96a5f-adfc-467c-91e7-631517b599a2","Type":"ContainerDied","Data":"37e484602a3508127bb33315e4ba56d520127b55c8c09ddb6c92fc96dd161693"} Dec 03 14:19:47 crc kubenswrapper[4751]: I1203 
14:19:47.718650 4751 generic.go:334] "Generic (PLEG): container finished" podID="fd5bafe9-858d-4112-ae58-8ad005161e3d" containerID="6c8f685873bf298d99451a57f2eace122f2a7d85d338ad4a26c652d78c819147" exitCode=0 Dec 03 14:19:47 crc kubenswrapper[4751]: I1203 14:19:47.718701 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bjcgn" event={"ID":"fd5bafe9-858d-4112-ae58-8ad005161e3d","Type":"ContainerDied","Data":"6c8f685873bf298d99451a57f2eace122f2a7d85d338ad4a26c652d78c819147"} Dec 03 14:19:47 crc kubenswrapper[4751]: I1203 14:19:47.720054 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78e1107f-c2a3-4dd7-b6f9-af9729fea0a3-catalog-content\") pod \"certified-operators-p6v9f\" (UID: \"78e1107f-c2a3-4dd7-b6f9-af9729fea0a3\") " pod="openshift-marketplace/certified-operators-p6v9f" Dec 03 14:19:47 crc kubenswrapper[4751]: I1203 14:19:47.720199 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78e1107f-c2a3-4dd7-b6f9-af9729fea0a3-utilities\") pod \"certified-operators-p6v9f\" (UID: \"78e1107f-c2a3-4dd7-b6f9-af9729fea0a3\") " pod="openshift-marketplace/certified-operators-p6v9f" Dec 03 14:19:47 crc kubenswrapper[4751]: I1203 14:19:47.721032 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c694q\" (UniqueName: \"kubernetes.io/projected/78e1107f-c2a3-4dd7-b6f9-af9729fea0a3-kube-api-access-c694q\") pod \"certified-operators-p6v9f\" (UID: \"78e1107f-c2a3-4dd7-b6f9-af9729fea0a3\") " pod="openshift-marketplace/certified-operators-p6v9f" Dec 03 14:19:47 crc kubenswrapper[4751]: I1203 14:19:47.822181 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/78e1107f-c2a3-4dd7-b6f9-af9729fea0a3-utilities\") pod \"certified-operators-p6v9f\" (UID: \"78e1107f-c2a3-4dd7-b6f9-af9729fea0a3\") " pod="openshift-marketplace/certified-operators-p6v9f" Dec 03 14:19:47 crc kubenswrapper[4751]: I1203 14:19:47.822280 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c694q\" (UniqueName: \"kubernetes.io/projected/78e1107f-c2a3-4dd7-b6f9-af9729fea0a3-kube-api-access-c694q\") pod \"certified-operators-p6v9f\" (UID: \"78e1107f-c2a3-4dd7-b6f9-af9729fea0a3\") " pod="openshift-marketplace/certified-operators-p6v9f" Dec 03 14:19:47 crc kubenswrapper[4751]: I1203 14:19:47.822350 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78e1107f-c2a3-4dd7-b6f9-af9729fea0a3-catalog-content\") pod \"certified-operators-p6v9f\" (UID: \"78e1107f-c2a3-4dd7-b6f9-af9729fea0a3\") " pod="openshift-marketplace/certified-operators-p6v9f" Dec 03 14:19:47 crc kubenswrapper[4751]: I1203 14:19:47.822784 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78e1107f-c2a3-4dd7-b6f9-af9729fea0a3-utilities\") pod \"certified-operators-p6v9f\" (UID: \"78e1107f-c2a3-4dd7-b6f9-af9729fea0a3\") " pod="openshift-marketplace/certified-operators-p6v9f" Dec 03 14:19:47 crc kubenswrapper[4751]: I1203 14:19:47.822816 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78e1107f-c2a3-4dd7-b6f9-af9729fea0a3-catalog-content\") pod \"certified-operators-p6v9f\" (UID: \"78e1107f-c2a3-4dd7-b6f9-af9729fea0a3\") " pod="openshift-marketplace/certified-operators-p6v9f" Dec 03 14:19:47 crc kubenswrapper[4751]: I1203 14:19:47.844565 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c694q\" (UniqueName: 
\"kubernetes.io/projected/78e1107f-c2a3-4dd7-b6f9-af9729fea0a3-kube-api-access-c694q\") pod \"certified-operators-p6v9f\" (UID: \"78e1107f-c2a3-4dd7-b6f9-af9729fea0a3\") " pod="openshift-marketplace/certified-operators-p6v9f" Dec 03 14:19:47 crc kubenswrapper[4751]: I1203 14:19:47.869121 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dn4vn"] Dec 03 14:19:47 crc kubenswrapper[4751]: I1203 14:19:47.870189 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dn4vn" Dec 03 14:19:47 crc kubenswrapper[4751]: I1203 14:19:47.873674 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 03 14:19:47 crc kubenswrapper[4751]: I1203 14:19:47.876242 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dn4vn"] Dec 03 14:19:47 crc kubenswrapper[4751]: I1203 14:19:47.989036 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p6v9f" Dec 03 14:19:48 crc kubenswrapper[4751]: I1203 14:19:48.025628 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c16a038-8221-4d8e-b455-e02c4be1c751-catalog-content\") pod \"community-operators-dn4vn\" (UID: \"4c16a038-8221-4d8e-b455-e02c4be1c751\") " pod="openshift-marketplace/community-operators-dn4vn" Dec 03 14:19:48 crc kubenswrapper[4751]: I1203 14:19:48.025694 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llpxf\" (UniqueName: \"kubernetes.io/projected/4c16a038-8221-4d8e-b455-e02c4be1c751-kube-api-access-llpxf\") pod \"community-operators-dn4vn\" (UID: \"4c16a038-8221-4d8e-b455-e02c4be1c751\") " pod="openshift-marketplace/community-operators-dn4vn" Dec 03 14:19:48 crc kubenswrapper[4751]: I1203 14:19:48.025775 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c16a038-8221-4d8e-b455-e02c4be1c751-utilities\") pod \"community-operators-dn4vn\" (UID: \"4c16a038-8221-4d8e-b455-e02c4be1c751\") " pod="openshift-marketplace/community-operators-dn4vn" Dec 03 14:19:48 crc kubenswrapper[4751]: I1203 14:19:48.126537 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llpxf\" (UniqueName: \"kubernetes.io/projected/4c16a038-8221-4d8e-b455-e02c4be1c751-kube-api-access-llpxf\") pod \"community-operators-dn4vn\" (UID: \"4c16a038-8221-4d8e-b455-e02c4be1c751\") " pod="openshift-marketplace/community-operators-dn4vn" Dec 03 14:19:48 crc kubenswrapper[4751]: I1203 14:19:48.126754 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c16a038-8221-4d8e-b455-e02c4be1c751-utilities\") pod 
\"community-operators-dn4vn\" (UID: \"4c16a038-8221-4d8e-b455-e02c4be1c751\") " pod="openshift-marketplace/community-operators-dn4vn" Dec 03 14:19:48 crc kubenswrapper[4751]: I1203 14:19:48.126793 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c16a038-8221-4d8e-b455-e02c4be1c751-catalog-content\") pod \"community-operators-dn4vn\" (UID: \"4c16a038-8221-4d8e-b455-e02c4be1c751\") " pod="openshift-marketplace/community-operators-dn4vn" Dec 03 14:19:48 crc kubenswrapper[4751]: I1203 14:19:48.127171 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c16a038-8221-4d8e-b455-e02c4be1c751-catalog-content\") pod \"community-operators-dn4vn\" (UID: \"4c16a038-8221-4d8e-b455-e02c4be1c751\") " pod="openshift-marketplace/community-operators-dn4vn" Dec 03 14:19:48 crc kubenswrapper[4751]: I1203 14:19:48.127261 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c16a038-8221-4d8e-b455-e02c4be1c751-utilities\") pod \"community-operators-dn4vn\" (UID: \"4c16a038-8221-4d8e-b455-e02c4be1c751\") " pod="openshift-marketplace/community-operators-dn4vn" Dec 03 14:19:48 crc kubenswrapper[4751]: I1203 14:19:48.145072 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llpxf\" (UniqueName: \"kubernetes.io/projected/4c16a038-8221-4d8e-b455-e02c4be1c751-kube-api-access-llpxf\") pod \"community-operators-dn4vn\" (UID: \"4c16a038-8221-4d8e-b455-e02c4be1c751\") " pod="openshift-marketplace/community-operators-dn4vn" Dec 03 14:19:48 crc kubenswrapper[4751]: I1203 14:19:48.189755 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dn4vn" Dec 03 14:19:48 crc kubenswrapper[4751]: I1203 14:19:48.364757 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p6v9f"] Dec 03 14:19:48 crc kubenswrapper[4751]: W1203 14:19:48.371490 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78e1107f_c2a3_4dd7_b6f9_af9729fea0a3.slice/crio-f2bdf3e787fb32f3fdf4539bae0a4dc761f526f489fc3a2dabc700daa557e5fd WatchSource:0}: Error finding container f2bdf3e787fb32f3fdf4539bae0a4dc761f526f489fc3a2dabc700daa557e5fd: Status 404 returned error can't find the container with id f2bdf3e787fb32f3fdf4539bae0a4dc761f526f489fc3a2dabc700daa557e5fd Dec 03 14:19:48 crc kubenswrapper[4751]: I1203 14:19:48.567918 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dn4vn"] Dec 03 14:19:48 crc kubenswrapper[4751]: W1203 14:19:48.631653 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c16a038_8221_4d8e_b455_e02c4be1c751.slice/crio-4c4da88f59f10469c4c2e1bb842dec1858bc4976b266cc2a2d2e022f1195b40b WatchSource:0}: Error finding container 4c4da88f59f10469c4c2e1bb842dec1858bc4976b266cc2a2d2e022f1195b40b: Status 404 returned error can't find the container with id 4c4da88f59f10469c4c2e1bb842dec1858bc4976b266cc2a2d2e022f1195b40b Dec 03 14:19:48 crc kubenswrapper[4751]: I1203 14:19:48.726358 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bjcgn" event={"ID":"fd5bafe9-858d-4112-ae58-8ad005161e3d","Type":"ContainerStarted","Data":"192b7598249f6e5d6cbedb98ae47e5f0a2a218ecde67faa9b599d8157ada5dc9"} Dec 03 14:19:48 crc kubenswrapper[4751]: I1203 14:19:48.729786 4751 generic.go:334] "Generic (PLEG): container finished" podID="78e1107f-c2a3-4dd7-b6f9-af9729fea0a3" 
containerID="d2d90d06ea479c5b3e88b13104c199ebcc9ab17b7ddd0b283760ad0864836ad2" exitCode=0 Dec 03 14:19:48 crc kubenswrapper[4751]: I1203 14:19:48.729894 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6v9f" event={"ID":"78e1107f-c2a3-4dd7-b6f9-af9729fea0a3","Type":"ContainerDied","Data":"d2d90d06ea479c5b3e88b13104c199ebcc9ab17b7ddd0b283760ad0864836ad2"} Dec 03 14:19:48 crc kubenswrapper[4751]: I1203 14:19:48.729951 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6v9f" event={"ID":"78e1107f-c2a3-4dd7-b6f9-af9729fea0a3","Type":"ContainerStarted","Data":"f2bdf3e787fb32f3fdf4539bae0a4dc761f526f489fc3a2dabc700daa557e5fd"} Dec 03 14:19:48 crc kubenswrapper[4751]: I1203 14:19:48.739499 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dn4vn" event={"ID":"4c16a038-8221-4d8e-b455-e02c4be1c751","Type":"ContainerStarted","Data":"6dc1cafe897f567b2a16178944379bfcd70cf3b2e94de5c3f0dfe75f17bac03f"} Dec 03 14:19:48 crc kubenswrapper[4751]: I1203 14:19:48.739630 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dn4vn" event={"ID":"4c16a038-8221-4d8e-b455-e02c4be1c751","Type":"ContainerStarted","Data":"4c4da88f59f10469c4c2e1bb842dec1858bc4976b266cc2a2d2e022f1195b40b"} Dec 03 14:19:48 crc kubenswrapper[4751]: I1203 14:19:48.748627 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z8ldt" event={"ID":"d9f96a5f-adfc-467c-91e7-631517b599a2","Type":"ContainerStarted","Data":"bbc384ac42a86f4bcd10f06e82781b479554dd4cf58a0c65ea62ba814e942c54"} Dec 03 14:19:49 crc kubenswrapper[4751]: I1203 14:19:49.757440 4751 generic.go:334] "Generic (PLEG): container finished" podID="d9f96a5f-adfc-467c-91e7-631517b599a2" containerID="bbc384ac42a86f4bcd10f06e82781b479554dd4cf58a0c65ea62ba814e942c54" exitCode=0 Dec 03 14:19:49 crc 
kubenswrapper[4751]: I1203 14:19:49.757568 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z8ldt" event={"ID":"d9f96a5f-adfc-467c-91e7-631517b599a2","Type":"ContainerDied","Data":"bbc384ac42a86f4bcd10f06e82781b479554dd4cf58a0c65ea62ba814e942c54"} Dec 03 14:19:49 crc kubenswrapper[4751]: I1203 14:19:49.762418 4751 generic.go:334] "Generic (PLEG): container finished" podID="fd5bafe9-858d-4112-ae58-8ad005161e3d" containerID="192b7598249f6e5d6cbedb98ae47e5f0a2a218ecde67faa9b599d8157ada5dc9" exitCode=0 Dec 03 14:19:49 crc kubenswrapper[4751]: I1203 14:19:49.762495 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bjcgn" event={"ID":"fd5bafe9-858d-4112-ae58-8ad005161e3d","Type":"ContainerDied","Data":"192b7598249f6e5d6cbedb98ae47e5f0a2a218ecde67faa9b599d8157ada5dc9"} Dec 03 14:19:49 crc kubenswrapper[4751]: I1203 14:19:49.767557 4751 generic.go:334] "Generic (PLEG): container finished" podID="78e1107f-c2a3-4dd7-b6f9-af9729fea0a3" containerID="e6e6532590264550c022d5fe09bb74d62b146f5a499fb7f5e7098959b13ce2a1" exitCode=0 Dec 03 14:19:49 crc kubenswrapper[4751]: I1203 14:19:49.767620 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6v9f" event={"ID":"78e1107f-c2a3-4dd7-b6f9-af9729fea0a3","Type":"ContainerDied","Data":"e6e6532590264550c022d5fe09bb74d62b146f5a499fb7f5e7098959b13ce2a1"} Dec 03 14:19:49 crc kubenswrapper[4751]: I1203 14:19:49.770620 4751 generic.go:334] "Generic (PLEG): container finished" podID="4c16a038-8221-4d8e-b455-e02c4be1c751" containerID="6dc1cafe897f567b2a16178944379bfcd70cf3b2e94de5c3f0dfe75f17bac03f" exitCode=0 Dec 03 14:19:49 crc kubenswrapper[4751]: I1203 14:19:49.770666 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dn4vn" 
event={"ID":"4c16a038-8221-4d8e-b455-e02c4be1c751","Type":"ContainerDied","Data":"6dc1cafe897f567b2a16178944379bfcd70cf3b2e94de5c3f0dfe75f17bac03f"} Dec 03 14:19:50 crc kubenswrapper[4751]: I1203 14:19:50.790431 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z8ldt" event={"ID":"d9f96a5f-adfc-467c-91e7-631517b599a2","Type":"ContainerStarted","Data":"5fa783311de5968b660f0b31e18641f04b407f1195aa7a091feaafc21ce0b105"} Dec 03 14:19:50 crc kubenswrapper[4751]: I1203 14:19:50.811522 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z8ldt" podStartSLOduration=3.291930745 podStartE2EDuration="5.811506619s" podCreationTimestamp="2025-12-03 14:19:45 +0000 UTC" firstStartedPulling="2025-12-03 14:19:47.718729309 +0000 UTC m=+394.707084526" lastFinishedPulling="2025-12-03 14:19:50.238305173 +0000 UTC m=+397.226660400" observedRunningTime="2025-12-03 14:19:50.81040766 +0000 UTC m=+397.798762907" watchObservedRunningTime="2025-12-03 14:19:50.811506619 +0000 UTC m=+397.799861826" Dec 03 14:19:51 crc kubenswrapper[4751]: I1203 14:19:51.798527 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bjcgn" event={"ID":"fd5bafe9-858d-4112-ae58-8ad005161e3d","Type":"ContainerStarted","Data":"106d6a21a4ec3745057b13be9fd01c9fc7f375e5321c4bef9b55a94911592714"} Dec 03 14:19:52 crc kubenswrapper[4751]: I1203 14:19:52.804319 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dn4vn" event={"ID":"4c16a038-8221-4d8e-b455-e02c4be1c751","Type":"ContainerStarted","Data":"b0584105e63a48490d7a14e4deaf70ad46af276ebc13357beb87e47c07b276f6"} Dec 03 14:19:52 crc kubenswrapper[4751]: I1203 14:19:52.808193 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6v9f" 
event={"ID":"78e1107f-c2a3-4dd7-b6f9-af9729fea0a3","Type":"ContainerStarted","Data":"bfad2ed337f65096f5bc8069b30b07feb773f20f27c1d3c40d410dbe3300ab02"} Dec 03 14:19:52 crc kubenswrapper[4751]: I1203 14:19:52.825210 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bjcgn" podStartSLOduration=5.232858838 podStartE2EDuration="7.825188394s" podCreationTimestamp="2025-12-03 14:19:45 +0000 UTC" firstStartedPulling="2025-12-03 14:19:47.720732742 +0000 UTC m=+394.709087959" lastFinishedPulling="2025-12-03 14:19:50.313062298 +0000 UTC m=+397.301417515" observedRunningTime="2025-12-03 14:19:51.821702776 +0000 UTC m=+398.810057983" watchObservedRunningTime="2025-12-03 14:19:52.825188394 +0000 UTC m=+399.813543611" Dec 03 14:19:52 crc kubenswrapper[4751]: I1203 14:19:52.845355 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p6v9f" podStartSLOduration=3.911783615 podStartE2EDuration="5.845337301s" podCreationTimestamp="2025-12-03 14:19:47 +0000 UTC" firstStartedPulling="2025-12-03 14:19:48.740521126 +0000 UTC m=+395.728876343" lastFinishedPulling="2025-12-03 14:19:50.674074812 +0000 UTC m=+397.662430029" observedRunningTime="2025-12-03 14:19:52.84115938 +0000 UTC m=+399.829514597" watchObservedRunningTime="2025-12-03 14:19:52.845337301 +0000 UTC m=+399.833692518" Dec 03 14:19:53 crc kubenswrapper[4751]: I1203 14:19:53.814632 4751 generic.go:334] "Generic (PLEG): container finished" podID="4c16a038-8221-4d8e-b455-e02c4be1c751" containerID="b0584105e63a48490d7a14e4deaf70ad46af276ebc13357beb87e47c07b276f6" exitCode=0 Dec 03 14:19:53 crc kubenswrapper[4751]: I1203 14:19:53.814704 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dn4vn" event={"ID":"4c16a038-8221-4d8e-b455-e02c4be1c751","Type":"ContainerDied","Data":"b0584105e63a48490d7a14e4deaf70ad46af276ebc13357beb87e47c07b276f6"} Dec 03 
14:19:55 crc kubenswrapper[4751]: I1203 14:19:55.590356 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z8ldt" Dec 03 14:19:55 crc kubenswrapper[4751]: I1203 14:19:55.590431 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z8ldt" Dec 03 14:19:55 crc kubenswrapper[4751]: I1203 14:19:55.640750 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z8ldt" Dec 03 14:19:55 crc kubenswrapper[4751]: I1203 14:19:55.786602 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bjcgn" Dec 03 14:19:55 crc kubenswrapper[4751]: I1203 14:19:55.786698 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bjcgn" Dec 03 14:19:55 crc kubenswrapper[4751]: I1203 14:19:55.869111 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z8ldt" Dec 03 14:19:56 crc kubenswrapper[4751]: I1203 14:19:56.820654 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bjcgn" podUID="fd5bafe9-858d-4112-ae58-8ad005161e3d" containerName="registry-server" probeResult="failure" output=< Dec 03 14:19:56 crc kubenswrapper[4751]: timeout: failed to connect service ":50051" within 1s Dec 03 14:19:56 crc kubenswrapper[4751]: > Dec 03 14:19:56 crc kubenswrapper[4751]: I1203 14:19:56.835012 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dn4vn" event={"ID":"4c16a038-8221-4d8e-b455-e02c4be1c751","Type":"ContainerStarted","Data":"b98c0382b27d3e4d18375e5155f6f047dc8e593d8ff27308f6c0d6948626cf74"} Dec 03 14:19:56 crc kubenswrapper[4751]: I1203 14:19:56.851812 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-dn4vn" podStartSLOduration=3.298200742 podStartE2EDuration="9.851794943s" podCreationTimestamp="2025-12-03 14:19:47 +0000 UTC" firstStartedPulling="2025-12-03 14:19:49.772108843 +0000 UTC m=+396.760464060" lastFinishedPulling="2025-12-03 14:19:56.325703044 +0000 UTC m=+403.314058261" observedRunningTime="2025-12-03 14:19:56.849678147 +0000 UTC m=+403.838033384" watchObservedRunningTime="2025-12-03 14:19:56.851794943 +0000 UTC m=+403.840150160" Dec 03 14:19:57 crc kubenswrapper[4751]: I1203 14:19:57.990008 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p6v9f" Dec 03 14:19:57 crc kubenswrapper[4751]: I1203 14:19:57.990341 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p6v9f" Dec 03 14:19:58 crc kubenswrapper[4751]: I1203 14:19:58.029044 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p6v9f" Dec 03 14:19:58 crc kubenswrapper[4751]: I1203 14:19:58.190733 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dn4vn" Dec 03 14:19:58 crc kubenswrapper[4751]: I1203 14:19:58.190788 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dn4vn" Dec 03 14:19:58 crc kubenswrapper[4751]: I1203 14:19:58.225960 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dn4vn" Dec 03 14:19:58 crc kubenswrapper[4751]: I1203 14:19:58.883349 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p6v9f" Dec 03 14:20:01 crc kubenswrapper[4751]: I1203 14:20:01.254611 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-image-registry/image-registry-66df7c8f76-b5n6q" Dec 03 14:20:01 crc kubenswrapper[4751]: I1203 14:20:01.312740 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2hknc"] Dec 03 14:20:05 crc kubenswrapper[4751]: I1203 14:20:05.820072 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:20:05 crc kubenswrapper[4751]: I1203 14:20:05.820374 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:20:05 crc kubenswrapper[4751]: I1203 14:20:05.853887 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bjcgn" Dec 03 14:20:05 crc kubenswrapper[4751]: I1203 14:20:05.907084 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bjcgn" Dec 03 14:20:08 crc kubenswrapper[4751]: I1203 14:20:08.224975 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dn4vn" Dec 03 14:20:26 crc kubenswrapper[4751]: I1203 14:20:26.358025 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" podUID="2086abe6-48e1-4593-9789-b098b9b3142d" containerName="registry" containerID="cri-o://b44b70a3481533fc57c66c73b51ca98789982dc423c05e97e1efc91dbb2e9954" gracePeriod=30 Dec 03 14:20:26 crc kubenswrapper[4751]: I1203 14:20:26.971508 
4751 patch_prober.go:28] interesting pod/image-registry-697d97f7c8-2hknc container/registry namespace/openshift-image-registry: Readiness probe status=failure output="Get \"https://10.217.0.25:5000/healthz\": dial tcp 10.217.0.25:5000: connect: connection refused" start-of-body= Dec 03 14:20:26 crc kubenswrapper[4751]: I1203 14:20:26.971604 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" podUID="2086abe6-48e1-4593-9789-b098b9b3142d" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.25:5000/healthz\": dial tcp 10.217.0.25:5000: connect: connection refused" Dec 03 14:20:27 crc kubenswrapper[4751]: I1203 14:20:27.014279 4751 generic.go:334] "Generic (PLEG): container finished" podID="2086abe6-48e1-4593-9789-b098b9b3142d" containerID="b44b70a3481533fc57c66c73b51ca98789982dc423c05e97e1efc91dbb2e9954" exitCode=0 Dec 03 14:20:27 crc kubenswrapper[4751]: I1203 14:20:27.014639 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" event={"ID":"2086abe6-48e1-4593-9789-b098b9b3142d","Type":"ContainerDied","Data":"b44b70a3481533fc57c66c73b51ca98789982dc423c05e97e1efc91dbb2e9954"} Dec 03 14:20:27 crc kubenswrapper[4751]: I1203 14:20:27.266278 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:20:27 crc kubenswrapper[4751]: I1203 14:20:27.338465 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2086abe6-48e1-4593-9789-b098b9b3142d-bound-sa-token\") pod \"2086abe6-48e1-4593-9789-b098b9b3142d\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " Dec 03 14:20:27 crc kubenswrapper[4751]: I1203 14:20:27.338623 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"2086abe6-48e1-4593-9789-b098b9b3142d\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " Dec 03 14:20:27 crc kubenswrapper[4751]: I1203 14:20:27.338695 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2086abe6-48e1-4593-9789-b098b9b3142d-ca-trust-extracted\") pod \"2086abe6-48e1-4593-9789-b098b9b3142d\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " Dec 03 14:20:27 crc kubenswrapper[4751]: I1203 14:20:27.338744 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2086abe6-48e1-4593-9789-b098b9b3142d-trusted-ca\") pod \"2086abe6-48e1-4593-9789-b098b9b3142d\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " Dec 03 14:20:27 crc kubenswrapper[4751]: I1203 14:20:27.338782 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2086abe6-48e1-4593-9789-b098b9b3142d-registry-certificates\") pod \"2086abe6-48e1-4593-9789-b098b9b3142d\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " Dec 03 14:20:27 crc kubenswrapper[4751]: I1203 14:20:27.338805 4751 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2086abe6-48e1-4593-9789-b098b9b3142d-registry-tls\") pod \"2086abe6-48e1-4593-9789-b098b9b3142d\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " Dec 03 14:20:27 crc kubenswrapper[4751]: I1203 14:20:27.338829 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6w8z\" (UniqueName: \"kubernetes.io/projected/2086abe6-48e1-4593-9789-b098b9b3142d-kube-api-access-p6w8z\") pod \"2086abe6-48e1-4593-9789-b098b9b3142d\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " Dec 03 14:20:27 crc kubenswrapper[4751]: I1203 14:20:27.338854 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2086abe6-48e1-4593-9789-b098b9b3142d-installation-pull-secrets\") pod \"2086abe6-48e1-4593-9789-b098b9b3142d\" (UID: \"2086abe6-48e1-4593-9789-b098b9b3142d\") " Dec 03 14:20:27 crc kubenswrapper[4751]: I1203 14:20:27.339376 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2086abe6-48e1-4593-9789-b098b9b3142d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "2086abe6-48e1-4593-9789-b098b9b3142d" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:20:27 crc kubenswrapper[4751]: I1203 14:20:27.339747 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2086abe6-48e1-4593-9789-b098b9b3142d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "2086abe6-48e1-4593-9789-b098b9b3142d" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:20:27 crc kubenswrapper[4751]: I1203 14:20:27.344220 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2086abe6-48e1-4593-9789-b098b9b3142d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "2086abe6-48e1-4593-9789-b098b9b3142d" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:20:27 crc kubenswrapper[4751]: I1203 14:20:27.344261 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2086abe6-48e1-4593-9789-b098b9b3142d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "2086abe6-48e1-4593-9789-b098b9b3142d" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:20:27 crc kubenswrapper[4751]: I1203 14:20:27.344405 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2086abe6-48e1-4593-9789-b098b9b3142d-kube-api-access-p6w8z" (OuterVolumeSpecName: "kube-api-access-p6w8z") pod "2086abe6-48e1-4593-9789-b098b9b3142d" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d"). InnerVolumeSpecName "kube-api-access-p6w8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:20:27 crc kubenswrapper[4751]: I1203 14:20:27.345144 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2086abe6-48e1-4593-9789-b098b9b3142d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "2086abe6-48e1-4593-9789-b098b9b3142d" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:20:27 crc kubenswrapper[4751]: I1203 14:20:27.346522 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "2086abe6-48e1-4593-9789-b098b9b3142d" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 14:20:27 crc kubenswrapper[4751]: I1203 14:20:27.355112 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2086abe6-48e1-4593-9789-b098b9b3142d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "2086abe6-48e1-4593-9789-b098b9b3142d" (UID: "2086abe6-48e1-4593-9789-b098b9b3142d"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:20:27 crc kubenswrapper[4751]: I1203 14:20:27.440145 4751 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2086abe6-48e1-4593-9789-b098b9b3142d-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 03 14:20:27 crc kubenswrapper[4751]: I1203 14:20:27.440195 4751 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2086abe6-48e1-4593-9789-b098b9b3142d-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:20:27 crc kubenswrapper[4751]: I1203 14:20:27.440204 4751 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2086abe6-48e1-4593-9789-b098b9b3142d-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 03 14:20:27 crc kubenswrapper[4751]: I1203 14:20:27.440217 4751 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/2086abe6-48e1-4593-9789-b098b9b3142d-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 03 14:20:27 crc kubenswrapper[4751]: I1203 14:20:27.440227 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6w8z\" (UniqueName: \"kubernetes.io/projected/2086abe6-48e1-4593-9789-b098b9b3142d-kube-api-access-p6w8z\") on node \"crc\" DevicePath \"\"" Dec 03 14:20:27 crc kubenswrapper[4751]: I1203 14:20:27.440236 4751 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2086abe6-48e1-4593-9789-b098b9b3142d-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 03 14:20:27 crc kubenswrapper[4751]: I1203 14:20:27.440246 4751 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2086abe6-48e1-4593-9789-b098b9b3142d-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 14:20:28 crc kubenswrapper[4751]: I1203 14:20:28.020061 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" event={"ID":"2086abe6-48e1-4593-9789-b098b9b3142d","Type":"ContainerDied","Data":"3aa699ccdb732b58dd927b0a6ab9c956dab41ff2aea1795ae78328b77f06fe8f"} Dec 03 14:20:28 crc kubenswrapper[4751]: I1203 14:20:28.020145 4751 scope.go:117] "RemoveContainer" containerID="b44b70a3481533fc57c66c73b51ca98789982dc423c05e97e1efc91dbb2e9954" Dec 03 14:20:28 crc kubenswrapper[4751]: I1203 14:20:28.020141 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2hknc" Dec 03 14:20:28 crc kubenswrapper[4751]: I1203 14:20:28.046242 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2hknc"] Dec 03 14:20:28 crc kubenswrapper[4751]: I1203 14:20:28.049877 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2hknc"] Dec 03 14:20:29 crc kubenswrapper[4751]: I1203 14:20:29.327215 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2086abe6-48e1-4593-9789-b098b9b3142d" path="/var/lib/kubelet/pods/2086abe6-48e1-4593-9789-b098b9b3142d/volumes" Dec 03 14:20:35 crc kubenswrapper[4751]: I1203 14:20:35.819527 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:20:35 crc kubenswrapper[4751]: I1203 14:20:35.819912 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:20:35 crc kubenswrapper[4751]: I1203 14:20:35.819958 4751 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-djf67" Dec 03 14:20:35 crc kubenswrapper[4751]: I1203 14:20:35.820542 4751 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2297df34a8e0def51a3b8b80a4b2f09fb12dbe5d21891df62cf314ccbd2348b9"} pod="openshift-machine-config-operator/machine-config-daemon-djf67" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 14:20:35 crc kubenswrapper[4751]: I1203 14:20:35.820592 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" containerID="cri-o://2297df34a8e0def51a3b8b80a4b2f09fb12dbe5d21891df62cf314ccbd2348b9" gracePeriod=600 Dec 03 14:20:37 crc kubenswrapper[4751]: I1203 14:20:37.074719 4751 generic.go:334] "Generic (PLEG): container finished" podID="385620eb-744d-423e-b02b-1274f3075689" containerID="2297df34a8e0def51a3b8b80a4b2f09fb12dbe5d21891df62cf314ccbd2348b9" exitCode=0 Dec 03 14:20:37 crc kubenswrapper[4751]: I1203 14:20:37.074763 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerDied","Data":"2297df34a8e0def51a3b8b80a4b2f09fb12dbe5d21891df62cf314ccbd2348b9"} Dec 03 14:20:37 crc kubenswrapper[4751]: I1203 14:20:37.074804 4751 scope.go:117] "RemoveContainer" containerID="0bda661a014e45e2616352737663c676e8da3ba0a150bf9b3633b0bc631b252d" Dec 03 14:20:38 crc kubenswrapper[4751]: I1203 14:20:38.081874 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerStarted","Data":"4b56cbf4f0b4c2218f219ed756acd543426f8d25cb023bb8168954d5d1f9f3a6"} Dec 03 14:23:05 crc kubenswrapper[4751]: I1203 14:23:05.820250 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:23:05 crc kubenswrapper[4751]: I1203 
14:23:05.821051 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:23:35 crc kubenswrapper[4751]: I1203 14:23:35.820982 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:23:35 crc kubenswrapper[4751]: I1203 14:23:35.822964 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:24:05 crc kubenswrapper[4751]: I1203 14:24:05.820514 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:24:05 crc kubenswrapper[4751]: I1203 14:24:05.821009 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:24:05 crc kubenswrapper[4751]: I1203 14:24:05.821056 4751 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-djf67" Dec 03 14:24:05 crc kubenswrapper[4751]: I1203 14:24:05.821641 4751 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4b56cbf4f0b4c2218f219ed756acd543426f8d25cb023bb8168954d5d1f9f3a6"} pod="openshift-machine-config-operator/machine-config-daemon-djf67" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 14:24:05 crc kubenswrapper[4751]: I1203 14:24:05.821695 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" containerID="cri-o://4b56cbf4f0b4c2218f219ed756acd543426f8d25cb023bb8168954d5d1f9f3a6" gracePeriod=600 Dec 03 14:24:06 crc kubenswrapper[4751]: I1203 14:24:06.343072 4751 generic.go:334] "Generic (PLEG): container finished" podID="385620eb-744d-423e-b02b-1274f3075689" containerID="4b56cbf4f0b4c2218f219ed756acd543426f8d25cb023bb8168954d5d1f9f3a6" exitCode=0 Dec 03 14:24:06 crc kubenswrapper[4751]: I1203 14:24:06.343166 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerDied","Data":"4b56cbf4f0b4c2218f219ed756acd543426f8d25cb023bb8168954d5d1f9f3a6"} Dec 03 14:24:06 crc kubenswrapper[4751]: I1203 14:24:06.343436 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerStarted","Data":"022494d9f3dab8e8955cedfdb1fecb645926d841488965605deafc394884f056"} Dec 03 14:24:06 crc kubenswrapper[4751]: I1203 14:24:06.343458 4751 scope.go:117] "RemoveContainer" 
containerID="2297df34a8e0def51a3b8b80a4b2f09fb12dbe5d21891df62cf314ccbd2348b9" Dec 03 14:25:39 crc kubenswrapper[4751]: I1203 14:25:39.972113 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj"] Dec 03 14:25:39 crc kubenswrapper[4751]: E1203 14:25:39.972994 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2086abe6-48e1-4593-9789-b098b9b3142d" containerName="registry" Dec 03 14:25:39 crc kubenswrapper[4751]: I1203 14:25:39.973014 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="2086abe6-48e1-4593-9789-b098b9b3142d" containerName="registry" Dec 03 14:25:39 crc kubenswrapper[4751]: I1203 14:25:39.973176 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="2086abe6-48e1-4593-9789-b098b9b3142d" containerName="registry" Dec 03 14:25:39 crc kubenswrapper[4751]: I1203 14:25:39.974399 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj" Dec 03 14:25:39 crc kubenswrapper[4751]: I1203 14:25:39.977962 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 14:25:39 crc kubenswrapper[4751]: I1203 14:25:39.987317 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj"] Dec 03 14:25:40 crc kubenswrapper[4751]: I1203 14:25:40.077968 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6516bb2-c6eb-464d-a533-03917cbf52e4-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj\" (UID: \"f6516bb2-c6eb-464d-a533-03917cbf52e4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj" Dec 03 14:25:40 crc 
kubenswrapper[4751]: I1203 14:25:40.078031 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gwdl\" (UniqueName: \"kubernetes.io/projected/f6516bb2-c6eb-464d-a533-03917cbf52e4-kube-api-access-6gwdl\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj\" (UID: \"f6516bb2-c6eb-464d-a533-03917cbf52e4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj" Dec 03 14:25:40 crc kubenswrapper[4751]: I1203 14:25:40.078068 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6516bb2-c6eb-464d-a533-03917cbf52e4-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj\" (UID: \"f6516bb2-c6eb-464d-a533-03917cbf52e4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj" Dec 03 14:25:40 crc kubenswrapper[4751]: I1203 14:25:40.179403 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gwdl\" (UniqueName: \"kubernetes.io/projected/f6516bb2-c6eb-464d-a533-03917cbf52e4-kube-api-access-6gwdl\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj\" (UID: \"f6516bb2-c6eb-464d-a533-03917cbf52e4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj" Dec 03 14:25:40 crc kubenswrapper[4751]: I1203 14:25:40.179483 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6516bb2-c6eb-464d-a533-03917cbf52e4-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj\" (UID: \"f6516bb2-c6eb-464d-a533-03917cbf52e4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj" Dec 03 14:25:40 crc kubenswrapper[4751]: I1203 14:25:40.179553 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6516bb2-c6eb-464d-a533-03917cbf52e4-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj\" (UID: \"f6516bb2-c6eb-464d-a533-03917cbf52e4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj" Dec 03 14:25:40 crc kubenswrapper[4751]: I1203 14:25:40.180221 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6516bb2-c6eb-464d-a533-03917cbf52e4-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj\" (UID: \"f6516bb2-c6eb-464d-a533-03917cbf52e4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj" Dec 03 14:25:40 crc kubenswrapper[4751]: I1203 14:25:40.180316 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6516bb2-c6eb-464d-a533-03917cbf52e4-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj\" (UID: \"f6516bb2-c6eb-464d-a533-03917cbf52e4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj" Dec 03 14:25:40 crc kubenswrapper[4751]: I1203 14:25:40.200839 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gwdl\" (UniqueName: \"kubernetes.io/projected/f6516bb2-c6eb-464d-a533-03917cbf52e4-kube-api-access-6gwdl\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj\" (UID: \"f6516bb2-c6eb-464d-a533-03917cbf52e4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj" Dec 03 14:25:40 crc kubenswrapper[4751]: I1203 14:25:40.291862 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj" Dec 03 14:25:40 crc kubenswrapper[4751]: I1203 14:25:40.496513 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj"] Dec 03 14:25:41 crc kubenswrapper[4751]: I1203 14:25:41.001494 4751 generic.go:334] "Generic (PLEG): container finished" podID="f6516bb2-c6eb-464d-a533-03917cbf52e4" containerID="792e758caa89195d156323be5c74cf196b3d221b19f0dadb9bd3e4118153f56b" exitCode=0 Dec 03 14:25:41 crc kubenswrapper[4751]: I1203 14:25:41.001591 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj" event={"ID":"f6516bb2-c6eb-464d-a533-03917cbf52e4","Type":"ContainerDied","Data":"792e758caa89195d156323be5c74cf196b3d221b19f0dadb9bd3e4118153f56b"} Dec 03 14:25:41 crc kubenswrapper[4751]: I1203 14:25:41.001863 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj" event={"ID":"f6516bb2-c6eb-464d-a533-03917cbf52e4","Type":"ContainerStarted","Data":"23eadf12f84a2a14af4a73a74f55d32a63136f23ecb1e27773d4f8365597a108"} Dec 03 14:25:41 crc kubenswrapper[4751]: I1203 14:25:41.003596 4751 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 14:25:42 crc kubenswrapper[4751]: I1203 14:25:42.008996 4751 generic.go:334] "Generic (PLEG): container finished" podID="f6516bb2-c6eb-464d-a533-03917cbf52e4" containerID="02396e22b6981e7cea443c64db5572c3dff0e624c46921f3e13cedc29c1343e5" exitCode=0 Dec 03 14:25:42 crc kubenswrapper[4751]: I1203 14:25:42.009043 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj" 
event={"ID":"f6516bb2-c6eb-464d-a533-03917cbf52e4","Type":"ContainerDied","Data":"02396e22b6981e7cea443c64db5572c3dff0e624c46921f3e13cedc29c1343e5"} Dec 03 14:25:43 crc kubenswrapper[4751]: I1203 14:25:43.015371 4751 generic.go:334] "Generic (PLEG): container finished" podID="f6516bb2-c6eb-464d-a533-03917cbf52e4" containerID="6e209074d9b22439a6fe043f9062215f3a99be363ab9f15375d8b6ba107e9ff1" exitCode=0 Dec 03 14:25:43 crc kubenswrapper[4751]: I1203 14:25:43.015423 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj" event={"ID":"f6516bb2-c6eb-464d-a533-03917cbf52e4","Type":"ContainerDied","Data":"6e209074d9b22439a6fe043f9062215f3a99be363ab9f15375d8b6ba107e9ff1"} Dec 03 14:25:44 crc kubenswrapper[4751]: I1203 14:25:44.293663 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj" Dec 03 14:25:44 crc kubenswrapper[4751]: I1203 14:25:44.341859 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gwdl\" (UniqueName: \"kubernetes.io/projected/f6516bb2-c6eb-464d-a533-03917cbf52e4-kube-api-access-6gwdl\") pod \"f6516bb2-c6eb-464d-a533-03917cbf52e4\" (UID: \"f6516bb2-c6eb-464d-a533-03917cbf52e4\") " Dec 03 14:25:44 crc kubenswrapper[4751]: I1203 14:25:44.341911 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6516bb2-c6eb-464d-a533-03917cbf52e4-util\") pod \"f6516bb2-c6eb-464d-a533-03917cbf52e4\" (UID: \"f6516bb2-c6eb-464d-a533-03917cbf52e4\") " Dec 03 14:25:44 crc kubenswrapper[4751]: I1203 14:25:44.342020 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6516bb2-c6eb-464d-a533-03917cbf52e4-bundle\") pod \"f6516bb2-c6eb-464d-a533-03917cbf52e4\" (UID: 
\"f6516bb2-c6eb-464d-a533-03917cbf52e4\") " Dec 03 14:25:44 crc kubenswrapper[4751]: I1203 14:25:44.344232 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6516bb2-c6eb-464d-a533-03917cbf52e4-bundle" (OuterVolumeSpecName: "bundle") pod "f6516bb2-c6eb-464d-a533-03917cbf52e4" (UID: "f6516bb2-c6eb-464d-a533-03917cbf52e4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:25:44 crc kubenswrapper[4751]: I1203 14:25:44.346884 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6516bb2-c6eb-464d-a533-03917cbf52e4-kube-api-access-6gwdl" (OuterVolumeSpecName: "kube-api-access-6gwdl") pod "f6516bb2-c6eb-464d-a533-03917cbf52e4" (UID: "f6516bb2-c6eb-464d-a533-03917cbf52e4"). InnerVolumeSpecName "kube-api-access-6gwdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:25:44 crc kubenswrapper[4751]: I1203 14:25:44.355804 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6516bb2-c6eb-464d-a533-03917cbf52e4-util" (OuterVolumeSpecName: "util") pod "f6516bb2-c6eb-464d-a533-03917cbf52e4" (UID: "f6516bb2-c6eb-464d-a533-03917cbf52e4"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:25:44 crc kubenswrapper[4751]: I1203 14:25:44.444314 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gwdl\" (UniqueName: \"kubernetes.io/projected/f6516bb2-c6eb-464d-a533-03917cbf52e4-kube-api-access-6gwdl\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:44 crc kubenswrapper[4751]: I1203 14:25:44.444388 4751 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6516bb2-c6eb-464d-a533-03917cbf52e4-util\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:44 crc kubenswrapper[4751]: I1203 14:25:44.444410 4751 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6516bb2-c6eb-464d-a533-03917cbf52e4-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:45 crc kubenswrapper[4751]: I1203 14:25:45.055080 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj" event={"ID":"f6516bb2-c6eb-464d-a533-03917cbf52e4","Type":"ContainerDied","Data":"23eadf12f84a2a14af4a73a74f55d32a63136f23ecb1e27773d4f8365597a108"} Dec 03 14:25:45 crc kubenswrapper[4751]: I1203 14:25:45.055420 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23eadf12f84a2a14af4a73a74f55d32a63136f23ecb1e27773d4f8365597a108" Dec 03 14:25:45 crc kubenswrapper[4751]: I1203 14:25:45.055172 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj" Dec 03 14:25:51 crc kubenswrapper[4751]: I1203 14:25:51.628901 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kwchh"] Dec 03 14:25:51 crc kubenswrapper[4751]: I1203 14:25:51.629786 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="nbdb" containerID="cri-o://42ff7aaf5e6a948fcb50e4c896807b977d2c651eef92011b86b5df32f19fa3d5" gracePeriod=30 Dec 03 14:25:51 crc kubenswrapper[4751]: I1203 14:25:51.630175 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="sbdb" containerID="cri-o://ac293ad67560998b43406e7acd93896d81398996e43a2b7e5df6a4d5794eeb29" gracePeriod=30 Dec 03 14:25:51 crc kubenswrapper[4751]: I1203 14:25:51.630245 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://4b693d6eeadc9bf4af3f115fd8646138f7e25675fb84ae92b97a456e3b5a14cf" gracePeriod=30 Dec 03 14:25:51 crc kubenswrapper[4751]: I1203 14:25:51.630291 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="northd" containerID="cri-o://104d3bfa9a37dce20a737bf9d416395e432196051e96def7ae49d341d2912e3c" gracePeriod=30 Dec 03 14:25:51 crc kubenswrapper[4751]: I1203 14:25:51.630355 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" 
containerName="kube-rbac-proxy-node" containerID="cri-o://8336533a450796683f0bdfccc947c18c05e100d13e7e5a076e4dde52ddbb524a" gracePeriod=30 Dec 03 14:25:51 crc kubenswrapper[4751]: I1203 14:25:51.630433 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="ovn-acl-logging" containerID="cri-o://77fd3bfbc463e6afbd45d112bf46ad8a39f25cd39ebbc0e17dfcd160ca29d7c4" gracePeriod=30 Dec 03 14:25:51 crc kubenswrapper[4751]: I1203 14:25:51.630440 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="ovn-controller" containerID="cri-o://26dadf0f116a114d6755804ab91736019d0d3e047d14515f513eda2148d706d7" gracePeriod=30 Dec 03 14:25:51 crc kubenswrapper[4751]: I1203 14:25:51.692979 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="ovnkube-controller" containerID="cri-o://21804d9d39fcedb6c0cbabe7c8de6d85feba3d2047448ebce8dbf7b1af5efde0" gracePeriod=30 Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.099828 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-98mjq_6a216adb-632d-4134-8c61-61fe6b8c5f71/kube-multus/2.log" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.100347 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-98mjq_6a216adb-632d-4134-8c61-61fe6b8c5f71/kube-multus/1.log" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.100403 4751 generic.go:334] "Generic (PLEG): container finished" podID="6a216adb-632d-4134-8c61-61fe6b8c5f71" containerID="f3ddb4e9890a3a646a504a20509bca72cba34c27e51e9ed333c969148058d81a" exitCode=2 Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.100480 4751 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-98mjq" event={"ID":"6a216adb-632d-4134-8c61-61fe6b8c5f71","Type":"ContainerDied","Data":"f3ddb4e9890a3a646a504a20509bca72cba34c27e51e9ed333c969148058d81a"} Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.100525 4751 scope.go:117] "RemoveContainer" containerID="f26c96913955bb014b1ac71389acea8daeb976964dea908c649f382d5e688801" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.101027 4751 scope.go:117] "RemoveContainer" containerID="f3ddb4e9890a3a646a504a20509bca72cba34c27e51e9ed333c969148058d81a" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.104445 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwchh_a5526cae-f2a4-4094-a08a-fbf69cb11593/ovnkube-controller/3.log" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.111982 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwchh_a5526cae-f2a4-4094-a08a-fbf69cb11593/ovn-acl-logging/0.log" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.113862 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwchh_a5526cae-f2a4-4094-a08a-fbf69cb11593/ovn-controller/0.log" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.115113 4751 generic.go:334] "Generic (PLEG): container finished" podID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerID="21804d9d39fcedb6c0cbabe7c8de6d85feba3d2047448ebce8dbf7b1af5efde0" exitCode=0 Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.115141 4751 generic.go:334] "Generic (PLEG): container finished" podID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerID="ac293ad67560998b43406e7acd93896d81398996e43a2b7e5df6a4d5794eeb29" exitCode=0 Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.115150 4751 generic.go:334] "Generic (PLEG): container finished" podID="a5526cae-f2a4-4094-a08a-fbf69cb11593" 
containerID="42ff7aaf5e6a948fcb50e4c896807b977d2c651eef92011b86b5df32f19fa3d5" exitCode=0 Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.115160 4751 generic.go:334] "Generic (PLEG): container finished" podID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerID="104d3bfa9a37dce20a737bf9d416395e432196051e96def7ae49d341d2912e3c" exitCode=0 Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.115169 4751 generic.go:334] "Generic (PLEG): container finished" podID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerID="4b693d6eeadc9bf4af3f115fd8646138f7e25675fb84ae92b97a456e3b5a14cf" exitCode=0 Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.115176 4751 generic.go:334] "Generic (PLEG): container finished" podID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerID="8336533a450796683f0bdfccc947c18c05e100d13e7e5a076e4dde52ddbb524a" exitCode=0 Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.115183 4751 generic.go:334] "Generic (PLEG): container finished" podID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerID="77fd3bfbc463e6afbd45d112bf46ad8a39f25cd39ebbc0e17dfcd160ca29d7c4" exitCode=143 Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.115198 4751 generic.go:334] "Generic (PLEG): container finished" podID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerID="26dadf0f116a114d6755804ab91736019d0d3e047d14515f513eda2148d706d7" exitCode=143 Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.115219 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" event={"ID":"a5526cae-f2a4-4094-a08a-fbf69cb11593","Type":"ContainerDied","Data":"21804d9d39fcedb6c0cbabe7c8de6d85feba3d2047448ebce8dbf7b1af5efde0"} Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.115250 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" 
event={"ID":"a5526cae-f2a4-4094-a08a-fbf69cb11593","Type":"ContainerDied","Data":"ac293ad67560998b43406e7acd93896d81398996e43a2b7e5df6a4d5794eeb29"} Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.115265 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" event={"ID":"a5526cae-f2a4-4094-a08a-fbf69cb11593","Type":"ContainerDied","Data":"42ff7aaf5e6a948fcb50e4c896807b977d2c651eef92011b86b5df32f19fa3d5"} Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.115284 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" event={"ID":"a5526cae-f2a4-4094-a08a-fbf69cb11593","Type":"ContainerDied","Data":"104d3bfa9a37dce20a737bf9d416395e432196051e96def7ae49d341d2912e3c"} Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.115301 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" event={"ID":"a5526cae-f2a4-4094-a08a-fbf69cb11593","Type":"ContainerDied","Data":"4b693d6eeadc9bf4af3f115fd8646138f7e25675fb84ae92b97a456e3b5a14cf"} Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.115318 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" event={"ID":"a5526cae-f2a4-4094-a08a-fbf69cb11593","Type":"ContainerDied","Data":"8336533a450796683f0bdfccc947c18c05e100d13e7e5a076e4dde52ddbb524a"} Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.115356 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" event={"ID":"a5526cae-f2a4-4094-a08a-fbf69cb11593","Type":"ContainerDied","Data":"77fd3bfbc463e6afbd45d112bf46ad8a39f25cd39ebbc0e17dfcd160ca29d7c4"} Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.115372 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" 
event={"ID":"a5526cae-f2a4-4094-a08a-fbf69cb11593","Type":"ContainerDied","Data":"26dadf0f116a114d6755804ab91736019d0d3e047d14515f513eda2148d706d7"} Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.144423 4751 scope.go:117] "RemoveContainer" containerID="fa18a539c068f5e82a4f3159bd0857ba914bb6213cd1c585b6eebf22aeb7be3f" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.334829 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwchh_a5526cae-f2a4-4094-a08a-fbf69cb11593/ovn-acl-logging/0.log" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.335673 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwchh_a5526cae-f2a4-4094-a08a-fbf69cb11593/ovn-controller/0.log" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.336236 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.409051 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gqmr9"] Dec 03 14:25:52 crc kubenswrapper[4751]: E1203 14:25:52.409515 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6516bb2-c6eb-464d-a533-03917cbf52e4" containerName="extract" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.409590 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6516bb2-c6eb-464d-a533-03917cbf52e4" containerName="extract" Dec 03 14:25:52 crc kubenswrapper[4751]: E1203 14:25:52.409649 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="sbdb" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.409712 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="sbdb" Dec 03 14:25:52 crc kubenswrapper[4751]: E1203 14:25:52.409776 4751 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="northd" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.409842 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="northd" Dec 03 14:25:52 crc kubenswrapper[4751]: E1203 14:25:52.409904 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="ovnkube-controller" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.409957 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="ovnkube-controller" Dec 03 14:25:52 crc kubenswrapper[4751]: E1203 14:25:52.410019 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="kubecfg-setup" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.410102 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="kubecfg-setup" Dec 03 14:25:52 crc kubenswrapper[4751]: E1203 14:25:52.410176 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6516bb2-c6eb-464d-a533-03917cbf52e4" containerName="pull" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.410245 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6516bb2-c6eb-464d-a533-03917cbf52e4" containerName="pull" Dec 03 14:25:52 crc kubenswrapper[4751]: E1203 14:25:52.410316 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="ovn-controller" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.410401 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="ovn-controller" Dec 03 14:25:52 crc kubenswrapper[4751]: E1203 14:25:52.410469 4751 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.410529 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 14:25:52 crc kubenswrapper[4751]: E1203 14:25:52.410593 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="ovnkube-controller" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.410654 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="ovnkube-controller" Dec 03 14:25:52 crc kubenswrapper[4751]: E1203 14:25:52.410720 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="ovnkube-controller" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.410783 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="ovnkube-controller" Dec 03 14:25:52 crc kubenswrapper[4751]: E1203 14:25:52.410847 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6516bb2-c6eb-464d-a533-03917cbf52e4" containerName="util" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.410907 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6516bb2-c6eb-464d-a533-03917cbf52e4" containerName="util" Dec 03 14:25:52 crc kubenswrapper[4751]: E1203 14:25:52.410973 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="nbdb" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.411039 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="nbdb" Dec 03 14:25:52 crc kubenswrapper[4751]: E1203 14:25:52.411111 4751 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="ovn-acl-logging" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.411173 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="ovn-acl-logging" Dec 03 14:25:52 crc kubenswrapper[4751]: E1203 14:25:52.411240 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="kube-rbac-proxy-node" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.412249 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="kube-rbac-proxy-node" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.412556 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="kube-rbac-proxy-node" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.412647 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="ovn-acl-logging" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.412726 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="sbdb" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.412804 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="ovnkube-controller" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.412883 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.412955 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="ovnkube-controller" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.413022 4751 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="northd" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.413090 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="nbdb" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.413158 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6516bb2-c6eb-464d-a533-03917cbf52e4" containerName="extract" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.413236 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="ovn-controller" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.413304 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="ovnkube-controller" Dec 03 14:25:52 crc kubenswrapper[4751]: E1203 14:25:52.413515 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="ovnkube-controller" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.413585 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="ovnkube-controller" Dec 03 14:25:52 crc kubenswrapper[4751]: E1203 14:25:52.413644 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="ovnkube-controller" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.413707 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="ovnkube-controller" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.413883 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="ovnkube-controller" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.413965 4751 
memory_manager.go:354] "RemoveStaleState removing state" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" containerName="ovnkube-controller" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.416041 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.446175 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-run-ovn\") pod \"a5526cae-f2a4-4094-a08a-fbf69cb11593\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.446232 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a5526cae-f2a4-4094-a08a-fbf69cb11593-ovnkube-script-lib\") pod \"a5526cae-f2a4-4094-a08a-fbf69cb11593\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.446262 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-var-lib-cni-networks-ovn-kubernetes\") pod \"a5526cae-f2a4-4094-a08a-fbf69cb11593\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.446258 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "a5526cae-f2a4-4094-a08a-fbf69cb11593" (UID: "a5526cae-f2a4-4094-a08a-fbf69cb11593"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.446302 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-node-log\") pod \"a5526cae-f2a4-4094-a08a-fbf69cb11593\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.446322 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-slash\") pod \"a5526cae-f2a4-4094-a08a-fbf69cb11593\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.446361 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-var-lib-openvswitch\") pod \"a5526cae-f2a4-4094-a08a-fbf69cb11593\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.446378 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-cni-bin\") pod \"a5526cae-f2a4-4094-a08a-fbf69cb11593\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.446398 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbbql\" (UniqueName: \"kubernetes.io/projected/a5526cae-f2a4-4094-a08a-fbf69cb11593-kube-api-access-gbbql\") pod \"a5526cae-f2a4-4094-a08a-fbf69cb11593\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.446411 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-log-socket\") pod \"a5526cae-f2a4-4094-a08a-fbf69cb11593\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.446439 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-systemd-units\") pod \"a5526cae-f2a4-4094-a08a-fbf69cb11593\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.446427 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "a5526cae-f2a4-4094-a08a-fbf69cb11593" (UID: "a5526cae-f2a4-4094-a08a-fbf69cb11593"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.446457 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a5526cae-f2a4-4094-a08a-fbf69cb11593-ovnkube-config\") pod \"a5526cae-f2a4-4094-a08a-fbf69cb11593\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.446539 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-run-systemd\") pod \"a5526cae-f2a4-4094-a08a-fbf69cb11593\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.446583 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/a5526cae-f2a4-4094-a08a-fbf69cb11593-ovn-node-metrics-cert\") pod \"a5526cae-f2a4-4094-a08a-fbf69cb11593\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.446616 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-kubelet\") pod \"a5526cae-f2a4-4094-a08a-fbf69cb11593\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.446641 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-run-ovn-kubernetes\") pod \"a5526cae-f2a4-4094-a08a-fbf69cb11593\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.446673 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a5526cae-f2a4-4094-a08a-fbf69cb11593-env-overrides\") pod \"a5526cae-f2a4-4094-a08a-fbf69cb11593\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.446687 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5526cae-f2a4-4094-a08a-fbf69cb11593-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "a5526cae-f2a4-4094-a08a-fbf69cb11593" (UID: "a5526cae-f2a4-4094-a08a-fbf69cb11593"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.446700 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-etc-openvswitch\") pod \"a5526cae-f2a4-4094-a08a-fbf69cb11593\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.446722 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "a5526cae-f2a4-4094-a08a-fbf69cb11593" (UID: "a5526cae-f2a4-4094-a08a-fbf69cb11593"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.446722 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-run-netns\") pod \"a5526cae-f2a4-4094-a08a-fbf69cb11593\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.446745 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "a5526cae-f2a4-4094-a08a-fbf69cb11593" (UID: "a5526cae-f2a4-4094-a08a-fbf69cb11593"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.446769 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-cni-netd\") pod \"a5526cae-f2a4-4094-a08a-fbf69cb11593\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.446786 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-run-openvswitch\") pod \"a5526cae-f2a4-4094-a08a-fbf69cb11593\" (UID: \"a5526cae-f2a4-4094-a08a-fbf69cb11593\") " Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.446800 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5526cae-f2a4-4094-a08a-fbf69cb11593-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "a5526cae-f2a4-4094-a08a-fbf69cb11593" (UID: "a5526cae-f2a4-4094-a08a-fbf69cb11593"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.446936 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-var-lib-openvswitch\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.446965 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-log-socket\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.446983 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-run-systemd\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.446999 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-host-kubelet\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.447015 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-run-ovn\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.447040 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrgbx\" (UniqueName: \"kubernetes.io/projected/6b82b434-4871-4bd3-bb6b-37abc2d7d838-kube-api-access-xrgbx\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.447104 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-systemd-units\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.447123 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-host-run-ovn-kubernetes\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.447140 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-host-cni-netd\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.447160 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-host-slash\") pod \"ovnkube-node-gqmr9\" (UID: 
\"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.447185 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.447227 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-host-run-netns\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.447252 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6b82b434-4871-4bd3-bb6b-37abc2d7d838-ovnkube-script-lib\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.447271 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6b82b434-4871-4bd3-bb6b-37abc2d7d838-env-overrides\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.447288 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/6b82b434-4871-4bd3-bb6b-37abc2d7d838-ovnkube-config\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.447304 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-etc-openvswitch\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.447353 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-run-openvswitch\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.447374 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-node-log\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.447391 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-host-cni-bin\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.447407 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/6b82b434-4871-4bd3-bb6b-37abc2d7d838-ovn-node-metrics-cert\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.447451 4751 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a5526cae-f2a4-4094-a08a-fbf69cb11593-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.447553 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-node-log" (OuterVolumeSpecName: "node-log") pod "a5526cae-f2a4-4094-a08a-fbf69cb11593" (UID: "a5526cae-f2a4-4094-a08a-fbf69cb11593"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.447563 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-log-socket" (OuterVolumeSpecName: "log-socket") pod "a5526cae-f2a4-4094-a08a-fbf69cb11593" (UID: "a5526cae-f2a4-4094-a08a-fbf69cb11593"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.447589 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-slash" (OuterVolumeSpecName: "host-slash") pod "a5526cae-f2a4-4094-a08a-fbf69cb11593" (UID: "a5526cae-f2a4-4094-a08a-fbf69cb11593"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.447597 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "a5526cae-f2a4-4094-a08a-fbf69cb11593" (UID: "a5526cae-f2a4-4094-a08a-fbf69cb11593"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.447616 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "a5526cae-f2a4-4094-a08a-fbf69cb11593" (UID: "a5526cae-f2a4-4094-a08a-fbf69cb11593"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.447642 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "a5526cae-f2a4-4094-a08a-fbf69cb11593" (UID: "a5526cae-f2a4-4094-a08a-fbf69cb11593"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.447665 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "a5526cae-f2a4-4094-a08a-fbf69cb11593" (UID: "a5526cae-f2a4-4094-a08a-fbf69cb11593"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.447743 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "a5526cae-f2a4-4094-a08a-fbf69cb11593" (UID: "a5526cae-f2a4-4094-a08a-fbf69cb11593"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.447470 4751 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.447857 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "a5526cae-f2a4-4094-a08a-fbf69cb11593" (UID: "a5526cae-f2a4-4094-a08a-fbf69cb11593"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.447903 4751 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.447923 4751 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a5526cae-f2a4-4094-a08a-fbf69cb11593-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.447936 4751 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.447949 4751 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.447989 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "a5526cae-f2a4-4094-a08a-fbf69cb11593" (UID: "a5526cae-f2a4-4094-a08a-fbf69cb11593"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.448289 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5526cae-f2a4-4094-a08a-fbf69cb11593-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "a5526cae-f2a4-4094-a08a-fbf69cb11593" (UID: "a5526cae-f2a4-4094-a08a-fbf69cb11593"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.452606 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5526cae-f2a4-4094-a08a-fbf69cb11593-kube-api-access-gbbql" (OuterVolumeSpecName: "kube-api-access-gbbql") pod "a5526cae-f2a4-4094-a08a-fbf69cb11593" (UID: "a5526cae-f2a4-4094-a08a-fbf69cb11593"). InnerVolumeSpecName "kube-api-access-gbbql". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.456703 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5526cae-f2a4-4094-a08a-fbf69cb11593-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "a5526cae-f2a4-4094-a08a-fbf69cb11593" (UID: "a5526cae-f2a4-4094-a08a-fbf69cb11593"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.462714 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "a5526cae-f2a4-4094-a08a-fbf69cb11593" (UID: "a5526cae-f2a4-4094-a08a-fbf69cb11593"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.548495 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-systemd-units\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.548543 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-host-run-ovn-kubernetes\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.548563 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-host-cni-netd\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.548583 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-host-slash\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.548603 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.548598 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-systemd-units\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.548631 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-host-run-netns\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.548651 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6b82b434-4871-4bd3-bb6b-37abc2d7d838-ovnkube-script-lib\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.548668 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6b82b434-4871-4bd3-bb6b-37abc2d7d838-env-overrides\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.548673 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.548669 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-host-run-ovn-kubernetes\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.548687 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6b82b434-4871-4bd3-bb6b-37abc2d7d838-ovnkube-config\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.548823 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-etc-openvswitch\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.548925 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-run-openvswitch\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.548963 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-node-log\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc 
kubenswrapper[4751]: I1203 14:25:52.549001 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-host-cni-bin\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.549059 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6b82b434-4871-4bd3-bb6b-37abc2d7d838-ovn-node-metrics-cert\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.549115 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-var-lib-openvswitch\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.549154 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-log-socket\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.549185 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-run-systemd\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.549216 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-host-kubelet\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.549248 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-run-ovn\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.549288 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrgbx\" (UniqueName: \"kubernetes.io/projected/6b82b434-4871-4bd3-bb6b-37abc2d7d838-kube-api-access-xrgbx\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.549342 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6b82b434-4871-4bd3-bb6b-37abc2d7d838-ovnkube-script-lib\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.548698 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-host-run-netns\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.548662 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-host-slash\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.549439 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6b82b434-4871-4bd3-bb6b-37abc2d7d838-env-overrides\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.549487 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-run-systemd\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.549500 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6b82b434-4871-4bd3-bb6b-37abc2d7d838-ovnkube-config\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.549515 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-var-lib-openvswitch\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.549544 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-log-socket\") pod \"ovnkube-node-gqmr9\" (UID: 
\"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.549552 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-host-kubelet\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.549580 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-node-log\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.549610 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-run-ovn\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.549610 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-host-cni-netd\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.549625 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-etc-openvswitch\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 
14:25:52.549642 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-run-openvswitch\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.549655 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6b82b434-4871-4bd3-bb6b-37abc2d7d838-host-cni-bin\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.549752 4751 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.549773 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbbql\" (UniqueName: \"kubernetes.io/projected/a5526cae-f2a4-4094-a08a-fbf69cb11593-kube-api-access-gbbql\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.549787 4751 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-log-socket\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.549800 4751 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.549814 4751 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.549827 4751 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a5526cae-f2a4-4094-a08a-fbf69cb11593-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.549844 4751 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.549855 4751 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.549867 4751 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a5526cae-f2a4-4094-a08a-fbf69cb11593-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.549879 4751 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.549890 4751 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.549901 4751 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-cni-netd\") on node \"crc\" 
DevicePath \"\"" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.549913 4751 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-node-log\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.549924 4751 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a5526cae-f2a4-4094-a08a-fbf69cb11593-host-slash\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.552668 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6b82b434-4871-4bd3-bb6b-37abc2d7d838-ovn-node-metrics-cert\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.565066 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrgbx\" (UniqueName: \"kubernetes.io/projected/6b82b434-4871-4bd3-bb6b-37abc2d7d838-kube-api-access-xrgbx\") pod \"ovnkube-node-gqmr9\" (UID: \"6b82b434-4871-4bd3-bb6b-37abc2d7d838\") " pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: I1203 14:25:52.730075 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:52 crc kubenswrapper[4751]: W1203 14:25:52.750366 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b82b434_4871_4bd3_bb6b_37abc2d7d838.slice/crio-e1800e8f0f05a8c369b4bd28ef3b138a49956d4fc371611e6c4f60746f64d9fe WatchSource:0}: Error finding container e1800e8f0f05a8c369b4bd28ef3b138a49956d4fc371611e6c4f60746f64d9fe: Status 404 returned error can't find the container with id e1800e8f0f05a8c369b4bd28ef3b138a49956d4fc371611e6c4f60746f64d9fe Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.125495 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwchh_a5526cae-f2a4-4094-a08a-fbf69cb11593/ovn-acl-logging/0.log" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.126432 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwchh_a5526cae-f2a4-4094-a08a-fbf69cb11593/ovn-controller/0.log" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.127141 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" event={"ID":"a5526cae-f2a4-4094-a08a-fbf69cb11593","Type":"ContainerDied","Data":"22db7d10f75c044e4170cf3156b0cec07a600ead0b28caeb9390c7167bbbcfbc"} Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.127264 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kwchh" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.127283 4751 scope.go:117] "RemoveContainer" containerID="21804d9d39fcedb6c0cbabe7c8de6d85feba3d2047448ebce8dbf7b1af5efde0" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.128833 4751 generic.go:334] "Generic (PLEG): container finished" podID="6b82b434-4871-4bd3-bb6b-37abc2d7d838" containerID="523e7247c7602e1a58382ff17844b69940b71993ed6d07402891a23bb57decff" exitCode=0 Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.128914 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" event={"ID":"6b82b434-4871-4bd3-bb6b-37abc2d7d838","Type":"ContainerDied","Data":"523e7247c7602e1a58382ff17844b69940b71993ed6d07402891a23bb57decff"} Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.128945 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" event={"ID":"6b82b434-4871-4bd3-bb6b-37abc2d7d838","Type":"ContainerStarted","Data":"e1800e8f0f05a8c369b4bd28ef3b138a49956d4fc371611e6c4f60746f64d9fe"} Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.131678 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-98mjq_6a216adb-632d-4134-8c61-61fe6b8c5f71/kube-multus/2.log" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.131727 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-98mjq" event={"ID":"6a216adb-632d-4134-8c61-61fe6b8c5f71","Type":"ContainerStarted","Data":"97e6757c83816f7181261ff4f641c81ae1ddeef72405a601aa9d724601c120ae"} Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.174790 4751 scope.go:117] "RemoveContainer" containerID="ac293ad67560998b43406e7acd93896d81398996e43a2b7e5df6a4d5794eeb29" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.201171 4751 scope.go:117] "RemoveContainer" 
containerID="42ff7aaf5e6a948fcb50e4c896807b977d2c651eef92011b86b5df32f19fa3d5" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.224471 4751 scope.go:117] "RemoveContainer" containerID="104d3bfa9a37dce20a737bf9d416395e432196051e96def7ae49d341d2912e3c" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.235438 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kwchh"] Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.240820 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kwchh"] Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.246412 4751 scope.go:117] "RemoveContainer" containerID="4b693d6eeadc9bf4af3f115fd8646138f7e25675fb84ae92b97a456e3b5a14cf" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.252659 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-96kch"] Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.253262 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-96kch" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.256534 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.256565 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.256735 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-m9wlq" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.264178 4751 scope.go:117] "RemoveContainer" containerID="8336533a450796683f0bdfccc947c18c05e100d13e7e5a076e4dde52ddbb524a" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.277941 4751 scope.go:117] "RemoveContainer" containerID="77fd3bfbc463e6afbd45d112bf46ad8a39f25cd39ebbc0e17dfcd160ca29d7c4" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.290999 4751 scope.go:117] "RemoveContainer" containerID="26dadf0f116a114d6755804ab91736019d0d3e047d14515f513eda2148d706d7" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.313684 4751 scope.go:117] "RemoveContainer" containerID="a83c87f4efec1b57dd6a8832e17f49549104b319b73a17f068de1cedaa3c18c4" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.330705 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5526cae-f2a4-4094-a08a-fbf69cb11593" path="/var/lib/kubelet/pods/a5526cae-f2a4-4094-a08a-fbf69cb11593/volumes" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.360071 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz2v7\" (UniqueName: \"kubernetes.io/projected/2cc6a6aa-cb86-448e-8fae-8e1dc45c89ca-kube-api-access-wz2v7\") pod \"obo-prometheus-operator-668cf9dfbb-96kch\" (UID: \"2cc6a6aa-cb86-448e-8fae-8e1dc45c89ca\") " 
pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-96kch" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.375745 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm"] Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.376398 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.378901 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-2zdg6" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.379187 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.398755 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm"] Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.399348 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.460787 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/01521e70-1366-4e52-9f9a-885522387a0e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm\" (UID: \"01521e70-1366-4e52-9f9a-885522387a0e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.460894 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7e248019-bf73-4c6d-a551-6c62dcf6ec11-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm\" (UID: \"7e248019-bf73-4c6d-a551-6c62dcf6ec11\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.460985 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz2v7\" (UniqueName: \"kubernetes.io/projected/2cc6a6aa-cb86-448e-8fae-8e1dc45c89ca-kube-api-access-wz2v7\") pod \"obo-prometheus-operator-668cf9dfbb-96kch\" (UID: \"2cc6a6aa-cb86-448e-8fae-8e1dc45c89ca\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-96kch" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.461074 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/01521e70-1366-4e52-9f9a-885522387a0e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm\" (UID: \"01521e70-1366-4e52-9f9a-885522387a0e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm" Dec 03 14:25:53 crc 
kubenswrapper[4751]: I1203 14:25:53.461164 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7e248019-bf73-4c6d-a551-6c62dcf6ec11-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm\" (UID: \"7e248019-bf73-4c6d-a551-6c62dcf6ec11\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.484915 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz2v7\" (UniqueName: \"kubernetes.io/projected/2cc6a6aa-cb86-448e-8fae-8e1dc45c89ca-kube-api-access-wz2v7\") pod \"obo-prometheus-operator-668cf9dfbb-96kch\" (UID: \"2cc6a6aa-cb86-448e-8fae-8e1dc45c89ca\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-96kch" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.562222 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/01521e70-1366-4e52-9f9a-885522387a0e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm\" (UID: \"01521e70-1366-4e52-9f9a-885522387a0e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.562273 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7e248019-bf73-4c6d-a551-6c62dcf6ec11-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm\" (UID: \"7e248019-bf73-4c6d-a551-6c62dcf6ec11\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.562336 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/01521e70-1366-4e52-9f9a-885522387a0e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm\" (UID: \"01521e70-1366-4e52-9f9a-885522387a0e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.562368 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7e248019-bf73-4c6d-a551-6c62dcf6ec11-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm\" (UID: \"7e248019-bf73-4c6d-a551-6c62dcf6ec11\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.565962 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7e248019-bf73-4c6d-a551-6c62dcf6ec11-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm\" (UID: \"7e248019-bf73-4c6d-a551-6c62dcf6ec11\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.566252 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7e248019-bf73-4c6d-a551-6c62dcf6ec11-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm\" (UID: \"7e248019-bf73-4c6d-a551-6c62dcf6ec11\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.566365 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/01521e70-1366-4e52-9f9a-885522387a0e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm\" (UID: \"01521e70-1366-4e52-9f9a-885522387a0e\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.566648 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/01521e70-1366-4e52-9f9a-885522387a0e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm\" (UID: \"01521e70-1366-4e52-9f9a-885522387a0e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.572450 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-96kch" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.592718 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-rgl9w"] Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.593549 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-rgl9w" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.596563 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-t2s7x" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.597043 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 03 14:25:53 crc kubenswrapper[4751]: E1203 14:25:53.598935 4751 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-96kch_openshift-operators_2cc6a6aa-cb86-448e-8fae-8e1dc45c89ca_0(d467fd16c7fbd4708284e3a342ad55b4b7cec5e4b8b5617e6952f6b5c641db3f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 03 14:25:53 crc kubenswrapper[4751]: E1203 14:25:53.598983 4751 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-96kch_openshift-operators_2cc6a6aa-cb86-448e-8fae-8e1dc45c89ca_0(d467fd16c7fbd4708284e3a342ad55b4b7cec5e4b8b5617e6952f6b5c641db3f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-96kch" Dec 03 14:25:53 crc kubenswrapper[4751]: E1203 14:25:53.599002 4751 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-96kch_openshift-operators_2cc6a6aa-cb86-448e-8fae-8e1dc45c89ca_0(d467fd16c7fbd4708284e3a342ad55b4b7cec5e4b8b5617e6952f6b5c641db3f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-96kch" Dec 03 14:25:53 crc kubenswrapper[4751]: E1203 14:25:53.599036 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-96kch_openshift-operators(2cc6a6aa-cb86-448e-8fae-8e1dc45c89ca)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-96kch_openshift-operators(2cc6a6aa-cb86-448e-8fae-8e1dc45c89ca)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-96kch_openshift-operators_2cc6a6aa-cb86-448e-8fae-8e1dc45c89ca_0(d467fd16c7fbd4708284e3a342ad55b4b7cec5e4b8b5617e6952f6b5c641db3f): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-96kch" podUID="2cc6a6aa-cb86-448e-8fae-8e1dc45c89ca" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.663891 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t97ql\" (UniqueName: \"kubernetes.io/projected/216e104c-e7e3-4be4-972c-cd524973eaa6-kube-api-access-t97ql\") pod \"observability-operator-d8bb48f5d-rgl9w\" (UID: \"216e104c-e7e3-4be4-972c-cd524973eaa6\") " pod="openshift-operators/observability-operator-d8bb48f5d-rgl9w" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.663940 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/216e104c-e7e3-4be4-972c-cd524973eaa6-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-rgl9w\" (UID: \"216e104c-e7e3-4be4-972c-cd524973eaa6\") " pod="openshift-operators/observability-operator-d8bb48f5d-rgl9w" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.701382 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.726701 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm" Dec 03 14:25:53 crc kubenswrapper[4751]: E1203 14:25:53.733141 4751 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm_openshift-operators_7e248019-bf73-4c6d-a551-6c62dcf6ec11_0(9a70b7da8ca5fa0a6aa9aa753e5a9f5e370e3b94d50b1eb5504e6f0776e3cfe8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 03 14:25:53 crc kubenswrapper[4751]: E1203 14:25:53.733220 4751 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm_openshift-operators_7e248019-bf73-4c6d-a551-6c62dcf6ec11_0(9a70b7da8ca5fa0a6aa9aa753e5a9f5e370e3b94d50b1eb5504e6f0776e3cfe8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm" Dec 03 14:25:53 crc kubenswrapper[4751]: E1203 14:25:53.733247 4751 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm_openshift-operators_7e248019-bf73-4c6d-a551-6c62dcf6ec11_0(9a70b7da8ca5fa0a6aa9aa753e5a9f5e370e3b94d50b1eb5504e6f0776e3cfe8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm" Dec 03 14:25:53 crc kubenswrapper[4751]: E1203 14:25:53.733310 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm_openshift-operators(7e248019-bf73-4c6d-a551-6c62dcf6ec11)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm_openshift-operators(7e248019-bf73-4c6d-a551-6c62dcf6ec11)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm_openshift-operators_7e248019-bf73-4c6d-a551-6c62dcf6ec11_0(9a70b7da8ca5fa0a6aa9aa753e5a9f5e370e3b94d50b1eb5504e6f0776e3cfe8): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm" podUID="7e248019-bf73-4c6d-a551-6c62dcf6ec11" Dec 03 14:25:53 crc kubenswrapper[4751]: E1203 14:25:53.758804 4751 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm_openshift-operators_01521e70-1366-4e52-9f9a-885522387a0e_0(44b074b39a9a3a485f73b25e5cdb3112c657b1b07bf0331029bbd643829b10a1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 14:25:53 crc kubenswrapper[4751]: E1203 14:25:53.758886 4751 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm_openshift-operators_01521e70-1366-4e52-9f9a-885522387a0e_0(44b074b39a9a3a485f73b25e5cdb3112c657b1b07bf0331029bbd643829b10a1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm" Dec 03 14:25:53 crc kubenswrapper[4751]: E1203 14:25:53.758922 4751 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm_openshift-operators_01521e70-1366-4e52-9f9a-885522387a0e_0(44b074b39a9a3a485f73b25e5cdb3112c657b1b07bf0331029bbd643829b10a1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm" Dec 03 14:25:53 crc kubenswrapper[4751]: E1203 14:25:53.759014 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm_openshift-operators(01521e70-1366-4e52-9f9a-885522387a0e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm_openshift-operators(01521e70-1366-4e52-9f9a-885522387a0e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm_openshift-operators_01521e70-1366-4e52-9f9a-885522387a0e_0(44b074b39a9a3a485f73b25e5cdb3112c657b1b07bf0331029bbd643829b10a1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm" podUID="01521e70-1366-4e52-9f9a-885522387a0e" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.765671 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t97ql\" (UniqueName: \"kubernetes.io/projected/216e104c-e7e3-4be4-972c-cd524973eaa6-kube-api-access-t97ql\") pod \"observability-operator-d8bb48f5d-rgl9w\" (UID: \"216e104c-e7e3-4be4-972c-cd524973eaa6\") " pod="openshift-operators/observability-operator-d8bb48f5d-rgl9w" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.765733 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/216e104c-e7e3-4be4-972c-cd524973eaa6-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-rgl9w\" (UID: \"216e104c-e7e3-4be4-972c-cd524973eaa6\") " pod="openshift-operators/observability-operator-d8bb48f5d-rgl9w" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.770208 4751 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/216e104c-e7e3-4be4-972c-cd524973eaa6-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-rgl9w\" (UID: \"216e104c-e7e3-4be4-972c-cd524973eaa6\") " pod="openshift-operators/observability-operator-d8bb48f5d-rgl9w" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.788145 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t97ql\" (UniqueName: \"kubernetes.io/projected/216e104c-e7e3-4be4-972c-cd524973eaa6-kube-api-access-t97ql\") pod \"observability-operator-d8bb48f5d-rgl9w\" (UID: \"216e104c-e7e3-4be4-972c-cd524973eaa6\") " pod="openshift-operators/observability-operator-d8bb48f5d-rgl9w" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.792539 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-hq59z"] Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.793496 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-hq59z" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.796432 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-vmm25" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.867372 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/bb80946e-134f-4baa-b150-6004a9313de9-openshift-service-ca\") pod \"perses-operator-5446b9c989-hq59z\" (UID: \"bb80946e-134f-4baa-b150-6004a9313de9\") " pod="openshift-operators/perses-operator-5446b9c989-hq59z" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.867419 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75dpl\" (UniqueName: \"kubernetes.io/projected/bb80946e-134f-4baa-b150-6004a9313de9-kube-api-access-75dpl\") pod \"perses-operator-5446b9c989-hq59z\" (UID: \"bb80946e-134f-4baa-b150-6004a9313de9\") " pod="openshift-operators/perses-operator-5446b9c989-hq59z" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.910539 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-rgl9w" Dec 03 14:25:53 crc kubenswrapper[4751]: E1203 14:25:53.932744 4751 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-rgl9w_openshift-operators_216e104c-e7e3-4be4-972c-cd524973eaa6_0(5d6b398206eabee2c96d3c6a2f57394f3ed826c186caae2f0491ad13068e2e48): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 03 14:25:53 crc kubenswrapper[4751]: E1203 14:25:53.932814 4751 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-rgl9w_openshift-operators_216e104c-e7e3-4be4-972c-cd524973eaa6_0(5d6b398206eabee2c96d3c6a2f57394f3ed826c186caae2f0491ad13068e2e48): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-rgl9w" Dec 03 14:25:53 crc kubenswrapper[4751]: E1203 14:25:53.932839 4751 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-rgl9w_openshift-operators_216e104c-e7e3-4be4-972c-cd524973eaa6_0(5d6b398206eabee2c96d3c6a2f57394f3ed826c186caae2f0491ad13068e2e48): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-rgl9w" Dec 03 14:25:53 crc kubenswrapper[4751]: E1203 14:25:53.932888 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-rgl9w_openshift-operators(216e104c-e7e3-4be4-972c-cd524973eaa6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-rgl9w_openshift-operators(216e104c-e7e3-4be4-972c-cd524973eaa6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-rgl9w_openshift-operators_216e104c-e7e3-4be4-972c-cd524973eaa6_0(5d6b398206eabee2c96d3c6a2f57394f3ed826c186caae2f0491ad13068e2e48): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-rgl9w" podUID="216e104c-e7e3-4be4-972c-cd524973eaa6" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.968810 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/bb80946e-134f-4baa-b150-6004a9313de9-openshift-service-ca\") pod \"perses-operator-5446b9c989-hq59z\" (UID: \"bb80946e-134f-4baa-b150-6004a9313de9\") " pod="openshift-operators/perses-operator-5446b9c989-hq59z" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.968873 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75dpl\" (UniqueName: \"kubernetes.io/projected/bb80946e-134f-4baa-b150-6004a9313de9-kube-api-access-75dpl\") pod \"perses-operator-5446b9c989-hq59z\" (UID: \"bb80946e-134f-4baa-b150-6004a9313de9\") " pod="openshift-operators/perses-operator-5446b9c989-hq59z" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.970313 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/bb80946e-134f-4baa-b150-6004a9313de9-openshift-service-ca\") pod \"perses-operator-5446b9c989-hq59z\" (UID: \"bb80946e-134f-4baa-b150-6004a9313de9\") " pod="openshift-operators/perses-operator-5446b9c989-hq59z" Dec 03 14:25:53 crc kubenswrapper[4751]: I1203 14:25:53.998061 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75dpl\" (UniqueName: \"kubernetes.io/projected/bb80946e-134f-4baa-b150-6004a9313de9-kube-api-access-75dpl\") pod \"perses-operator-5446b9c989-hq59z\" (UID: \"bb80946e-134f-4baa-b150-6004a9313de9\") " pod="openshift-operators/perses-operator-5446b9c989-hq59z" Dec 03 14:25:54 crc kubenswrapper[4751]: I1203 14:25:54.131654 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-hq59z" Dec 03 14:25:54 crc kubenswrapper[4751]: I1203 14:25:54.142195 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" event={"ID":"6b82b434-4871-4bd3-bb6b-37abc2d7d838","Type":"ContainerStarted","Data":"82249a2b15850f99de80f0a05ba358145bad07955964f58ccd032a45c6f9bb62"} Dec 03 14:25:54 crc kubenswrapper[4751]: I1203 14:25:54.142237 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" event={"ID":"6b82b434-4871-4bd3-bb6b-37abc2d7d838","Type":"ContainerStarted","Data":"0296da4a49ba4b113b25b8277f44fb402c226fbe4f4805111602c5a6f6bca193"} Dec 03 14:25:54 crc kubenswrapper[4751]: I1203 14:25:54.142247 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" event={"ID":"6b82b434-4871-4bd3-bb6b-37abc2d7d838","Type":"ContainerStarted","Data":"a1b8237992eaae05b06998eb14b57db4768668b862878c36c8567a21d9f70efb"} Dec 03 14:25:54 crc kubenswrapper[4751]: I1203 14:25:54.142257 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" event={"ID":"6b82b434-4871-4bd3-bb6b-37abc2d7d838","Type":"ContainerStarted","Data":"3462a1e90277ecedbf91d4f9397b493ed9d7bad7d3869161a9a13c0f7eb88b12"} Dec 03 14:25:54 crc kubenswrapper[4751]: I1203 14:25:54.142279 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" event={"ID":"6b82b434-4871-4bd3-bb6b-37abc2d7d838","Type":"ContainerStarted","Data":"41a1ae0f7acab2587b9d0f5f8939af23f3c236e8ffcd543cca94a85c5a82dfd1"} Dec 03 14:25:54 crc kubenswrapper[4751]: I1203 14:25:54.142290 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" 
event={"ID":"6b82b434-4871-4bd3-bb6b-37abc2d7d838","Type":"ContainerStarted","Data":"a729fddd03ace2ee0796c8ad81dc7c9bb791474f45f2d4d604a15946cf24b8c8"} Dec 03 14:25:54 crc kubenswrapper[4751]: E1203 14:25:54.161549 4751 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-hq59z_openshift-operators_bb80946e-134f-4baa-b150-6004a9313de9_0(0ab0fcc8198f8f9f8791625a499c7cf3e753ce4c1c81db384001b97c16d5e025): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 14:25:54 crc kubenswrapper[4751]: E1203 14:25:54.161627 4751 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-hq59z_openshift-operators_bb80946e-134f-4baa-b150-6004a9313de9_0(0ab0fcc8198f8f9f8791625a499c7cf3e753ce4c1c81db384001b97c16d5e025): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-hq59z" Dec 03 14:25:54 crc kubenswrapper[4751]: E1203 14:25:54.161656 4751 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-hq59z_openshift-operators_bb80946e-134f-4baa-b150-6004a9313de9_0(0ab0fcc8198f8f9f8791625a499c7cf3e753ce4c1c81db384001b97c16d5e025): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5446b9c989-hq59z" Dec 03 14:25:54 crc kubenswrapper[4751]: E1203 14:25:54.161730 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-hq59z_openshift-operators(bb80946e-134f-4baa-b150-6004a9313de9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-hq59z_openshift-operators(bb80946e-134f-4baa-b150-6004a9313de9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-hq59z_openshift-operators_bb80946e-134f-4baa-b150-6004a9313de9_0(0ab0fcc8198f8f9f8791625a499c7cf3e753ce4c1c81db384001b97c16d5e025): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-hq59z" podUID="bb80946e-134f-4baa-b150-6004a9313de9" Dec 03 14:25:54 crc kubenswrapper[4751]: I1203 14:25:54.514410 4751 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 03 14:25:57 crc kubenswrapper[4751]: I1203 14:25:57.166651 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" event={"ID":"6b82b434-4871-4bd3-bb6b-37abc2d7d838","Type":"ContainerStarted","Data":"abb0ae5af7111d46930e425e7a24c593789308c3f7362c44478d3cfddaaeacd7"} Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.182178 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" event={"ID":"6b82b434-4871-4bd3-bb6b-37abc2d7d838","Type":"ContainerStarted","Data":"1c6be77d46e09d3984a54a89258bcaf8ee24a86e2738b92ee0963ce5acb79291"} Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.182713 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.182728 4751 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.182737 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.188071 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bbng9"] Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.189386 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bbng9" Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.219958 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" podStartSLOduration=7.219938882 podStartE2EDuration="7.219938882s" podCreationTimestamp="2025-12-03 14:25:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:25:59.218200135 +0000 UTC m=+766.206555372" watchObservedRunningTime="2025-12-03 14:25:59.219938882 +0000 UTC m=+766.208294099" Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.221362 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.227452 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.227945 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/061de325-d230-4a4a-a0bf-9d7a2e8f6429-catalog-content\") pod \"redhat-marketplace-bbng9\" (UID: \"061de325-d230-4a4a-a0bf-9d7a2e8f6429\") " 
pod="openshift-marketplace/redhat-marketplace-bbng9" Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.228009 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6rww\" (UniqueName: \"kubernetes.io/projected/061de325-d230-4a4a-a0bf-9d7a2e8f6429-kube-api-access-j6rww\") pod \"redhat-marketplace-bbng9\" (UID: \"061de325-d230-4a4a-a0bf-9d7a2e8f6429\") " pod="openshift-marketplace/redhat-marketplace-bbng9" Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.228139 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/061de325-d230-4a4a-a0bf-9d7a2e8f6429-utilities\") pod \"redhat-marketplace-bbng9\" (UID: \"061de325-d230-4a4a-a0bf-9d7a2e8f6429\") " pod="openshift-marketplace/redhat-marketplace-bbng9" Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.330142 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6rww\" (UniqueName: \"kubernetes.io/projected/061de325-d230-4a4a-a0bf-9d7a2e8f6429-kube-api-access-j6rww\") pod \"redhat-marketplace-bbng9\" (UID: \"061de325-d230-4a4a-a0bf-9d7a2e8f6429\") " pod="openshift-marketplace/redhat-marketplace-bbng9" Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.330211 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/061de325-d230-4a4a-a0bf-9d7a2e8f6429-utilities\") pod \"redhat-marketplace-bbng9\" (UID: \"061de325-d230-4a4a-a0bf-9d7a2e8f6429\") " pod="openshift-marketplace/redhat-marketplace-bbng9" Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.330316 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/061de325-d230-4a4a-a0bf-9d7a2e8f6429-catalog-content\") pod \"redhat-marketplace-bbng9\" (UID: 
\"061de325-d230-4a4a-a0bf-9d7a2e8f6429\") " pod="openshift-marketplace/redhat-marketplace-bbng9" Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.330795 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/061de325-d230-4a4a-a0bf-9d7a2e8f6429-catalog-content\") pod \"redhat-marketplace-bbng9\" (UID: \"061de325-d230-4a4a-a0bf-9d7a2e8f6429\") " pod="openshift-marketplace/redhat-marketplace-bbng9" Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.331666 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/061de325-d230-4a4a-a0bf-9d7a2e8f6429-utilities\") pod \"redhat-marketplace-bbng9\" (UID: \"061de325-d230-4a4a-a0bf-9d7a2e8f6429\") " pod="openshift-marketplace/redhat-marketplace-bbng9" Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.350712 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6rww\" (UniqueName: \"kubernetes.io/projected/061de325-d230-4a4a-a0bf-9d7a2e8f6429-kube-api-access-j6rww\") pod \"redhat-marketplace-bbng9\" (UID: \"061de325-d230-4a4a-a0bf-9d7a2e8f6429\") " pod="openshift-marketplace/redhat-marketplace-bbng9" Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.501045 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bbng9"] Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.504941 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm"] Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.505279 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm" Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.505734 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm" Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.508862 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bbng9" Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.514679 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm"] Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.514811 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm" Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.515299 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm" Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.517432 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-rgl9w"] Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.517555 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-rgl9w" Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.517952 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-rgl9w" Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.543471 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-96kch"] Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.543577 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-96kch" Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.543934 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-96kch" Dec 03 14:25:59 crc kubenswrapper[4751]: E1203 14:25:59.560168 4751 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-bbng9_openshift-marketplace_061de325-d230-4a4a-a0bf-9d7a2e8f6429_0(84beb127502431dbee7e61236f5f972ac96fee11ca0877cc5dff0900fb622494): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 14:25:59 crc kubenswrapper[4751]: E1203 14:25:59.560253 4751 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-bbng9_openshift-marketplace_061de325-d230-4a4a-a0bf-9d7a2e8f6429_0(84beb127502431dbee7e61236f5f972ac96fee11ca0877cc5dff0900fb622494): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/redhat-marketplace-bbng9" Dec 03 14:25:59 crc kubenswrapper[4751]: E1203 14:25:59.560273 4751 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-bbng9_openshift-marketplace_061de325-d230-4a4a-a0bf-9d7a2e8f6429_0(84beb127502431dbee7e61236f5f972ac96fee11ca0877cc5dff0900fb622494): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/redhat-marketplace-bbng9" Dec 03 14:25:59 crc kubenswrapper[4751]: E1203 14:25:59.560314 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"redhat-marketplace-bbng9_openshift-marketplace(061de325-d230-4a4a-a0bf-9d7a2e8f6429)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"redhat-marketplace-bbng9_openshift-marketplace(061de325-d230-4a4a-a0bf-9d7a2e8f6429)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-bbng9_openshift-marketplace_061de325-d230-4a4a-a0bf-9d7a2e8f6429_0(84beb127502431dbee7e61236f5f972ac96fee11ca0877cc5dff0900fb622494): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/redhat-marketplace-bbng9" podUID="061de325-d230-4a4a-a0bf-9d7a2e8f6429" Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.560693 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-hq59z"] Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.560832 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-hq59z" Dec 03 14:25:59 crc kubenswrapper[4751]: I1203 14:25:59.578539 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-hq59z" Dec 03 14:25:59 crc kubenswrapper[4751]: E1203 14:25:59.604759 4751 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-rgl9w_openshift-operators_216e104c-e7e3-4be4-972c-cd524973eaa6_0(5b266e33a1856ad070bb49f264a01bd55ffde9ac73da013a48ab08e28f43c483): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 03 14:25:59 crc kubenswrapper[4751]: E1203 14:25:59.604825 4751 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-rgl9w_openshift-operators_216e104c-e7e3-4be4-972c-cd524973eaa6_0(5b266e33a1856ad070bb49f264a01bd55ffde9ac73da013a48ab08e28f43c483): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-rgl9w" Dec 03 14:25:59 crc kubenswrapper[4751]: E1203 14:25:59.604846 4751 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-rgl9w_openshift-operators_216e104c-e7e3-4be4-972c-cd524973eaa6_0(5b266e33a1856ad070bb49f264a01bd55ffde9ac73da013a48ab08e28f43c483): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-rgl9w" Dec 03 14:25:59 crc kubenswrapper[4751]: E1203 14:25:59.604882 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-rgl9w_openshift-operators(216e104c-e7e3-4be4-972c-cd524973eaa6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-rgl9w_openshift-operators(216e104c-e7e3-4be4-972c-cd524973eaa6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-rgl9w_openshift-operators_216e104c-e7e3-4be4-972c-cd524973eaa6_0(5b266e33a1856ad070bb49f264a01bd55ffde9ac73da013a48ab08e28f43c483): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-rgl9w" podUID="216e104c-e7e3-4be4-972c-cd524973eaa6" Dec 03 14:25:59 crc kubenswrapper[4751]: E1203 14:25:59.604773 4751 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm_openshift-operators_01521e70-1366-4e52-9f9a-885522387a0e_0(08bb6132177a1146ec4d3a6f0ac4739bf3b8158c5abbecd87e4363a39424552d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 14:25:59 crc kubenswrapper[4751]: E1203 14:25:59.605038 4751 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm_openshift-operators_01521e70-1366-4e52-9f9a-885522387a0e_0(08bb6132177a1146ec4d3a6f0ac4739bf3b8158c5abbecd87e4363a39424552d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm" Dec 03 14:25:59 crc kubenswrapper[4751]: E1203 14:25:59.605105 4751 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm_openshift-operators_01521e70-1366-4e52-9f9a-885522387a0e_0(08bb6132177a1146ec4d3a6f0ac4739bf3b8158c5abbecd87e4363a39424552d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm" Dec 03 14:25:59 crc kubenswrapper[4751]: E1203 14:25:59.605196 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm_openshift-operators(01521e70-1366-4e52-9f9a-885522387a0e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm_openshift-operators(01521e70-1366-4e52-9f9a-885522387a0e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm_openshift-operators_01521e70-1366-4e52-9f9a-885522387a0e_0(08bb6132177a1146ec4d3a6f0ac4739bf3b8158c5abbecd87e4363a39424552d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm" podUID="01521e70-1366-4e52-9f9a-885522387a0e" Dec 03 14:25:59 crc kubenswrapper[4751]: E1203 14:25:59.664244 4751 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm_openshift-operators_7e248019-bf73-4c6d-a551-6c62dcf6ec11_0(cc40b89bb43a276c3fbf03e7f928d195c2c566892ab58a8cf35b201409b90ef6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 14:25:59 crc kubenswrapper[4751]: E1203 14:25:59.664335 4751 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm_openshift-operators_7e248019-bf73-4c6d-a551-6c62dcf6ec11_0(cc40b89bb43a276c3fbf03e7f928d195c2c566892ab58a8cf35b201409b90ef6): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm" Dec 03 14:25:59 crc kubenswrapper[4751]: E1203 14:25:59.664362 4751 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm_openshift-operators_7e248019-bf73-4c6d-a551-6c62dcf6ec11_0(cc40b89bb43a276c3fbf03e7f928d195c2c566892ab58a8cf35b201409b90ef6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm" Dec 03 14:25:59 crc kubenswrapper[4751]: E1203 14:25:59.664428 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm_openshift-operators(7e248019-bf73-4c6d-a551-6c62dcf6ec11)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm_openshift-operators(7e248019-bf73-4c6d-a551-6c62dcf6ec11)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm_openshift-operators_7e248019-bf73-4c6d-a551-6c62dcf6ec11_0(cc40b89bb43a276c3fbf03e7f928d195c2c566892ab58a8cf35b201409b90ef6): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm" podUID="7e248019-bf73-4c6d-a551-6c62dcf6ec11" Dec 03 14:25:59 crc kubenswrapper[4751]: E1203 14:25:59.687516 4751 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-96kch_openshift-operators_2cc6a6aa-cb86-448e-8fae-8e1dc45c89ca_0(04803a11e1d760ba2d91b4617a54995111bee3805580e22aa8ca6eff0c2710d8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 14:25:59 crc kubenswrapper[4751]: E1203 14:25:59.687570 4751 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-96kch_openshift-operators_2cc6a6aa-cb86-448e-8fae-8e1dc45c89ca_0(04803a11e1d760ba2d91b4617a54995111bee3805580e22aa8ca6eff0c2710d8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-96kch" Dec 03 14:25:59 crc kubenswrapper[4751]: E1203 14:25:59.687589 4751 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-96kch_openshift-operators_2cc6a6aa-cb86-448e-8fae-8e1dc45c89ca_0(04803a11e1d760ba2d91b4617a54995111bee3805580e22aa8ca6eff0c2710d8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-96kch" Dec 03 14:25:59 crc kubenswrapper[4751]: E1203 14:25:59.687627 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-96kch_openshift-operators(2cc6a6aa-cb86-448e-8fae-8e1dc45c89ca)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-96kch_openshift-operators(2cc6a6aa-cb86-448e-8fae-8e1dc45c89ca)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-96kch_openshift-operators_2cc6a6aa-cb86-448e-8fae-8e1dc45c89ca_0(04803a11e1d760ba2d91b4617a54995111bee3805580e22aa8ca6eff0c2710d8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-96kch" podUID="2cc6a6aa-cb86-448e-8fae-8e1dc45c89ca" Dec 03 14:25:59 crc kubenswrapper[4751]: E1203 14:25:59.695649 4751 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-hq59z_openshift-operators_bb80946e-134f-4baa-b150-6004a9313de9_0(ac4a08b162e507c71c20506ac314084b4cf4fc9fd2eedaca1ef4f343e71f0986): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 14:25:59 crc kubenswrapper[4751]: E1203 14:25:59.695725 4751 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-hq59z_openshift-operators_bb80946e-134f-4baa-b150-6004a9313de9_0(ac4a08b162e507c71c20506ac314084b4cf4fc9fd2eedaca1ef4f343e71f0986): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5446b9c989-hq59z" Dec 03 14:25:59 crc kubenswrapper[4751]: E1203 14:25:59.695762 4751 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-hq59z_openshift-operators_bb80946e-134f-4baa-b150-6004a9313de9_0(ac4a08b162e507c71c20506ac314084b4cf4fc9fd2eedaca1ef4f343e71f0986): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-hq59z" Dec 03 14:25:59 crc kubenswrapper[4751]: E1203 14:25:59.695800 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-hq59z_openshift-operators(bb80946e-134f-4baa-b150-6004a9313de9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-hq59z_openshift-operators(bb80946e-134f-4baa-b150-6004a9313de9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-hq59z_openshift-operators_bb80946e-134f-4baa-b150-6004a9313de9_0(ac4a08b162e507c71c20506ac314084b4cf4fc9fd2eedaca1ef4f343e71f0986): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-hq59z" podUID="bb80946e-134f-4baa-b150-6004a9313de9" Dec 03 14:26:00 crc kubenswrapper[4751]: I1203 14:26:00.187709 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bbng9" Dec 03 14:26:00 crc kubenswrapper[4751]: I1203 14:26:00.188988 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bbng9" Dec 03 14:26:00 crc kubenswrapper[4751]: E1203 14:26:00.210240 4751 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-bbng9_openshift-marketplace_061de325-d230-4a4a-a0bf-9d7a2e8f6429_0(f1e76f3672a68bb1fa6bc8c06dcc436ebce170b00c4e99b1d50300a9d15fb9db): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 14:26:00 crc kubenswrapper[4751]: E1203 14:26:00.210308 4751 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-bbng9_openshift-marketplace_061de325-d230-4a4a-a0bf-9d7a2e8f6429_0(f1e76f3672a68bb1fa6bc8c06dcc436ebce170b00c4e99b1d50300a9d15fb9db): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/redhat-marketplace-bbng9" Dec 03 14:26:00 crc kubenswrapper[4751]: E1203 14:26:00.210343 4751 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-bbng9_openshift-marketplace_061de325-d230-4a4a-a0bf-9d7a2e8f6429_0(f1e76f3672a68bb1fa6bc8c06dcc436ebce170b00c4e99b1d50300a9d15fb9db): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/redhat-marketplace-bbng9" Dec 03 14:26:00 crc kubenswrapper[4751]: E1203 14:26:00.210388 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"redhat-marketplace-bbng9_openshift-marketplace(061de325-d230-4a4a-a0bf-9d7a2e8f6429)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"redhat-marketplace-bbng9_openshift-marketplace(061de325-d230-4a4a-a0bf-9d7a2e8f6429)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-bbng9_openshift-marketplace_061de325-d230-4a4a-a0bf-9d7a2e8f6429_0(f1e76f3672a68bb1fa6bc8c06dcc436ebce170b00c4e99b1d50300a9d15fb9db): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/redhat-marketplace-bbng9" podUID="061de325-d230-4a4a-a0bf-9d7a2e8f6429" Dec 03 14:26:02 crc kubenswrapper[4751]: I1203 14:26:02.571845 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fxk9t"] Dec 03 14:26:02 crc kubenswrapper[4751]: I1203 14:26:02.573140 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fxk9t" Dec 03 14:26:02 crc kubenswrapper[4751]: I1203 14:26:02.584816 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fxk9t"] Dec 03 14:26:02 crc kubenswrapper[4751]: I1203 14:26:02.589722 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6cdc\" (UniqueName: \"kubernetes.io/projected/a21a0b76-6f04-47c5-8d3b-f8231398b3d5-kube-api-access-g6cdc\") pod \"certified-operators-fxk9t\" (UID: \"a21a0b76-6f04-47c5-8d3b-f8231398b3d5\") " pod="openshift-marketplace/certified-operators-fxk9t" Dec 03 14:26:02 crc kubenswrapper[4751]: I1203 14:26:02.589791 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a21a0b76-6f04-47c5-8d3b-f8231398b3d5-catalog-content\") pod \"certified-operators-fxk9t\" (UID: \"a21a0b76-6f04-47c5-8d3b-f8231398b3d5\") " pod="openshift-marketplace/certified-operators-fxk9t" Dec 03 14:26:02 crc kubenswrapper[4751]: I1203 14:26:02.589868 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a21a0b76-6f04-47c5-8d3b-f8231398b3d5-utilities\") pod \"certified-operators-fxk9t\" (UID: \"a21a0b76-6f04-47c5-8d3b-f8231398b3d5\") " pod="openshift-marketplace/certified-operators-fxk9t" Dec 03 14:26:02 crc kubenswrapper[4751]: I1203 14:26:02.691201 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6cdc\" (UniqueName: \"kubernetes.io/projected/a21a0b76-6f04-47c5-8d3b-f8231398b3d5-kube-api-access-g6cdc\") pod \"certified-operators-fxk9t\" (UID: \"a21a0b76-6f04-47c5-8d3b-f8231398b3d5\") " pod="openshift-marketplace/certified-operators-fxk9t" Dec 03 14:26:02 crc kubenswrapper[4751]: I1203 14:26:02.691263 4751 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a21a0b76-6f04-47c5-8d3b-f8231398b3d5-catalog-content\") pod \"certified-operators-fxk9t\" (UID: \"a21a0b76-6f04-47c5-8d3b-f8231398b3d5\") " pod="openshift-marketplace/certified-operators-fxk9t" Dec 03 14:26:02 crc kubenswrapper[4751]: I1203 14:26:02.691369 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a21a0b76-6f04-47c5-8d3b-f8231398b3d5-utilities\") pod \"certified-operators-fxk9t\" (UID: \"a21a0b76-6f04-47c5-8d3b-f8231398b3d5\") " pod="openshift-marketplace/certified-operators-fxk9t" Dec 03 14:26:02 crc kubenswrapper[4751]: I1203 14:26:02.691770 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a21a0b76-6f04-47c5-8d3b-f8231398b3d5-catalog-content\") pod \"certified-operators-fxk9t\" (UID: \"a21a0b76-6f04-47c5-8d3b-f8231398b3d5\") " pod="openshift-marketplace/certified-operators-fxk9t" Dec 03 14:26:02 crc kubenswrapper[4751]: I1203 14:26:02.691874 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a21a0b76-6f04-47c5-8d3b-f8231398b3d5-utilities\") pod \"certified-operators-fxk9t\" (UID: \"a21a0b76-6f04-47c5-8d3b-f8231398b3d5\") " pod="openshift-marketplace/certified-operators-fxk9t" Dec 03 14:26:02 crc kubenswrapper[4751]: I1203 14:26:02.711295 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6cdc\" (UniqueName: \"kubernetes.io/projected/a21a0b76-6f04-47c5-8d3b-f8231398b3d5-kube-api-access-g6cdc\") pod \"certified-operators-fxk9t\" (UID: \"a21a0b76-6f04-47c5-8d3b-f8231398b3d5\") " pod="openshift-marketplace/certified-operators-fxk9t" Dec 03 14:26:02 crc kubenswrapper[4751]: I1203 14:26:02.898511 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fxk9t" Dec 03 14:26:03 crc kubenswrapper[4751]: I1203 14:26:03.123145 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fxk9t"] Dec 03 14:26:03 crc kubenswrapper[4751]: W1203 14:26:03.127232 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda21a0b76_6f04_47c5_8d3b_f8231398b3d5.slice/crio-607178022ef6eac119584cf0426ca56f725e76a494bcba135f2a502386c1eb31 WatchSource:0}: Error finding container 607178022ef6eac119584cf0426ca56f725e76a494bcba135f2a502386c1eb31: Status 404 returned error can't find the container with id 607178022ef6eac119584cf0426ca56f725e76a494bcba135f2a502386c1eb31 Dec 03 14:26:03 crc kubenswrapper[4751]: I1203 14:26:03.202876 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxk9t" event={"ID":"a21a0b76-6f04-47c5-8d3b-f8231398b3d5","Type":"ContainerStarted","Data":"607178022ef6eac119584cf0426ca56f725e76a494bcba135f2a502386c1eb31"} Dec 03 14:26:05 crc kubenswrapper[4751]: I1203 14:26:05.216114 4751 generic.go:334] "Generic (PLEG): container finished" podID="a21a0b76-6f04-47c5-8d3b-f8231398b3d5" containerID="8e5dd255e47b36c57c5f3af1b94d737d91c35ee1c4a263b09178d16056ae4797" exitCode=0 Dec 03 14:26:05 crc kubenswrapper[4751]: I1203 14:26:05.216178 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxk9t" event={"ID":"a21a0b76-6f04-47c5-8d3b-f8231398b3d5","Type":"ContainerDied","Data":"8e5dd255e47b36c57c5f3af1b94d737d91c35ee1c4a263b09178d16056ae4797"} Dec 03 14:26:06 crc kubenswrapper[4751]: I1203 14:26:06.223343 4751 generic.go:334] "Generic (PLEG): container finished" podID="a21a0b76-6f04-47c5-8d3b-f8231398b3d5" containerID="11978c01cf942cdfbe2b2ada2133fad500b5432cd877081a09d044f319fdb660" exitCode=0 Dec 03 14:26:06 crc kubenswrapper[4751]: I1203 
14:26:06.223409 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxk9t" event={"ID":"a21a0b76-6f04-47c5-8d3b-f8231398b3d5","Type":"ContainerDied","Data":"11978c01cf942cdfbe2b2ada2133fad500b5432cd877081a09d044f319fdb660"} Dec 03 14:26:07 crc kubenswrapper[4751]: I1203 14:26:07.230774 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxk9t" event={"ID":"a21a0b76-6f04-47c5-8d3b-f8231398b3d5","Type":"ContainerStarted","Data":"758b60b965a15d91987ad3b534b8263b6eb7c4ec794a4c9f086dbdde03786907"} Dec 03 14:26:07 crc kubenswrapper[4751]: I1203 14:26:07.254885 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fxk9t" podStartSLOduration=3.769256447 podStartE2EDuration="5.254860162s" podCreationTimestamp="2025-12-03 14:26:02 +0000 UTC" firstStartedPulling="2025-12-03 14:26:05.218174163 +0000 UTC m=+772.206529400" lastFinishedPulling="2025-12-03 14:26:06.703777898 +0000 UTC m=+773.692133115" observedRunningTime="2025-12-03 14:26:07.251020028 +0000 UTC m=+774.239375255" watchObservedRunningTime="2025-12-03 14:26:07.254860162 +0000 UTC m=+774.243215379" Dec 03 14:26:10 crc kubenswrapper[4751]: I1203 14:26:10.313897 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-96kch" Dec 03 14:26:10 crc kubenswrapper[4751]: I1203 14:26:10.314421 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-96kch" Dec 03 14:26:10 crc kubenswrapper[4751]: I1203 14:26:10.584110 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-96kch"] Dec 03 14:26:11 crc kubenswrapper[4751]: I1203 14:26:11.252318 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-96kch" event={"ID":"2cc6a6aa-cb86-448e-8fae-8e1dc45c89ca","Type":"ContainerStarted","Data":"7826429cf3f8259bcf2ede1bd564448a29f7d6fcf988ee18241e819c13322e11"} Dec 03 14:26:11 crc kubenswrapper[4751]: I1203 14:26:11.313454 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm" Dec 03 14:26:11 crc kubenswrapper[4751]: I1203 14:26:11.313546 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm" Dec 03 14:26:11 crc kubenswrapper[4751]: I1203 14:26:11.314085 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm" Dec 03 14:26:11 crc kubenswrapper[4751]: I1203 14:26:11.314512 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm" Dec 03 14:26:11 crc kubenswrapper[4751]: I1203 14:26:11.581299 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm"] Dec 03 14:26:11 crc kubenswrapper[4751]: I1203 14:26:11.628287 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm"] Dec 03 14:26:12 crc kubenswrapper[4751]: I1203 14:26:12.259533 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm" event={"ID":"7e248019-bf73-4c6d-a551-6c62dcf6ec11","Type":"ContainerStarted","Data":"ce71b780fe7febeeb2b2be6863709ac5733c2a525febe624af3689692ab217e7"} Dec 03 14:26:12 crc kubenswrapper[4751]: I1203 14:26:12.261105 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm" event={"ID":"01521e70-1366-4e52-9f9a-885522387a0e","Type":"ContainerStarted","Data":"3bcb4fa02196163c49bbc543e6b0065fee7d40da9e27a559d74dbb79f43e2632"} Dec 03 14:26:12 crc kubenswrapper[4751]: I1203 14:26:12.313441 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-rgl9w" Dec 03 14:26:12 crc kubenswrapper[4751]: I1203 14:26:12.313465 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bbng9" Dec 03 14:26:12 crc kubenswrapper[4751]: I1203 14:26:12.313859 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-rgl9w" Dec 03 14:26:12 crc kubenswrapper[4751]: I1203 14:26:12.314026 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bbng9" Dec 03 14:26:12 crc kubenswrapper[4751]: I1203 14:26:12.532868 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-rgl9w"] Dec 03 14:26:12 crc kubenswrapper[4751]: I1203 14:26:12.578407 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bbng9"] Dec 03 14:26:12 crc kubenswrapper[4751]: W1203 14:26:12.581046 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod061de325_d230_4a4a_a0bf_9d7a2e8f6429.slice/crio-1819e149cbd6a7a104d07b360852e6bea32100092531ed2223652411e7ba9f23 WatchSource:0}: Error finding container 1819e149cbd6a7a104d07b360852e6bea32100092531ed2223652411e7ba9f23: Status 404 returned error can't find the container with id 1819e149cbd6a7a104d07b360852e6bea32100092531ed2223652411e7ba9f23 Dec 03 14:26:12 crc kubenswrapper[4751]: I1203 14:26:12.899642 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fxk9t" Dec 03 14:26:12 crc kubenswrapper[4751]: I1203 14:26:12.899943 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fxk9t" Dec 03 14:26:12 crc kubenswrapper[4751]: I1203 14:26:12.942988 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fxk9t" Dec 03 14:26:13 crc kubenswrapper[4751]: I1203 14:26:13.269130 4751 generic.go:334] "Generic (PLEG): container finished" podID="061de325-d230-4a4a-a0bf-9d7a2e8f6429" containerID="4bb8891fe88c22e1d86dad489b02a65e6acedac8c265ea25890fe7218dca7edf" exitCode=0 Dec 03 14:26:13 crc kubenswrapper[4751]: I1203 14:26:13.269259 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bbng9" 
event={"ID":"061de325-d230-4a4a-a0bf-9d7a2e8f6429","Type":"ContainerDied","Data":"4bb8891fe88c22e1d86dad489b02a65e6acedac8c265ea25890fe7218dca7edf"} Dec 03 14:26:13 crc kubenswrapper[4751]: I1203 14:26:13.269441 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bbng9" event={"ID":"061de325-d230-4a4a-a0bf-9d7a2e8f6429","Type":"ContainerStarted","Data":"1819e149cbd6a7a104d07b360852e6bea32100092531ed2223652411e7ba9f23"} Dec 03 14:26:13 crc kubenswrapper[4751]: I1203 14:26:13.272357 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-rgl9w" event={"ID":"216e104c-e7e3-4be4-972c-cd524973eaa6","Type":"ContainerStarted","Data":"de434d27a387db8ea7818a415699ada673bceee18ee26311b30b39eea3e9600a"} Dec 03 14:26:13 crc kubenswrapper[4751]: I1203 14:26:13.325882 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fxk9t" Dec 03 14:26:14 crc kubenswrapper[4751]: I1203 14:26:14.282605 4751 generic.go:334] "Generic (PLEG): container finished" podID="061de325-d230-4a4a-a0bf-9d7a2e8f6429" containerID="2791d63524107db31e30c617f3166256d091f3193425513cab40404169987032" exitCode=0 Dec 03 14:26:14 crc kubenswrapper[4751]: I1203 14:26:14.282661 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bbng9" event={"ID":"061de325-d230-4a4a-a0bf-9d7a2e8f6429","Type":"ContainerDied","Data":"2791d63524107db31e30c617f3166256d091f3193425513cab40404169987032"} Dec 03 14:26:14 crc kubenswrapper[4751]: I1203 14:26:14.312922 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-hq59z" Dec 03 14:26:14 crc kubenswrapper[4751]: I1203 14:26:14.313593 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-hq59z" Dec 03 14:26:14 crc kubenswrapper[4751]: I1203 14:26:14.745449 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-hq59z"] Dec 03 14:26:14 crc kubenswrapper[4751]: W1203 14:26:14.751308 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb80946e_134f_4baa_b150_6004a9313de9.slice/crio-7f3e5c98ed71fbc6dd1a4fb550cd3fcf8d4e6355180b0897de3638a7ef3ac74b WatchSource:0}: Error finding container 7f3e5c98ed71fbc6dd1a4fb550cd3fcf8d4e6355180b0897de3638a7ef3ac74b: Status 404 returned error can't find the container with id 7f3e5c98ed71fbc6dd1a4fb550cd3fcf8d4e6355180b0897de3638a7ef3ac74b Dec 03 14:26:15 crc kubenswrapper[4751]: I1203 14:26:15.168031 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fxk9t"] Dec 03 14:26:15 crc kubenswrapper[4751]: I1203 14:26:15.291074 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-hq59z" event={"ID":"bb80946e-134f-4baa-b150-6004a9313de9","Type":"ContainerStarted","Data":"7f3e5c98ed71fbc6dd1a4fb550cd3fcf8d4e6355180b0897de3638a7ef3ac74b"} Dec 03 14:26:15 crc kubenswrapper[4751]: I1203 14:26:15.291245 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fxk9t" podUID="a21a0b76-6f04-47c5-8d3b-f8231398b3d5" containerName="registry-server" containerID="cri-o://758b60b965a15d91987ad3b534b8263b6eb7c4ec794a4c9f086dbdde03786907" gracePeriod=2 Dec 03 14:26:16 crc kubenswrapper[4751]: I1203 14:26:16.181897 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-86mbc"] Dec 03 14:26:16 crc kubenswrapper[4751]: I1203 14:26:16.183062 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-86mbc" Dec 03 14:26:16 crc kubenswrapper[4751]: I1203 14:26:16.193465 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-86mbc"] Dec 03 14:26:16 crc kubenswrapper[4751]: I1203 14:26:16.300008 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bbng9" event={"ID":"061de325-d230-4a4a-a0bf-9d7a2e8f6429","Type":"ContainerStarted","Data":"27afd11f1c837529d6f58c74b320e4d10bd7628403b902fd3b4df9fbef4f3cc2"} Dec 03 14:26:16 crc kubenswrapper[4751]: I1203 14:26:16.302639 4751 generic.go:334] "Generic (PLEG): container finished" podID="a21a0b76-6f04-47c5-8d3b-f8231398b3d5" containerID="758b60b965a15d91987ad3b534b8263b6eb7c4ec794a4c9f086dbdde03786907" exitCode=0 Dec 03 14:26:16 crc kubenswrapper[4751]: I1203 14:26:16.302678 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxk9t" event={"ID":"a21a0b76-6f04-47c5-8d3b-f8231398b3d5","Type":"ContainerDied","Data":"758b60b965a15d91987ad3b534b8263b6eb7c4ec794a4c9f086dbdde03786907"} Dec 03 14:26:16 crc kubenswrapper[4751]: I1203 14:26:16.321925 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bbng9" podStartSLOduration=15.01917193 podStartE2EDuration="17.321902692s" podCreationTimestamp="2025-12-03 14:25:59 +0000 UTC" firstStartedPulling="2025-12-03 14:26:13.271665096 +0000 UTC m=+780.260020313" lastFinishedPulling="2025-12-03 14:26:15.574395858 +0000 UTC m=+782.562751075" observedRunningTime="2025-12-03 14:26:16.320811103 +0000 UTC m=+783.309166320" watchObservedRunningTime="2025-12-03 14:26:16.321902692 +0000 UTC m=+783.310257909" Dec 03 14:26:16 crc kubenswrapper[4751]: I1203 14:26:16.363813 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d36638a2-eb05-47f5-87a6-db726a98e244-catalog-content\") pod \"community-operators-86mbc\" (UID: \"d36638a2-eb05-47f5-87a6-db726a98e244\") " pod="openshift-marketplace/community-operators-86mbc" Dec 03 14:26:16 crc kubenswrapper[4751]: I1203 14:26:16.363883 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpcmd\" (UniqueName: \"kubernetes.io/projected/d36638a2-eb05-47f5-87a6-db726a98e244-kube-api-access-kpcmd\") pod \"community-operators-86mbc\" (UID: \"d36638a2-eb05-47f5-87a6-db726a98e244\") " pod="openshift-marketplace/community-operators-86mbc" Dec 03 14:26:16 crc kubenswrapper[4751]: I1203 14:26:16.364302 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d36638a2-eb05-47f5-87a6-db726a98e244-utilities\") pod \"community-operators-86mbc\" (UID: \"d36638a2-eb05-47f5-87a6-db726a98e244\") " pod="openshift-marketplace/community-operators-86mbc" Dec 03 14:26:16 crc kubenswrapper[4751]: I1203 14:26:16.467357 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d36638a2-eb05-47f5-87a6-db726a98e244-utilities\") pod \"community-operators-86mbc\" (UID: \"d36638a2-eb05-47f5-87a6-db726a98e244\") " pod="openshift-marketplace/community-operators-86mbc" Dec 03 14:26:16 crc kubenswrapper[4751]: I1203 14:26:16.467418 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d36638a2-eb05-47f5-87a6-db726a98e244-catalog-content\") pod \"community-operators-86mbc\" (UID: \"d36638a2-eb05-47f5-87a6-db726a98e244\") " pod="openshift-marketplace/community-operators-86mbc" Dec 03 14:26:16 crc kubenswrapper[4751]: I1203 14:26:16.467485 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kpcmd\" (UniqueName: \"kubernetes.io/projected/d36638a2-eb05-47f5-87a6-db726a98e244-kube-api-access-kpcmd\") pod \"community-operators-86mbc\" (UID: \"d36638a2-eb05-47f5-87a6-db726a98e244\") " pod="openshift-marketplace/community-operators-86mbc" Dec 03 14:26:16 crc kubenswrapper[4751]: I1203 14:26:16.467966 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d36638a2-eb05-47f5-87a6-db726a98e244-utilities\") pod \"community-operators-86mbc\" (UID: \"d36638a2-eb05-47f5-87a6-db726a98e244\") " pod="openshift-marketplace/community-operators-86mbc" Dec 03 14:26:16 crc kubenswrapper[4751]: I1203 14:26:16.468182 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d36638a2-eb05-47f5-87a6-db726a98e244-catalog-content\") pod \"community-operators-86mbc\" (UID: \"d36638a2-eb05-47f5-87a6-db726a98e244\") " pod="openshift-marketplace/community-operators-86mbc" Dec 03 14:26:16 crc kubenswrapper[4751]: I1203 14:26:16.502718 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpcmd\" (UniqueName: \"kubernetes.io/projected/d36638a2-eb05-47f5-87a6-db726a98e244-kube-api-access-kpcmd\") pod \"community-operators-86mbc\" (UID: \"d36638a2-eb05-47f5-87a6-db726a98e244\") " pod="openshift-marketplace/community-operators-86mbc" Dec 03 14:26:16 crc kubenswrapper[4751]: I1203 14:26:16.797708 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-86mbc" Dec 03 14:26:19 crc kubenswrapper[4751]: I1203 14:26:19.509424 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bbng9" Dec 03 14:26:19 crc kubenswrapper[4751]: I1203 14:26:19.509858 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bbng9" Dec 03 14:26:19 crc kubenswrapper[4751]: I1203 14:26:19.553182 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bbng9" Dec 03 14:26:20 crc kubenswrapper[4751]: I1203 14:26:20.364927 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bbng9" Dec 03 14:26:21 crc kubenswrapper[4751]: I1203 14:26:21.084673 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fxk9t" Dec 03 14:26:21 crc kubenswrapper[4751]: I1203 14:26:21.167373 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bbng9"] Dec 03 14:26:21 crc kubenswrapper[4751]: I1203 14:26:21.225875 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6cdc\" (UniqueName: \"kubernetes.io/projected/a21a0b76-6f04-47c5-8d3b-f8231398b3d5-kube-api-access-g6cdc\") pod \"a21a0b76-6f04-47c5-8d3b-f8231398b3d5\" (UID: \"a21a0b76-6f04-47c5-8d3b-f8231398b3d5\") " Dec 03 14:26:21 crc kubenswrapper[4751]: I1203 14:26:21.226010 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a21a0b76-6f04-47c5-8d3b-f8231398b3d5-utilities\") pod \"a21a0b76-6f04-47c5-8d3b-f8231398b3d5\" (UID: \"a21a0b76-6f04-47c5-8d3b-f8231398b3d5\") " Dec 03 14:26:21 crc kubenswrapper[4751]: I1203 14:26:21.226093 4751 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a21a0b76-6f04-47c5-8d3b-f8231398b3d5-catalog-content\") pod \"a21a0b76-6f04-47c5-8d3b-f8231398b3d5\" (UID: \"a21a0b76-6f04-47c5-8d3b-f8231398b3d5\") " Dec 03 14:26:21 crc kubenswrapper[4751]: I1203 14:26:21.227218 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a21a0b76-6f04-47c5-8d3b-f8231398b3d5-utilities" (OuterVolumeSpecName: "utilities") pod "a21a0b76-6f04-47c5-8d3b-f8231398b3d5" (UID: "a21a0b76-6f04-47c5-8d3b-f8231398b3d5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:26:21 crc kubenswrapper[4751]: I1203 14:26:21.231816 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a21a0b76-6f04-47c5-8d3b-f8231398b3d5-kube-api-access-g6cdc" (OuterVolumeSpecName: "kube-api-access-g6cdc") pod "a21a0b76-6f04-47c5-8d3b-f8231398b3d5" (UID: "a21a0b76-6f04-47c5-8d3b-f8231398b3d5"). InnerVolumeSpecName "kube-api-access-g6cdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:26:21 crc kubenswrapper[4751]: I1203 14:26:21.283730 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a21a0b76-6f04-47c5-8d3b-f8231398b3d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a21a0b76-6f04-47c5-8d3b-f8231398b3d5" (UID: "a21a0b76-6f04-47c5-8d3b-f8231398b3d5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:26:21 crc kubenswrapper[4751]: I1203 14:26:21.327303 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6cdc\" (UniqueName: \"kubernetes.io/projected/a21a0b76-6f04-47c5-8d3b-f8231398b3d5-kube-api-access-g6cdc\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:21 crc kubenswrapper[4751]: I1203 14:26:21.327362 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a21a0b76-6f04-47c5-8d3b-f8231398b3d5-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:21 crc kubenswrapper[4751]: I1203 14:26:21.327376 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a21a0b76-6f04-47c5-8d3b-f8231398b3d5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:21 crc kubenswrapper[4751]: I1203 14:26:21.330149 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxk9t" event={"ID":"a21a0b76-6f04-47c5-8d3b-f8231398b3d5","Type":"ContainerDied","Data":"607178022ef6eac119584cf0426ca56f725e76a494bcba135f2a502386c1eb31"} Dec 03 14:26:21 crc kubenswrapper[4751]: I1203 14:26:21.330173 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fxk9t" Dec 03 14:26:21 crc kubenswrapper[4751]: I1203 14:26:21.330206 4751 scope.go:117] "RemoveContainer" containerID="758b60b965a15d91987ad3b534b8263b6eb7c4ec794a4c9f086dbdde03786907" Dec 03 14:26:21 crc kubenswrapper[4751]: I1203 14:26:21.357860 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fxk9t"] Dec 03 14:26:21 crc kubenswrapper[4751]: I1203 14:26:21.360578 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fxk9t"] Dec 03 14:26:22 crc kubenswrapper[4751]: I1203 14:26:22.335503 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bbng9" podUID="061de325-d230-4a4a-a0bf-9d7a2e8f6429" containerName="registry-server" containerID="cri-o://27afd11f1c837529d6f58c74b320e4d10bd7628403b902fd3b4df9fbef4f3cc2" gracePeriod=2 Dec 03 14:26:22 crc kubenswrapper[4751]: I1203 14:26:22.766131 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gqmr9" Dec 03 14:26:23 crc kubenswrapper[4751]: I1203 14:26:23.328370 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a21a0b76-6f04-47c5-8d3b-f8231398b3d5" path="/var/lib/kubelet/pods/a21a0b76-6f04-47c5-8d3b-f8231398b3d5/volumes" Dec 03 14:26:23 crc kubenswrapper[4751]: I1203 14:26:23.352874 4751 generic.go:334] "Generic (PLEG): container finished" podID="061de325-d230-4a4a-a0bf-9d7a2e8f6429" containerID="27afd11f1c837529d6f58c74b320e4d10bd7628403b902fd3b4df9fbef4f3cc2" exitCode=0 Dec 03 14:26:23 crc kubenswrapper[4751]: I1203 14:26:23.352925 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bbng9" event={"ID":"061de325-d230-4a4a-a0bf-9d7a2e8f6429","Type":"ContainerDied","Data":"27afd11f1c837529d6f58c74b320e4d10bd7628403b902fd3b4df9fbef4f3cc2"} Dec 03 
14:26:23 crc kubenswrapper[4751]: I1203 14:26:23.995108 4751 scope.go:117] "RemoveContainer" containerID="11978c01cf942cdfbe2b2ada2133fad500b5432cd877081a09d044f319fdb660" Dec 03 14:26:24 crc kubenswrapper[4751]: I1203 14:26:24.033365 4751 scope.go:117] "RemoveContainer" containerID="8e5dd255e47b36c57c5f3af1b94d737d91c35ee1c4a263b09178d16056ae4797" Dec 03 14:26:24 crc kubenswrapper[4751]: I1203 14:26:24.136013 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bbng9" Dec 03 14:26:24 crc kubenswrapper[4751]: I1203 14:26:24.266206 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-86mbc"] Dec 03 14:26:24 crc kubenswrapper[4751]: W1203 14:26:24.272683 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd36638a2_eb05_47f5_87a6_db726a98e244.slice/crio-0226a3135861fe4211f481f8fe4a0ae73aa1a140dfe2ba6bd48d72bbaa33dac8 WatchSource:0}: Error finding container 0226a3135861fe4211f481f8fe4a0ae73aa1a140dfe2ba6bd48d72bbaa33dac8: Status 404 returned error can't find the container with id 0226a3135861fe4211f481f8fe4a0ae73aa1a140dfe2ba6bd48d72bbaa33dac8 Dec 03 14:26:24 crc kubenswrapper[4751]: I1203 14:26:24.273104 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/061de325-d230-4a4a-a0bf-9d7a2e8f6429-catalog-content\") pod \"061de325-d230-4a4a-a0bf-9d7a2e8f6429\" (UID: \"061de325-d230-4a4a-a0bf-9d7a2e8f6429\") " Dec 03 14:26:24 crc kubenswrapper[4751]: I1203 14:26:24.275537 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/061de325-d230-4a4a-a0bf-9d7a2e8f6429-utilities\") pod \"061de325-d230-4a4a-a0bf-9d7a2e8f6429\" (UID: \"061de325-d230-4a4a-a0bf-9d7a2e8f6429\") " Dec 03 14:26:24 crc 
kubenswrapper[4751]: I1203 14:26:24.275591 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6rww\" (UniqueName: \"kubernetes.io/projected/061de325-d230-4a4a-a0bf-9d7a2e8f6429-kube-api-access-j6rww\") pod \"061de325-d230-4a4a-a0bf-9d7a2e8f6429\" (UID: \"061de325-d230-4a4a-a0bf-9d7a2e8f6429\") " Dec 03 14:26:24 crc kubenswrapper[4751]: I1203 14:26:24.277107 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/061de325-d230-4a4a-a0bf-9d7a2e8f6429-utilities" (OuterVolumeSpecName: "utilities") pod "061de325-d230-4a4a-a0bf-9d7a2e8f6429" (UID: "061de325-d230-4a4a-a0bf-9d7a2e8f6429"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:26:24 crc kubenswrapper[4751]: I1203 14:26:24.281966 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/061de325-d230-4a4a-a0bf-9d7a2e8f6429-kube-api-access-j6rww" (OuterVolumeSpecName: "kube-api-access-j6rww") pod "061de325-d230-4a4a-a0bf-9d7a2e8f6429" (UID: "061de325-d230-4a4a-a0bf-9d7a2e8f6429"). InnerVolumeSpecName "kube-api-access-j6rww". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:26:24 crc kubenswrapper[4751]: I1203 14:26:24.291618 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/061de325-d230-4a4a-a0bf-9d7a2e8f6429-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "061de325-d230-4a4a-a0bf-9d7a2e8f6429" (UID: "061de325-d230-4a4a-a0bf-9d7a2e8f6429"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:26:24 crc kubenswrapper[4751]: I1203 14:26:24.359188 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm" event={"ID":"01521e70-1366-4e52-9f9a-885522387a0e","Type":"ContainerStarted","Data":"a1a1ba0f650027acda8d9b7280c96b62bdd01305d8662916ccf98a178c1c6c5e"} Dec 03 14:26:24 crc kubenswrapper[4751]: I1203 14:26:24.362151 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-rgl9w" event={"ID":"216e104c-e7e3-4be4-972c-cd524973eaa6","Type":"ContainerStarted","Data":"7c11103c9f37a39ddeba67cd78d048f80dd8b210aeeb21fd7fa4f46003be26a2"} Dec 03 14:26:24 crc kubenswrapper[4751]: I1203 14:26:24.362552 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-rgl9w" Dec 03 14:26:24 crc kubenswrapper[4751]: I1203 14:26:24.363916 4751 patch_prober.go:28] interesting pod/observability-operator-d8bb48f5d-rgl9w container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.25:8081/healthz\": dial tcp 10.217.0.25:8081: connect: connection refused" start-of-body= Dec 03 14:26:24 crc kubenswrapper[4751]: I1203 14:26:24.363967 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-d8bb48f5d-rgl9w" podUID="216e104c-e7e3-4be4-972c-cd524973eaa6" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.25:8081/healthz\": dial tcp 10.217.0.25:8081: connect: connection refused" Dec 03 14:26:24 crc kubenswrapper[4751]: I1203 14:26:24.365392 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86mbc" event={"ID":"d36638a2-eb05-47f5-87a6-db726a98e244","Type":"ContainerStarted","Data":"0226a3135861fe4211f481f8fe4a0ae73aa1a140dfe2ba6bd48d72bbaa33dac8"} Dec 
03 14:26:24 crc kubenswrapper[4751]: I1203 14:26:24.369038 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-96kch" event={"ID":"2cc6a6aa-cb86-448e-8fae-8e1dc45c89ca","Type":"ContainerStarted","Data":"ce4abf487f2512272b36b77aa4bfe11b4455198c60ac51192ce0d9a2883c1015"} Dec 03 14:26:24 crc kubenswrapper[4751]: I1203 14:26:24.372845 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm" event={"ID":"7e248019-bf73-4c6d-a551-6c62dcf6ec11","Type":"ContainerStarted","Data":"c07fc71e6e0303dde22a207d05300decc750e91689a5df37ce83eedafd403772"} Dec 03 14:26:24 crc kubenswrapper[4751]: I1203 14:26:24.376246 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bbng9" Dec 03 14:26:24 crc kubenswrapper[4751]: I1203 14:26:24.376246 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bbng9" event={"ID":"061de325-d230-4a4a-a0bf-9d7a2e8f6429","Type":"ContainerDied","Data":"1819e149cbd6a7a104d07b360852e6bea32100092531ed2223652411e7ba9f23"} Dec 03 14:26:24 crc kubenswrapper[4751]: I1203 14:26:24.376391 4751 scope.go:117] "RemoveContainer" containerID="27afd11f1c837529d6f58c74b320e4d10bd7628403b902fd3b4df9fbef4f3cc2" Dec 03 14:26:24 crc kubenswrapper[4751]: I1203 14:26:24.376873 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/061de325-d230-4a4a-a0bf-9d7a2e8f6429-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:24 crc kubenswrapper[4751]: I1203 14:26:24.376915 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/061de325-d230-4a4a-a0bf-9d7a2e8f6429-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:24 crc kubenswrapper[4751]: I1203 14:26:24.376933 4751 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6rww\" (UniqueName: \"kubernetes.io/projected/061de325-d230-4a4a-a0bf-9d7a2e8f6429-kube-api-access-j6rww\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:24 crc kubenswrapper[4751]: I1203 14:26:24.379760 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-hq59z" event={"ID":"bb80946e-134f-4baa-b150-6004a9313de9","Type":"ContainerStarted","Data":"d94985e2a07cf71c00ceea2351910565e7981d7d5f9ce23518cfa784a1780a99"} Dec 03 14:26:24 crc kubenswrapper[4751]: I1203 14:26:24.379970 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-hq59z" Dec 03 14:26:24 crc kubenswrapper[4751]: I1203 14:26:24.387708 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm" podStartSLOduration=18.939818579 podStartE2EDuration="31.387689931s" podCreationTimestamp="2025-12-03 14:25:53 +0000 UTC" firstStartedPulling="2025-12-03 14:26:11.594302583 +0000 UTC m=+778.582657800" lastFinishedPulling="2025-12-03 14:26:24.042173935 +0000 UTC m=+791.030529152" observedRunningTime="2025-12-03 14:26:24.383444766 +0000 UTC m=+791.371799993" watchObservedRunningTime="2025-12-03 14:26:24.387689931 +0000 UTC m=+791.376045148" Dec 03 14:26:24 crc kubenswrapper[4751]: I1203 14:26:24.400675 4751 scope.go:117] "RemoveContainer" containerID="2791d63524107db31e30c617f3166256d091f3193425513cab40404169987032" Dec 03 14:26:24 crc kubenswrapper[4751]: I1203 14:26:24.411872 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-96kch" podStartSLOduration=17.967042461 podStartE2EDuration="31.411854176s" podCreationTimestamp="2025-12-03 14:25:53 +0000 UTC" firstStartedPulling="2025-12-03 14:26:10.598011137 +0000 UTC m=+777.586366354" 
lastFinishedPulling="2025-12-03 14:26:24.042822852 +0000 UTC m=+791.031178069" observedRunningTime="2025-12-03 14:26:24.410309124 +0000 UTC m=+791.398664341" watchObservedRunningTime="2025-12-03 14:26:24.411854176 +0000 UTC m=+791.400209403" Dec 03 14:26:24 crc kubenswrapper[4751]: I1203 14:26:24.442622 4751 scope.go:117] "RemoveContainer" containerID="4bb8891fe88c22e1d86dad489b02a65e6acedac8c265ea25890fe7218dca7edf" Dec 03 14:26:24 crc kubenswrapper[4751]: I1203 14:26:24.462630 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm" podStartSLOduration=19.008520902 podStartE2EDuration="31.462612402s" podCreationTimestamp="2025-12-03 14:25:53 +0000 UTC" firstStartedPulling="2025-12-03 14:26:11.638188693 +0000 UTC m=+778.626543910" lastFinishedPulling="2025-12-03 14:26:24.092280203 +0000 UTC m=+791.080635410" observedRunningTime="2025-12-03 14:26:24.434464109 +0000 UTC m=+791.422819346" watchObservedRunningTime="2025-12-03 14:26:24.462612402 +0000 UTC m=+791.450967619" Dec 03 14:26:24 crc kubenswrapper[4751]: I1203 14:26:24.463420 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-hq59z" podStartSLOduration=22.17420837 podStartE2EDuration="31.463413644s" podCreationTimestamp="2025-12-03 14:25:53 +0000 UTC" firstStartedPulling="2025-12-03 14:26:14.7536666 +0000 UTC m=+781.742021817" lastFinishedPulling="2025-12-03 14:26:24.042871864 +0000 UTC m=+791.031227091" observedRunningTime="2025-12-03 14:26:24.461728338 +0000 UTC m=+791.450083595" watchObservedRunningTime="2025-12-03 14:26:24.463413644 +0000 UTC m=+791.451768851" Dec 03 14:26:24 crc kubenswrapper[4751]: I1203 14:26:24.505448 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-rgl9w" podStartSLOduration=19.941220388 podStartE2EDuration="31.505424643s" 
podCreationTimestamp="2025-12-03 14:25:53 +0000 UTC" firstStartedPulling="2025-12-03 14:26:12.54399733 +0000 UTC m=+779.532352547" lastFinishedPulling="2025-12-03 14:26:24.108201585 +0000 UTC m=+791.096556802" observedRunningTime="2025-12-03 14:26:24.487267311 +0000 UTC m=+791.475622528" watchObservedRunningTime="2025-12-03 14:26:24.505424643 +0000 UTC m=+791.493779860" Dec 03 14:26:24 crc kubenswrapper[4751]: I1203 14:26:24.512122 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bbng9"] Dec 03 14:26:24 crc kubenswrapper[4751]: I1203 14:26:24.517223 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bbng9"] Dec 03 14:26:25 crc kubenswrapper[4751]: I1203 14:26:25.323000 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="061de325-d230-4a4a-a0bf-9d7a2e8f6429" path="/var/lib/kubelet/pods/061de325-d230-4a4a-a0bf-9d7a2e8f6429/volumes" Dec 03 14:26:25 crc kubenswrapper[4751]: I1203 14:26:25.386935 4751 generic.go:334] "Generic (PLEG): container finished" podID="d36638a2-eb05-47f5-87a6-db726a98e244" containerID="d3b69891c130a10f3799b27b79605c223a0b4d0426a3a70e35fea0a1d1d7b630" exitCode=0 Dec 03 14:26:25 crc kubenswrapper[4751]: I1203 14:26:25.387004 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86mbc" event={"ID":"d36638a2-eb05-47f5-87a6-db726a98e244","Type":"ContainerDied","Data":"d3b69891c130a10f3799b27b79605c223a0b4d0426a3a70e35fea0a1d1d7b630"} Dec 03 14:26:25 crc kubenswrapper[4751]: I1203 14:26:25.390989 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-rgl9w" Dec 03 14:26:26 crc kubenswrapper[4751]: I1203 14:26:26.399843 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86mbc" 
event={"ID":"d36638a2-eb05-47f5-87a6-db726a98e244","Type":"ContainerStarted","Data":"e960ee3922c928d7a236f36692dfb754b0a0926cef03372e9e26e09de359b4b4"} Dec 03 14:26:27 crc kubenswrapper[4751]: I1203 14:26:27.406987 4751 generic.go:334] "Generic (PLEG): container finished" podID="d36638a2-eb05-47f5-87a6-db726a98e244" containerID="e960ee3922c928d7a236f36692dfb754b0a0926cef03372e9e26e09de359b4b4" exitCode=0 Dec 03 14:26:27 crc kubenswrapper[4751]: I1203 14:26:27.407175 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86mbc" event={"ID":"d36638a2-eb05-47f5-87a6-db726a98e244","Type":"ContainerDied","Data":"e960ee3922c928d7a236f36692dfb754b0a0926cef03372e9e26e09de359b4b4"} Dec 03 14:26:28 crc kubenswrapper[4751]: I1203 14:26:28.414276 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86mbc" event={"ID":"d36638a2-eb05-47f5-87a6-db726a98e244","Type":"ContainerStarted","Data":"367694b5df47668d5d0ad534cbdbdfb195f6f50b28fe7856e9adf4972e823c07"} Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.519995 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-86mbc" podStartSLOduration=14.883969694 podStartE2EDuration="17.51997109s" podCreationTimestamp="2025-12-03 14:26:16 +0000 UTC" firstStartedPulling="2025-12-03 14:26:25.389427867 +0000 UTC m=+792.377783124" lastFinishedPulling="2025-12-03 14:26:28.025429303 +0000 UTC m=+795.013784520" observedRunningTime="2025-12-03 14:26:28.434705258 +0000 UTC m=+795.423060475" watchObservedRunningTime="2025-12-03 14:26:33.51997109 +0000 UTC m=+800.508326337" Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.525645 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-rwllx"] Dec 03 14:26:33 crc kubenswrapper[4751]: E1203 14:26:33.525950 4751 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a21a0b76-6f04-47c5-8d3b-f8231398b3d5" containerName="extract-utilities" Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.525968 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a21a0b76-6f04-47c5-8d3b-f8231398b3d5" containerName="extract-utilities" Dec 03 14:26:33 crc kubenswrapper[4751]: E1203 14:26:33.525989 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a21a0b76-6f04-47c5-8d3b-f8231398b3d5" containerName="extract-content" Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.526002 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a21a0b76-6f04-47c5-8d3b-f8231398b3d5" containerName="extract-content" Dec 03 14:26:33 crc kubenswrapper[4751]: E1203 14:26:33.526021 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="061de325-d230-4a4a-a0bf-9d7a2e8f6429" containerName="registry-server" Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.526033 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="061de325-d230-4a4a-a0bf-9d7a2e8f6429" containerName="registry-server" Dec 03 14:26:33 crc kubenswrapper[4751]: E1203 14:26:33.526059 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="061de325-d230-4a4a-a0bf-9d7a2e8f6429" containerName="extract-utilities" Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.526071 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="061de325-d230-4a4a-a0bf-9d7a2e8f6429" containerName="extract-utilities" Dec 03 14:26:33 crc kubenswrapper[4751]: E1203 14:26:33.526091 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a21a0b76-6f04-47c5-8d3b-f8231398b3d5" containerName="registry-server" Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.526103 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a21a0b76-6f04-47c5-8d3b-f8231398b3d5" containerName="registry-server" Dec 03 14:26:33 crc kubenswrapper[4751]: E1203 14:26:33.526121 4751 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="061de325-d230-4a4a-a0bf-9d7a2e8f6429" containerName="extract-content" Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.526133 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="061de325-d230-4a4a-a0bf-9d7a2e8f6429" containerName="extract-content" Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.526367 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="a21a0b76-6f04-47c5-8d3b-f8231398b3d5" containerName="registry-server" Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.526390 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="061de325-d230-4a4a-a0bf-9d7a2e8f6429" containerName="registry-server" Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.526993 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-rwllx" Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.528923 4751 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-cd6gg" Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.529354 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.538020 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-wb55f"] Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.539173 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-wb55f" Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.540281 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.541246 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-rwllx"] Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.545603 4751 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-pkl5k" Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.552173 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-wb55f"] Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.561530 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-7jwb7"] Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.562455 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-7jwb7" Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.565122 4751 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-rkp8t" Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.586389 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-7jwb7"] Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.692581 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77tjv\" (UniqueName: \"kubernetes.io/projected/6bc5bb21-6b5f-4b06-a96a-2e5883752c9a-kube-api-access-77tjv\") pod \"cert-manager-5b446d88c5-wb55f\" (UID: \"6bc5bb21-6b5f-4b06-a96a-2e5883752c9a\") " pod="cert-manager/cert-manager-5b446d88c5-wb55f" Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.692825 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pjpm\" (UniqueName: \"kubernetes.io/projected/d42519d9-b7e8-4c0b-bcac-b4269faf605a-kube-api-access-2pjpm\") pod \"cert-manager-webhook-5655c58dd6-7jwb7\" (UID: \"d42519d9-b7e8-4c0b-bcac-b4269faf605a\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-7jwb7" Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.692936 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r56nk\" (UniqueName: \"kubernetes.io/projected/c09475fb-946d-45c4-8482-4db508ae7459-kube-api-access-r56nk\") pod \"cert-manager-cainjector-7f985d654d-rwllx\" (UID: \"c09475fb-946d-45c4-8482-4db508ae7459\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-rwllx" Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.794040 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pjpm\" (UniqueName: 
\"kubernetes.io/projected/d42519d9-b7e8-4c0b-bcac-b4269faf605a-kube-api-access-2pjpm\") pod \"cert-manager-webhook-5655c58dd6-7jwb7\" (UID: \"d42519d9-b7e8-4c0b-bcac-b4269faf605a\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-7jwb7" Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.794092 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r56nk\" (UniqueName: \"kubernetes.io/projected/c09475fb-946d-45c4-8482-4db508ae7459-kube-api-access-r56nk\") pod \"cert-manager-cainjector-7f985d654d-rwllx\" (UID: \"c09475fb-946d-45c4-8482-4db508ae7459\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-rwllx" Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.794147 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77tjv\" (UniqueName: \"kubernetes.io/projected/6bc5bb21-6b5f-4b06-a96a-2e5883752c9a-kube-api-access-77tjv\") pod \"cert-manager-5b446d88c5-wb55f\" (UID: \"6bc5bb21-6b5f-4b06-a96a-2e5883752c9a\") " pod="cert-manager/cert-manager-5b446d88c5-wb55f" Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.816712 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pjpm\" (UniqueName: \"kubernetes.io/projected/d42519d9-b7e8-4c0b-bcac-b4269faf605a-kube-api-access-2pjpm\") pod \"cert-manager-webhook-5655c58dd6-7jwb7\" (UID: \"d42519d9-b7e8-4c0b-bcac-b4269faf605a\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-7jwb7" Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.821475 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77tjv\" (UniqueName: \"kubernetes.io/projected/6bc5bb21-6b5f-4b06-a96a-2e5883752c9a-kube-api-access-77tjv\") pod \"cert-manager-5b446d88c5-wb55f\" (UID: \"6bc5bb21-6b5f-4b06-a96a-2e5883752c9a\") " pod="cert-manager/cert-manager-5b446d88c5-wb55f" Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.827573 4751 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-r56nk\" (UniqueName: \"kubernetes.io/projected/c09475fb-946d-45c4-8482-4db508ae7459-kube-api-access-r56nk\") pod \"cert-manager-cainjector-7f985d654d-rwllx\" (UID: \"c09475fb-946d-45c4-8482-4db508ae7459\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-rwllx" Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.859453 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-rwllx" Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.869712 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-wb55f" Dec 03 14:26:33 crc kubenswrapper[4751]: I1203 14:26:33.882810 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-7jwb7" Dec 03 14:26:34 crc kubenswrapper[4751]: I1203 14:26:34.071031 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-rwllx"] Dec 03 14:26:34 crc kubenswrapper[4751]: I1203 14:26:34.134712 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-hq59z" Dec 03 14:26:34 crc kubenswrapper[4751]: I1203 14:26:34.333345 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-wb55f"] Dec 03 14:26:34 crc kubenswrapper[4751]: I1203 14:26:34.346086 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-7jwb7"] Dec 03 14:26:34 crc kubenswrapper[4751]: W1203 14:26:34.349378 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd42519d9_b7e8_4c0b_bcac_b4269faf605a.slice/crio-9fc5b0c9da431128aeecc193693f3e6e449bfa8aaf8d0c170605f2aeca41a978 WatchSource:0}: Error finding container 
9fc5b0c9da431128aeecc193693f3e6e449bfa8aaf8d0c170605f2aeca41a978: Status 404 returned error can't find the container with id 9fc5b0c9da431128aeecc193693f3e6e449bfa8aaf8d0c170605f2aeca41a978 Dec 03 14:26:34 crc kubenswrapper[4751]: I1203 14:26:34.448580 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-rwllx" event={"ID":"c09475fb-946d-45c4-8482-4db508ae7459","Type":"ContainerStarted","Data":"5da3728d9c88d4ee7396c92b41cd4310b320879d0597b53cbc07bb96f219446a"} Dec 03 14:26:34 crc kubenswrapper[4751]: I1203 14:26:34.449779 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-7jwb7" event={"ID":"d42519d9-b7e8-4c0b-bcac-b4269faf605a","Type":"ContainerStarted","Data":"9fc5b0c9da431128aeecc193693f3e6e449bfa8aaf8d0c170605f2aeca41a978"} Dec 03 14:26:34 crc kubenswrapper[4751]: I1203 14:26:34.450599 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-wb55f" event={"ID":"6bc5bb21-6b5f-4b06-a96a-2e5883752c9a","Type":"ContainerStarted","Data":"88c1c39e070e36dd0f5c8c186e479e521bb85cf7544a1b1b3566e0a73f3e33bb"} Dec 03 14:26:35 crc kubenswrapper[4751]: I1203 14:26:35.820522 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:26:35 crc kubenswrapper[4751]: I1203 14:26:35.820587 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:26:36 crc kubenswrapper[4751]: I1203 14:26:36.797941 4751 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-86mbc" Dec 03 14:26:36 crc kubenswrapper[4751]: I1203 14:26:36.798000 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-86mbc" Dec 03 14:26:36 crc kubenswrapper[4751]: I1203 14:26:36.845396 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-86mbc" Dec 03 14:26:37 crc kubenswrapper[4751]: I1203 14:26:37.523916 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-86mbc" Dec 03 14:26:37 crc kubenswrapper[4751]: I1203 14:26:37.573914 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-86mbc"] Dec 03 14:26:39 crc kubenswrapper[4751]: I1203 14:26:39.484010 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-86mbc" podUID="d36638a2-eb05-47f5-87a6-db726a98e244" containerName="registry-server" containerID="cri-o://367694b5df47668d5d0ad534cbdbdfb195f6f50b28fe7856e9adf4972e823c07" gracePeriod=2 Dec 03 14:26:42 crc kubenswrapper[4751]: I1203 14:26:42.504392 4751 generic.go:334] "Generic (PLEG): container finished" podID="d36638a2-eb05-47f5-87a6-db726a98e244" containerID="367694b5df47668d5d0ad534cbdbdfb195f6f50b28fe7856e9adf4972e823c07" exitCode=0 Dec 03 14:26:42 crc kubenswrapper[4751]: I1203 14:26:42.504485 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86mbc" event={"ID":"d36638a2-eb05-47f5-87a6-db726a98e244","Type":"ContainerDied","Data":"367694b5df47668d5d0ad534cbdbdfb195f6f50b28fe7856e9adf4972e823c07"} Dec 03 14:26:44 crc kubenswrapper[4751]: I1203 14:26:44.713943 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-86mbc" Dec 03 14:26:44 crc kubenswrapper[4751]: I1203 14:26:44.840145 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d36638a2-eb05-47f5-87a6-db726a98e244-catalog-content\") pod \"d36638a2-eb05-47f5-87a6-db726a98e244\" (UID: \"d36638a2-eb05-47f5-87a6-db726a98e244\") " Dec 03 14:26:44 crc kubenswrapper[4751]: I1203 14:26:44.840235 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpcmd\" (UniqueName: \"kubernetes.io/projected/d36638a2-eb05-47f5-87a6-db726a98e244-kube-api-access-kpcmd\") pod \"d36638a2-eb05-47f5-87a6-db726a98e244\" (UID: \"d36638a2-eb05-47f5-87a6-db726a98e244\") " Dec 03 14:26:44 crc kubenswrapper[4751]: I1203 14:26:44.840270 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d36638a2-eb05-47f5-87a6-db726a98e244-utilities\") pod \"d36638a2-eb05-47f5-87a6-db726a98e244\" (UID: \"d36638a2-eb05-47f5-87a6-db726a98e244\") " Dec 03 14:26:44 crc kubenswrapper[4751]: I1203 14:26:44.841591 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d36638a2-eb05-47f5-87a6-db726a98e244-utilities" (OuterVolumeSpecName: "utilities") pod "d36638a2-eb05-47f5-87a6-db726a98e244" (UID: "d36638a2-eb05-47f5-87a6-db726a98e244"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:26:44 crc kubenswrapper[4751]: I1203 14:26:44.858583 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d36638a2-eb05-47f5-87a6-db726a98e244-kube-api-access-kpcmd" (OuterVolumeSpecName: "kube-api-access-kpcmd") pod "d36638a2-eb05-47f5-87a6-db726a98e244" (UID: "d36638a2-eb05-47f5-87a6-db726a98e244"). InnerVolumeSpecName "kube-api-access-kpcmd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:26:44 crc kubenswrapper[4751]: I1203 14:26:44.923044 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d36638a2-eb05-47f5-87a6-db726a98e244-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d36638a2-eb05-47f5-87a6-db726a98e244" (UID: "d36638a2-eb05-47f5-87a6-db726a98e244"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:26:44 crc kubenswrapper[4751]: I1203 14:26:44.941666 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d36638a2-eb05-47f5-87a6-db726a98e244-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:44 crc kubenswrapper[4751]: I1203 14:26:44.941721 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpcmd\" (UniqueName: \"kubernetes.io/projected/d36638a2-eb05-47f5-87a6-db726a98e244-kube-api-access-kpcmd\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:44 crc kubenswrapper[4751]: I1203 14:26:44.941736 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d36638a2-eb05-47f5-87a6-db726a98e244-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:45 crc kubenswrapper[4751]: I1203 14:26:45.531553 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-rwllx" event={"ID":"c09475fb-946d-45c4-8482-4db508ae7459","Type":"ContainerStarted","Data":"640c9627631cb92a9094718399fb02d434a1f6e3ec4cba721fec84bf6f53162d"} Dec 03 14:26:45 crc kubenswrapper[4751]: I1203 14:26:45.539001 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-wb55f" event={"ID":"6bc5bb21-6b5f-4b06-a96a-2e5883752c9a","Type":"ContainerStarted","Data":"60c3b7babb7cbb93959298484a3f97043d62ac5ae518c5f90f978c0120213d54"} Dec 03 14:26:45 crc 
kubenswrapper[4751]: I1203 14:26:45.540185 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-7jwb7" event={"ID":"d42519d9-b7e8-4c0b-bcac-b4269faf605a","Type":"ContainerStarted","Data":"6ff3069899f77a54bc53f9ddbaf9d9a7e8840dc30e7889867c20251032d26673"} Dec 03 14:26:45 crc kubenswrapper[4751]: I1203 14:26:45.540782 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-7jwb7" Dec 03 14:26:45 crc kubenswrapper[4751]: I1203 14:26:45.543408 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86mbc" event={"ID":"d36638a2-eb05-47f5-87a6-db726a98e244","Type":"ContainerDied","Data":"0226a3135861fe4211f481f8fe4a0ae73aa1a140dfe2ba6bd48d72bbaa33dac8"} Dec 03 14:26:45 crc kubenswrapper[4751]: I1203 14:26:45.543481 4751 scope.go:117] "RemoveContainer" containerID="367694b5df47668d5d0ad534cbdbdfb195f6f50b28fe7856e9adf4972e823c07" Dec 03 14:26:45 crc kubenswrapper[4751]: I1203 14:26:45.543536 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-86mbc" Dec 03 14:26:45 crc kubenswrapper[4751]: I1203 14:26:45.548040 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-rwllx" podStartSLOduration=1.6498369990000001 podStartE2EDuration="12.548021008s" podCreationTimestamp="2025-12-03 14:26:33 +0000 UTC" firstStartedPulling="2025-12-03 14:26:34.081637086 +0000 UTC m=+801.069992303" lastFinishedPulling="2025-12-03 14:26:44.979821085 +0000 UTC m=+811.968176312" observedRunningTime="2025-12-03 14:26:45.544573295 +0000 UTC m=+812.532928512" watchObservedRunningTime="2025-12-03 14:26:45.548021008 +0000 UTC m=+812.536376245" Dec 03 14:26:45 crc kubenswrapper[4751]: I1203 14:26:45.573180 4751 scope.go:117] "RemoveContainer" containerID="e960ee3922c928d7a236f36692dfb754b0a0926cef03372e9e26e09de359b4b4" Dec 03 14:26:45 crc kubenswrapper[4751]: I1203 14:26:45.582751 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-7jwb7" podStartSLOduration=1.94278643 podStartE2EDuration="12.581779863s" podCreationTimestamp="2025-12-03 14:26:33 +0000 UTC" firstStartedPulling="2025-12-03 14:26:34.351591134 +0000 UTC m=+801.339946341" lastFinishedPulling="2025-12-03 14:26:44.990584557 +0000 UTC m=+811.978939774" observedRunningTime="2025-12-03 14:26:45.579768229 +0000 UTC m=+812.568123446" watchObservedRunningTime="2025-12-03 14:26:45.581779863 +0000 UTC m=+812.570135080" Dec 03 14:26:45 crc kubenswrapper[4751]: I1203 14:26:45.607170 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-wb55f" podStartSLOduration=1.957337514 podStartE2EDuration="12.607152171s" podCreationTimestamp="2025-12-03 14:26:33 +0000 UTC" firstStartedPulling="2025-12-03 14:26:34.34040964 +0000 UTC m=+801.328764857" lastFinishedPulling="2025-12-03 14:26:44.990224297 +0000 UTC m=+811.978579514" 
observedRunningTime="2025-12-03 14:26:45.603481331 +0000 UTC m=+812.591836558" watchObservedRunningTime="2025-12-03 14:26:45.607152171 +0000 UTC m=+812.595507388" Dec 03 14:26:45 crc kubenswrapper[4751]: I1203 14:26:45.622762 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-86mbc"] Dec 03 14:26:45 crc kubenswrapper[4751]: I1203 14:26:45.632919 4751 scope.go:117] "RemoveContainer" containerID="d3b69891c130a10f3799b27b79605c223a0b4d0426a3a70e35fea0a1d1d7b630" Dec 03 14:26:45 crc kubenswrapper[4751]: I1203 14:26:45.634831 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-86mbc"] Dec 03 14:26:47 crc kubenswrapper[4751]: I1203 14:26:47.323755 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d36638a2-eb05-47f5-87a6-db726a98e244" path="/var/lib/kubelet/pods/d36638a2-eb05-47f5-87a6-db726a98e244/volumes" Dec 03 14:26:53 crc kubenswrapper[4751]: I1203 14:26:53.886207 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-7jwb7" Dec 03 14:27:05 crc kubenswrapper[4751]: I1203 14:27:05.820170 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:27:05 crc kubenswrapper[4751]: I1203 14:27:05.821002 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:27:18 crc kubenswrapper[4751]: I1203 14:27:18.474926 4751 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6"] Dec 03 14:27:18 crc kubenswrapper[4751]: E1203 14:27:18.475755 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d36638a2-eb05-47f5-87a6-db726a98e244" containerName="extract-content" Dec 03 14:27:18 crc kubenswrapper[4751]: I1203 14:27:18.475771 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36638a2-eb05-47f5-87a6-db726a98e244" containerName="extract-content" Dec 03 14:27:18 crc kubenswrapper[4751]: E1203 14:27:18.475789 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d36638a2-eb05-47f5-87a6-db726a98e244" containerName="extract-utilities" Dec 03 14:27:18 crc kubenswrapper[4751]: I1203 14:27:18.475796 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36638a2-eb05-47f5-87a6-db726a98e244" containerName="extract-utilities" Dec 03 14:27:18 crc kubenswrapper[4751]: E1203 14:27:18.475805 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d36638a2-eb05-47f5-87a6-db726a98e244" containerName="registry-server" Dec 03 14:27:18 crc kubenswrapper[4751]: I1203 14:27:18.475813 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36638a2-eb05-47f5-87a6-db726a98e244" containerName="registry-server" Dec 03 14:27:18 crc kubenswrapper[4751]: I1203 14:27:18.475926 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="d36638a2-eb05-47f5-87a6-db726a98e244" containerName="registry-server" Dec 03 14:27:18 crc kubenswrapper[4751]: I1203 14:27:18.476851 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6" Dec 03 14:27:18 crc kubenswrapper[4751]: I1203 14:27:18.486971 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 14:27:18 crc kubenswrapper[4751]: I1203 14:27:18.491174 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6"] Dec 03 14:27:18 crc kubenswrapper[4751]: I1203 14:27:18.598801 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v99g7\" (UniqueName: \"kubernetes.io/projected/5be1a950-9285-46c8-af53-976abeddd5fb-kube-api-access-v99g7\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6\" (UID: \"5be1a950-9285-46c8-af53-976abeddd5fb\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6" Dec 03 14:27:18 crc kubenswrapper[4751]: I1203 14:27:18.598853 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5be1a950-9285-46c8-af53-976abeddd5fb-bundle\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6\" (UID: \"5be1a950-9285-46c8-af53-976abeddd5fb\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6" Dec 03 14:27:18 crc kubenswrapper[4751]: I1203 14:27:18.598892 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5be1a950-9285-46c8-af53-976abeddd5fb-util\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6\" (UID: \"5be1a950-9285-46c8-af53-976abeddd5fb\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6" Dec 03 14:27:18 crc kubenswrapper[4751]: 
I1203 14:27:18.700534 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v99g7\" (UniqueName: \"kubernetes.io/projected/5be1a950-9285-46c8-af53-976abeddd5fb-kube-api-access-v99g7\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6\" (UID: \"5be1a950-9285-46c8-af53-976abeddd5fb\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6" Dec 03 14:27:18 crc kubenswrapper[4751]: I1203 14:27:18.700623 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5be1a950-9285-46c8-af53-976abeddd5fb-bundle\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6\" (UID: \"5be1a950-9285-46c8-af53-976abeddd5fb\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6" Dec 03 14:27:18 crc kubenswrapper[4751]: I1203 14:27:18.700691 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5be1a950-9285-46c8-af53-976abeddd5fb-util\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6\" (UID: \"5be1a950-9285-46c8-af53-976abeddd5fb\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6" Dec 03 14:27:18 crc kubenswrapper[4751]: I1203 14:27:18.701450 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5be1a950-9285-46c8-af53-976abeddd5fb-bundle\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6\" (UID: \"5be1a950-9285-46c8-af53-976abeddd5fb\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6" Dec 03 14:27:18 crc kubenswrapper[4751]: I1203 14:27:18.701566 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/5be1a950-9285-46c8-af53-976abeddd5fb-util\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6\" (UID: \"5be1a950-9285-46c8-af53-976abeddd5fb\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6" Dec 03 14:27:18 crc kubenswrapper[4751]: I1203 14:27:18.725935 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v99g7\" (UniqueName: \"kubernetes.io/projected/5be1a950-9285-46c8-af53-976abeddd5fb-kube-api-access-v99g7\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6\" (UID: \"5be1a950-9285-46c8-af53-976abeddd5fb\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6" Dec 03 14:27:18 crc kubenswrapper[4751]: I1203 14:27:18.792295 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6" Dec 03 14:27:18 crc kubenswrapper[4751]: I1203 14:27:18.993813 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6"] Dec 03 14:27:19 crc kubenswrapper[4751]: W1203 14:27:19.002514 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5be1a950_9285_46c8_af53_976abeddd5fb.slice/crio-ea805edcaf3d3be03e41d5cb7dc79bec7b21ceac4b624cdba5481ba790ee0ab8 WatchSource:0}: Error finding container ea805edcaf3d3be03e41d5cb7dc79bec7b21ceac4b624cdba5481ba790ee0ab8: Status 404 returned error can't find the container with id ea805edcaf3d3be03e41d5cb7dc79bec7b21ceac4b624cdba5481ba790ee0ab8 Dec 03 14:27:19 crc kubenswrapper[4751]: I1203 14:27:19.740212 4751 generic.go:334] "Generic (PLEG): container finished" podID="5be1a950-9285-46c8-af53-976abeddd5fb" containerID="065913d493aecca189000e76ba624bbc35f3bc7d49e0147a762d60e83684491a" exitCode=0 
Dec 03 14:27:19 crc kubenswrapper[4751]: I1203 14:27:19.740373 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6" event={"ID":"5be1a950-9285-46c8-af53-976abeddd5fb","Type":"ContainerDied","Data":"065913d493aecca189000e76ba624bbc35f3bc7d49e0147a762d60e83684491a"} Dec 03 14:27:19 crc kubenswrapper[4751]: I1203 14:27:19.740538 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6" event={"ID":"5be1a950-9285-46c8-af53-976abeddd5fb","Type":"ContainerStarted","Data":"ea805edcaf3d3be03e41d5cb7dc79bec7b21ceac4b624cdba5481ba790ee0ab8"} Dec 03 14:27:20 crc kubenswrapper[4751]: I1203 14:27:20.683683 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6z6ch"] Dec 03 14:27:20 crc kubenswrapper[4751]: I1203 14:27:20.684807 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6z6ch" Dec 03 14:27:20 crc kubenswrapper[4751]: I1203 14:27:20.696769 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6z6ch"] Dec 03 14:27:20 crc kubenswrapper[4751]: I1203 14:27:20.728414 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/400c3408-3589-452a-ab78-5442f2e38ac5-catalog-content\") pod \"redhat-operators-6z6ch\" (UID: \"400c3408-3589-452a-ab78-5442f2e38ac5\") " pod="openshift-marketplace/redhat-operators-6z6ch" Dec 03 14:27:20 crc kubenswrapper[4751]: I1203 14:27:20.728521 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcxn9\" (UniqueName: \"kubernetes.io/projected/400c3408-3589-452a-ab78-5442f2e38ac5-kube-api-access-lcxn9\") pod \"redhat-operators-6z6ch\" (UID: \"400c3408-3589-452a-ab78-5442f2e38ac5\") " pod="openshift-marketplace/redhat-operators-6z6ch" Dec 03 14:27:20 crc kubenswrapper[4751]: I1203 14:27:20.728553 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/400c3408-3589-452a-ab78-5442f2e38ac5-utilities\") pod \"redhat-operators-6z6ch\" (UID: \"400c3408-3589-452a-ab78-5442f2e38ac5\") " pod="openshift-marketplace/redhat-operators-6z6ch" Dec 03 14:27:20 crc kubenswrapper[4751]: I1203 14:27:20.830285 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcxn9\" (UniqueName: \"kubernetes.io/projected/400c3408-3589-452a-ab78-5442f2e38ac5-kube-api-access-lcxn9\") pod \"redhat-operators-6z6ch\" (UID: \"400c3408-3589-452a-ab78-5442f2e38ac5\") " pod="openshift-marketplace/redhat-operators-6z6ch" Dec 03 14:27:20 crc kubenswrapper[4751]: I1203 14:27:20.830343 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/400c3408-3589-452a-ab78-5442f2e38ac5-utilities\") pod \"redhat-operators-6z6ch\" (UID: \"400c3408-3589-452a-ab78-5442f2e38ac5\") " pod="openshift-marketplace/redhat-operators-6z6ch" Dec 03 14:27:20 crc kubenswrapper[4751]: I1203 14:27:20.830412 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/400c3408-3589-452a-ab78-5442f2e38ac5-catalog-content\") pod \"redhat-operators-6z6ch\" (UID: \"400c3408-3589-452a-ab78-5442f2e38ac5\") " pod="openshift-marketplace/redhat-operators-6z6ch" Dec 03 14:27:20 crc kubenswrapper[4751]: I1203 14:27:20.830867 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/400c3408-3589-452a-ab78-5442f2e38ac5-catalog-content\") pod \"redhat-operators-6z6ch\" (UID: \"400c3408-3589-452a-ab78-5442f2e38ac5\") " pod="openshift-marketplace/redhat-operators-6z6ch" Dec 03 14:27:20 crc kubenswrapper[4751]: I1203 14:27:20.830898 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/400c3408-3589-452a-ab78-5442f2e38ac5-utilities\") pod \"redhat-operators-6z6ch\" (UID: \"400c3408-3589-452a-ab78-5442f2e38ac5\") " pod="openshift-marketplace/redhat-operators-6z6ch" Dec 03 14:27:20 crc kubenswrapper[4751]: I1203 14:27:20.853384 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcxn9\" (UniqueName: \"kubernetes.io/projected/400c3408-3589-452a-ab78-5442f2e38ac5-kube-api-access-lcxn9\") pod \"redhat-operators-6z6ch\" (UID: \"400c3408-3589-452a-ab78-5442f2e38ac5\") " pod="openshift-marketplace/redhat-operators-6z6ch" Dec 03 14:27:21 crc kubenswrapper[4751]: I1203 14:27:21.003784 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6z6ch" Dec 03 14:27:21 crc kubenswrapper[4751]: I1203 14:27:21.427221 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6z6ch"] Dec 03 14:27:21 crc kubenswrapper[4751]: W1203 14:27:21.432544 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod400c3408_3589_452a_ab78_5442f2e38ac5.slice/crio-d64764b534e1b34f04c08d727b76d81e3307c2049dd46169588474dff748e970 WatchSource:0}: Error finding container d64764b534e1b34f04c08d727b76d81e3307c2049dd46169588474dff748e970: Status 404 returned error can't find the container with id d64764b534e1b34f04c08d727b76d81e3307c2049dd46169588474dff748e970 Dec 03 14:27:21 crc kubenswrapper[4751]: I1203 14:27:21.704020 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Dec 03 14:27:21 crc kubenswrapper[4751]: I1203 14:27:21.704964 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Dec 03 14:27:21 crc kubenswrapper[4751]: I1203 14:27:21.706392 4751 reflector.go:368] Caches populated for *v1.Secret from object-"minio-dev"/"default-dockercfg-jbtqg" Dec 03 14:27:21 crc kubenswrapper[4751]: I1203 14:27:21.707312 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Dec 03 14:27:21 crc kubenswrapper[4751]: I1203 14:27:21.707336 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Dec 03 14:27:21 crc kubenswrapper[4751]: I1203 14:27:21.712773 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Dec 03 14:27:21 crc kubenswrapper[4751]: I1203 14:27:21.746860 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5tm8\" (UniqueName: \"kubernetes.io/projected/45576c1d-e6ef-47c0-a326-819f06001b8b-kube-api-access-f5tm8\") pod \"minio\" (UID: \"45576c1d-e6ef-47c0-a326-819f06001b8b\") " pod="minio-dev/minio" Dec 03 14:27:21 crc kubenswrapper[4751]: I1203 14:27:21.746896 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0b249f78-f921-4f05-a06c-be52b5c30d04\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0b249f78-f921-4f05-a06c-be52b5c30d04\") pod \"minio\" (UID: \"45576c1d-e6ef-47c0-a326-819f06001b8b\") " pod="minio-dev/minio" Dec 03 14:27:21 crc kubenswrapper[4751]: I1203 14:27:21.751118 4751 generic.go:334] "Generic (PLEG): container finished" podID="400c3408-3589-452a-ab78-5442f2e38ac5" containerID="3d7a088b0e95698c0d29866cc7348ca1482c81c53d0d573dbb4cdae8fb9495ce" exitCode=0 Dec 03 14:27:21 crc kubenswrapper[4751]: I1203 14:27:21.751181 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6z6ch" 
event={"ID":"400c3408-3589-452a-ab78-5442f2e38ac5","Type":"ContainerDied","Data":"3d7a088b0e95698c0d29866cc7348ca1482c81c53d0d573dbb4cdae8fb9495ce"} Dec 03 14:27:21 crc kubenswrapper[4751]: I1203 14:27:21.751215 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6z6ch" event={"ID":"400c3408-3589-452a-ab78-5442f2e38ac5","Type":"ContainerStarted","Data":"d64764b534e1b34f04c08d727b76d81e3307c2049dd46169588474dff748e970"} Dec 03 14:27:21 crc kubenswrapper[4751]: I1203 14:27:21.752574 4751 generic.go:334] "Generic (PLEG): container finished" podID="5be1a950-9285-46c8-af53-976abeddd5fb" containerID="a3ce94c2092772f7c3104fed3aefba4a3465a9a7cfbf248ca76c1435b66dd736" exitCode=0 Dec 03 14:27:21 crc kubenswrapper[4751]: I1203 14:27:21.752606 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6" event={"ID":"5be1a950-9285-46c8-af53-976abeddd5fb","Type":"ContainerDied","Data":"a3ce94c2092772f7c3104fed3aefba4a3465a9a7cfbf248ca76c1435b66dd736"} Dec 03 14:27:21 crc kubenswrapper[4751]: I1203 14:27:21.848066 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5tm8\" (UniqueName: \"kubernetes.io/projected/45576c1d-e6ef-47c0-a326-819f06001b8b-kube-api-access-f5tm8\") pod \"minio\" (UID: \"45576c1d-e6ef-47c0-a326-819f06001b8b\") " pod="minio-dev/minio" Dec 03 14:27:21 crc kubenswrapper[4751]: I1203 14:27:21.848123 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0b249f78-f921-4f05-a06c-be52b5c30d04\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0b249f78-f921-4f05-a06c-be52b5c30d04\") pod \"minio\" (UID: \"45576c1d-e6ef-47c0-a326-819f06001b8b\") " pod="minio-dev/minio" Dec 03 14:27:21 crc kubenswrapper[4751]: I1203 14:27:21.852173 4751 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. 
Skipping MountDevice... Dec 03 14:27:21 crc kubenswrapper[4751]: I1203 14:27:21.852213 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0b249f78-f921-4f05-a06c-be52b5c30d04\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0b249f78-f921-4f05-a06c-be52b5c30d04\") pod \"minio\" (UID: \"45576c1d-e6ef-47c0-a326-819f06001b8b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1597062e56a23d8bb28f20a4f9ec4ce8594ed649864138942cf0e6bcecff267d/globalmount\"" pod="minio-dev/minio" Dec 03 14:27:21 crc kubenswrapper[4751]: I1203 14:27:21.870202 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5tm8\" (UniqueName: \"kubernetes.io/projected/45576c1d-e6ef-47c0-a326-819f06001b8b-kube-api-access-f5tm8\") pod \"minio\" (UID: \"45576c1d-e6ef-47c0-a326-819f06001b8b\") " pod="minio-dev/minio" Dec 03 14:27:21 crc kubenswrapper[4751]: I1203 14:27:21.892413 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0b249f78-f921-4f05-a06c-be52b5c30d04\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0b249f78-f921-4f05-a06c-be52b5c30d04\") pod \"minio\" (UID: \"45576c1d-e6ef-47c0-a326-819f06001b8b\") " pod="minio-dev/minio" Dec 03 14:27:22 crc kubenswrapper[4751]: I1203 14:27:22.018796 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Dec 03 14:27:22 crc kubenswrapper[4751]: I1203 14:27:22.184691 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Dec 03 14:27:22 crc kubenswrapper[4751]: W1203 14:27:22.190084 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45576c1d_e6ef_47c0_a326_819f06001b8b.slice/crio-f5d7f597258390f1aa05ef42cf3ee737ec22401a55b1b214f2593c64ed9e84f6 WatchSource:0}: Error finding container f5d7f597258390f1aa05ef42cf3ee737ec22401a55b1b214f2593c64ed9e84f6: Status 404 returned error can't find the container with id f5d7f597258390f1aa05ef42cf3ee737ec22401a55b1b214f2593c64ed9e84f6 Dec 03 14:27:22 crc kubenswrapper[4751]: I1203 14:27:22.807601 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"45576c1d-e6ef-47c0-a326-819f06001b8b","Type":"ContainerStarted","Data":"f5d7f597258390f1aa05ef42cf3ee737ec22401a55b1b214f2593c64ed9e84f6"} Dec 03 14:27:22 crc kubenswrapper[4751]: I1203 14:27:22.834953 4751 generic.go:334] "Generic (PLEG): container finished" podID="5be1a950-9285-46c8-af53-976abeddd5fb" containerID="f7258921c9f14c8f0ee9a24fa1ed77c78a2eda76c078f682765a78128c311e34" exitCode=0 Dec 03 14:27:22 crc kubenswrapper[4751]: I1203 14:27:22.834991 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6" event={"ID":"5be1a950-9285-46c8-af53-976abeddd5fb","Type":"ContainerDied","Data":"f7258921c9f14c8f0ee9a24fa1ed77c78a2eda76c078f682765a78128c311e34"} Dec 03 14:27:24 crc kubenswrapper[4751]: I1203 14:27:24.629944 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6" Dec 03 14:27:24 crc kubenswrapper[4751]: I1203 14:27:24.690188 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5be1a950-9285-46c8-af53-976abeddd5fb-util\") pod \"5be1a950-9285-46c8-af53-976abeddd5fb\" (UID: \"5be1a950-9285-46c8-af53-976abeddd5fb\") " Dec 03 14:27:24 crc kubenswrapper[4751]: I1203 14:27:24.690241 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5be1a950-9285-46c8-af53-976abeddd5fb-bundle\") pod \"5be1a950-9285-46c8-af53-976abeddd5fb\" (UID: \"5be1a950-9285-46c8-af53-976abeddd5fb\") " Dec 03 14:27:24 crc kubenswrapper[4751]: I1203 14:27:24.690297 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v99g7\" (UniqueName: \"kubernetes.io/projected/5be1a950-9285-46c8-af53-976abeddd5fb-kube-api-access-v99g7\") pod \"5be1a950-9285-46c8-af53-976abeddd5fb\" (UID: \"5be1a950-9285-46c8-af53-976abeddd5fb\") " Dec 03 14:27:24 crc kubenswrapper[4751]: I1203 14:27:24.692786 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5be1a950-9285-46c8-af53-976abeddd5fb-bundle" (OuterVolumeSpecName: "bundle") pod "5be1a950-9285-46c8-af53-976abeddd5fb" (UID: "5be1a950-9285-46c8-af53-976abeddd5fb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:27:24 crc kubenswrapper[4751]: I1203 14:27:24.704435 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5be1a950-9285-46c8-af53-976abeddd5fb-kube-api-access-v99g7" (OuterVolumeSpecName: "kube-api-access-v99g7") pod "5be1a950-9285-46c8-af53-976abeddd5fb" (UID: "5be1a950-9285-46c8-af53-976abeddd5fb"). InnerVolumeSpecName "kube-api-access-v99g7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:27:24 crc kubenswrapper[4751]: I1203 14:27:24.706821 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5be1a950-9285-46c8-af53-976abeddd5fb-util" (OuterVolumeSpecName: "util") pod "5be1a950-9285-46c8-af53-976abeddd5fb" (UID: "5be1a950-9285-46c8-af53-976abeddd5fb"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:27:24 crc kubenswrapper[4751]: I1203 14:27:24.792103 4751 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5be1a950-9285-46c8-af53-976abeddd5fb-util\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:24 crc kubenswrapper[4751]: I1203 14:27:24.792143 4751 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5be1a950-9285-46c8-af53-976abeddd5fb-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:24 crc kubenswrapper[4751]: I1203 14:27:24.792152 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v99g7\" (UniqueName: \"kubernetes.io/projected/5be1a950-9285-46c8-af53-976abeddd5fb-kube-api-access-v99g7\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:24 crc kubenswrapper[4751]: I1203 14:27:24.847336 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6" Dec 03 14:27:24 crc kubenswrapper[4751]: I1203 14:27:24.847237 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6" event={"ID":"5be1a950-9285-46c8-af53-976abeddd5fb","Type":"ContainerDied","Data":"ea805edcaf3d3be03e41d5cb7dc79bec7b21ceac4b624cdba5481ba790ee0ab8"} Dec 03 14:27:24 crc kubenswrapper[4751]: I1203 14:27:24.849443 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea805edcaf3d3be03e41d5cb7dc79bec7b21ceac4b624cdba5481ba790ee0ab8" Dec 03 14:27:27 crc kubenswrapper[4751]: I1203 14:27:27.880086 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6z6ch" event={"ID":"400c3408-3589-452a-ab78-5442f2e38ac5","Type":"ContainerStarted","Data":"074748c57093c43a3854c5c8b5df8784f80bc3f626855937898a9102b45dbfb3"} Dec 03 14:27:28 crc kubenswrapper[4751]: I1203 14:27:28.888455 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"45576c1d-e6ef-47c0-a326-819f06001b8b","Type":"ContainerStarted","Data":"4b642f52d717bda7e3056114cfbb24abccd3427f7af553fab59d8cfaf0972e95"} Dec 03 14:27:28 crc kubenswrapper[4751]: I1203 14:27:28.891395 4751 generic.go:334] "Generic (PLEG): container finished" podID="400c3408-3589-452a-ab78-5442f2e38ac5" containerID="074748c57093c43a3854c5c8b5df8784f80bc3f626855937898a9102b45dbfb3" exitCode=0 Dec 03 14:27:28 crc kubenswrapper[4751]: I1203 14:27:28.891453 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6z6ch" event={"ID":"400c3408-3589-452a-ab78-5442f2e38ac5","Type":"ContainerDied","Data":"074748c57093c43a3854c5c8b5df8784f80bc3f626855937898a9102b45dbfb3"} Dec 03 14:27:28 crc kubenswrapper[4751]: I1203 14:27:28.910982 4751 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="minio-dev/minio" podStartSLOduration=4.8777799 podStartE2EDuration="9.9109574s" podCreationTimestamp="2025-12-03 14:27:19 +0000 UTC" firstStartedPulling="2025-12-03 14:27:22.19155371 +0000 UTC m=+849.179908927" lastFinishedPulling="2025-12-03 14:27:27.22473121 +0000 UTC m=+854.213086427" observedRunningTime="2025-12-03 14:27:28.905612395 +0000 UTC m=+855.893967653" watchObservedRunningTime="2025-12-03 14:27:28.9109574 +0000 UTC m=+855.899312637" Dec 03 14:27:29 crc kubenswrapper[4751]: I1203 14:27:29.900958 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6z6ch" event={"ID":"400c3408-3589-452a-ab78-5442f2e38ac5","Type":"ContainerStarted","Data":"0cc3e0cc94ab8c6547f7d54bb90a3e24b09fe49598f6f28f3f6da56bbae43b4e"} Dec 03 14:27:29 crc kubenswrapper[4751]: I1203 14:27:29.924509 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6z6ch" podStartSLOduration=2.317936007 podStartE2EDuration="9.924490146s" podCreationTimestamp="2025-12-03 14:27:20 +0000 UTC" firstStartedPulling="2025-12-03 14:27:21.752226481 +0000 UTC m=+848.740581688" lastFinishedPulling="2025-12-03 14:27:29.35878061 +0000 UTC m=+856.347135827" observedRunningTime="2025-12-03 14:27:29.919353717 +0000 UTC m=+856.907708944" watchObservedRunningTime="2025-12-03 14:27:29.924490146 +0000 UTC m=+856.912845363" Dec 03 14:27:31 crc kubenswrapper[4751]: I1203 14:27:31.004509 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6z6ch" Dec 03 14:27:31 crc kubenswrapper[4751]: I1203 14:27:31.004564 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6z6ch" Dec 03 14:27:31 crc kubenswrapper[4751]: I1203 14:27:31.564251 4751 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct"] Dec 03 14:27:31 crc kubenswrapper[4751]: E1203 14:27:31.564496 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5be1a950-9285-46c8-af53-976abeddd5fb" containerName="util" Dec 03 14:27:31 crc kubenswrapper[4751]: I1203 14:27:31.564507 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="5be1a950-9285-46c8-af53-976abeddd5fb" containerName="util" Dec 03 14:27:31 crc kubenswrapper[4751]: E1203 14:27:31.564518 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5be1a950-9285-46c8-af53-976abeddd5fb" containerName="extract" Dec 03 14:27:31 crc kubenswrapper[4751]: I1203 14:27:31.564524 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="5be1a950-9285-46c8-af53-976abeddd5fb" containerName="extract" Dec 03 14:27:31 crc kubenswrapper[4751]: E1203 14:27:31.564532 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5be1a950-9285-46c8-af53-976abeddd5fb" containerName="pull" Dec 03 14:27:31 crc kubenswrapper[4751]: I1203 14:27:31.564539 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="5be1a950-9285-46c8-af53-976abeddd5fb" containerName="pull" Dec 03 14:27:31 crc kubenswrapper[4751]: I1203 14:27:31.564632 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="5be1a950-9285-46c8-af53-976abeddd5fb" containerName="extract" Dec 03 14:27:31 crc kubenswrapper[4751]: I1203 14:27:31.565350 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct" Dec 03 14:27:31 crc kubenswrapper[4751]: I1203 14:27:31.567300 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 14:27:31 crc kubenswrapper[4751]: I1203 14:27:31.576336 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct"] Dec 03 14:27:31 crc kubenswrapper[4751]: I1203 14:27:31.584296 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a93d622-3d27-473f-92bb-ffe7b9ec4239-util\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct\" (UID: \"9a93d622-3d27-473f-92bb-ffe7b9ec4239\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct" Dec 03 14:27:31 crc kubenswrapper[4751]: I1203 14:27:31.584398 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a93d622-3d27-473f-92bb-ffe7b9ec4239-bundle\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct\" (UID: \"9a93d622-3d27-473f-92bb-ffe7b9ec4239\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct" Dec 03 14:27:31 crc kubenswrapper[4751]: I1203 14:27:31.584422 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlvpj\" (UniqueName: \"kubernetes.io/projected/9a93d622-3d27-473f-92bb-ffe7b9ec4239-kube-api-access-mlvpj\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct\" (UID: \"9a93d622-3d27-473f-92bb-ffe7b9ec4239\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct" Dec 03 14:27:31 crc kubenswrapper[4751]: 
I1203 14:27:31.685678 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a93d622-3d27-473f-92bb-ffe7b9ec4239-bundle\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct\" (UID: \"9a93d622-3d27-473f-92bb-ffe7b9ec4239\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct" Dec 03 14:27:31 crc kubenswrapper[4751]: I1203 14:27:31.685743 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlvpj\" (UniqueName: \"kubernetes.io/projected/9a93d622-3d27-473f-92bb-ffe7b9ec4239-kube-api-access-mlvpj\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct\" (UID: \"9a93d622-3d27-473f-92bb-ffe7b9ec4239\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct" Dec 03 14:27:31 crc kubenswrapper[4751]: I1203 14:27:31.685820 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a93d622-3d27-473f-92bb-ffe7b9ec4239-util\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct\" (UID: \"9a93d622-3d27-473f-92bb-ffe7b9ec4239\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct" Dec 03 14:27:31 crc kubenswrapper[4751]: I1203 14:27:31.686391 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a93d622-3d27-473f-92bb-ffe7b9ec4239-util\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct\" (UID: \"9a93d622-3d27-473f-92bb-ffe7b9ec4239\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct" Dec 03 14:27:31 crc kubenswrapper[4751]: I1203 14:27:31.686686 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/9a93d622-3d27-473f-92bb-ffe7b9ec4239-bundle\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct\" (UID: \"9a93d622-3d27-473f-92bb-ffe7b9ec4239\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct" Dec 03 14:27:31 crc kubenswrapper[4751]: I1203 14:27:31.707574 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlvpj\" (UniqueName: \"kubernetes.io/projected/9a93d622-3d27-473f-92bb-ffe7b9ec4239-kube-api-access-mlvpj\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct\" (UID: \"9a93d622-3d27-473f-92bb-ffe7b9ec4239\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct" Dec 03 14:27:31 crc kubenswrapper[4751]: I1203 14:27:31.881295 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct" Dec 03 14:27:32 crc kubenswrapper[4751]: I1203 14:27:32.051541 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6z6ch" podUID="400c3408-3589-452a-ab78-5442f2e38ac5" containerName="registry-server" probeResult="failure" output=< Dec 03 14:27:32 crc kubenswrapper[4751]: timeout: failed to connect service ":50051" within 1s Dec 03 14:27:32 crc kubenswrapper[4751]: > Dec 03 14:27:32 crc kubenswrapper[4751]: I1203 14:27:32.134745 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct"] Dec 03 14:27:32 crc kubenswrapper[4751]: W1203 14:27:32.141134 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a93d622_3d27_473f_92bb_ffe7b9ec4239.slice/crio-29133ff0931d2ff82bcd2160c24d5ce39c92d4c1bffd4563c470632fd9ae41cd WatchSource:0}: Error finding container 
29133ff0931d2ff82bcd2160c24d5ce39c92d4c1bffd4563c470632fd9ae41cd: Status 404 returned error can't find the container with id 29133ff0931d2ff82bcd2160c24d5ce39c92d4c1bffd4563c470632fd9ae41cd Dec 03 14:27:32 crc kubenswrapper[4751]: I1203 14:27:32.918170 4751 generic.go:334] "Generic (PLEG): container finished" podID="9a93d622-3d27-473f-92bb-ffe7b9ec4239" containerID="a59226f4cfe9ad7c3d5c2df3d17bdf23d455d31e5fe66e485fc0e6b1f097843d" exitCode=0 Dec 03 14:27:32 crc kubenswrapper[4751]: I1203 14:27:32.918394 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct" event={"ID":"9a93d622-3d27-473f-92bb-ffe7b9ec4239","Type":"ContainerDied","Data":"a59226f4cfe9ad7c3d5c2df3d17bdf23d455d31e5fe66e485fc0e6b1f097843d"} Dec 03 14:27:32 crc kubenswrapper[4751]: I1203 14:27:32.918488 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct" event={"ID":"9a93d622-3d27-473f-92bb-ffe7b9ec4239","Type":"ContainerStarted","Data":"29133ff0931d2ff82bcd2160c24d5ce39c92d4c1bffd4563c470632fd9ae41cd"} Dec 03 14:27:33 crc kubenswrapper[4751]: I1203 14:27:33.127581 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-766794d8b8-zzghr"] Dec 03 14:27:33 crc kubenswrapper[4751]: I1203 14:27:33.128475 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-766794d8b8-zzghr" Dec 03 14:27:33 crc kubenswrapper[4751]: I1203 14:27:33.132035 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Dec 03 14:27:33 crc kubenswrapper[4751]: I1203 14:27:33.132073 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Dec 03 14:27:33 crc kubenswrapper[4751]: I1203 14:27:33.134631 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-rg6s2" Dec 03 14:27:33 crc kubenswrapper[4751]: I1203 14:27:33.135460 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Dec 03 14:27:33 crc kubenswrapper[4751]: I1203 14:27:33.136742 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Dec 03 14:27:33 crc kubenswrapper[4751]: I1203 14:27:33.137264 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Dec 03 14:27:33 crc kubenswrapper[4751]: I1203 14:27:33.158009 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-766794d8b8-zzghr"] Dec 03 14:27:33 crc kubenswrapper[4751]: I1203 14:27:33.207391 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238-apiservice-cert\") pod \"loki-operator-controller-manager-766794d8b8-zzghr\" (UID: \"b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238\") " pod="openshift-operators-redhat/loki-operator-controller-manager-766794d8b8-zzghr" Dec 03 14:27:33 crc kubenswrapper[4751]: I1203 
14:27:33.207487 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238-webhook-cert\") pod \"loki-operator-controller-manager-766794d8b8-zzghr\" (UID: \"b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238\") " pod="openshift-operators-redhat/loki-operator-controller-manager-766794d8b8-zzghr" Dec 03 14:27:33 crc kubenswrapper[4751]: I1203 14:27:33.207521 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td7ml\" (UniqueName: \"kubernetes.io/projected/b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238-kube-api-access-td7ml\") pod \"loki-operator-controller-manager-766794d8b8-zzghr\" (UID: \"b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238\") " pod="openshift-operators-redhat/loki-operator-controller-manager-766794d8b8-zzghr" Dec 03 14:27:33 crc kubenswrapper[4751]: I1203 14:27:33.207550 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-766794d8b8-zzghr\" (UID: \"b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238\") " pod="openshift-operators-redhat/loki-operator-controller-manager-766794d8b8-zzghr" Dec 03 14:27:33 crc kubenswrapper[4751]: I1203 14:27:33.207853 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238-manager-config\") pod \"loki-operator-controller-manager-766794d8b8-zzghr\" (UID: \"b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238\") " pod="openshift-operators-redhat/loki-operator-controller-manager-766794d8b8-zzghr" Dec 03 14:27:33 crc kubenswrapper[4751]: I1203 14:27:33.308801 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"manager-config\" (UniqueName: \"kubernetes.io/configmap/b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238-manager-config\") pod \"loki-operator-controller-manager-766794d8b8-zzghr\" (UID: \"b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238\") " pod="openshift-operators-redhat/loki-operator-controller-manager-766794d8b8-zzghr" Dec 03 14:27:33 crc kubenswrapper[4751]: I1203 14:27:33.308866 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238-apiservice-cert\") pod \"loki-operator-controller-manager-766794d8b8-zzghr\" (UID: \"b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238\") " pod="openshift-operators-redhat/loki-operator-controller-manager-766794d8b8-zzghr" Dec 03 14:27:33 crc kubenswrapper[4751]: I1203 14:27:33.308893 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238-webhook-cert\") pod \"loki-operator-controller-manager-766794d8b8-zzghr\" (UID: \"b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238\") " pod="openshift-operators-redhat/loki-operator-controller-manager-766794d8b8-zzghr" Dec 03 14:27:33 crc kubenswrapper[4751]: I1203 14:27:33.308914 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td7ml\" (UniqueName: \"kubernetes.io/projected/b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238-kube-api-access-td7ml\") pod \"loki-operator-controller-manager-766794d8b8-zzghr\" (UID: \"b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238\") " pod="openshift-operators-redhat/loki-operator-controller-manager-766794d8b8-zzghr" Dec 03 14:27:33 crc kubenswrapper[4751]: I1203 14:27:33.308932 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-766794d8b8-zzghr\" (UID: 
\"b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238\") " pod="openshift-operators-redhat/loki-operator-controller-manager-766794d8b8-zzghr" Dec 03 14:27:33 crc kubenswrapper[4751]: I1203 14:27:33.310281 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238-manager-config\") pod \"loki-operator-controller-manager-766794d8b8-zzghr\" (UID: \"b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238\") " pod="openshift-operators-redhat/loki-operator-controller-manager-766794d8b8-zzghr" Dec 03 14:27:33 crc kubenswrapper[4751]: I1203 14:27:33.314126 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238-apiservice-cert\") pod \"loki-operator-controller-manager-766794d8b8-zzghr\" (UID: \"b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238\") " pod="openshift-operators-redhat/loki-operator-controller-manager-766794d8b8-zzghr" Dec 03 14:27:33 crc kubenswrapper[4751]: I1203 14:27:33.314176 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238-webhook-cert\") pod \"loki-operator-controller-manager-766794d8b8-zzghr\" (UID: \"b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238\") " pod="openshift-operators-redhat/loki-operator-controller-manager-766794d8b8-zzghr" Dec 03 14:27:33 crc kubenswrapper[4751]: I1203 14:27:33.319506 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-766794d8b8-zzghr\" (UID: \"b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238\") " pod="openshift-operators-redhat/loki-operator-controller-manager-766794d8b8-zzghr" Dec 03 14:27:33 crc kubenswrapper[4751]: I1203 14:27:33.334270 4751 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-td7ml\" (UniqueName: \"kubernetes.io/projected/b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238-kube-api-access-td7ml\") pod \"loki-operator-controller-manager-766794d8b8-zzghr\" (UID: \"b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238\") " pod="openshift-operators-redhat/loki-operator-controller-manager-766794d8b8-zzghr" Dec 03 14:27:33 crc kubenswrapper[4751]: I1203 14:27:33.444536 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-766794d8b8-zzghr" Dec 03 14:27:33 crc kubenswrapper[4751]: I1203 14:27:33.654831 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-766794d8b8-zzghr"] Dec 03 14:27:33 crc kubenswrapper[4751]: W1203 14:27:33.661485 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb44b61b1_5d61_4f3c_b4bc_a1a1d9c44238.slice/crio-240c95f5b0e061c2cdb2c651891e7d3260dc3c7af2236bb2a4de129f49b5a194 WatchSource:0}: Error finding container 240c95f5b0e061c2cdb2c651891e7d3260dc3c7af2236bb2a4de129f49b5a194: Status 404 returned error can't find the container with id 240c95f5b0e061c2cdb2c651891e7d3260dc3c7af2236bb2a4de129f49b5a194 Dec 03 14:27:33 crc kubenswrapper[4751]: I1203 14:27:33.924596 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-766794d8b8-zzghr" event={"ID":"b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238","Type":"ContainerStarted","Data":"240c95f5b0e061c2cdb2c651891e7d3260dc3c7af2236bb2a4de129f49b5a194"} Dec 03 14:27:35 crc kubenswrapper[4751]: I1203 14:27:35.820763 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Dec 03 14:27:35 crc kubenswrapper[4751]: I1203 14:27:35.821304 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:27:35 crc kubenswrapper[4751]: I1203 14:27:35.821384 4751 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-djf67" Dec 03 14:27:35 crc kubenswrapper[4751]: I1203 14:27:35.822068 4751 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"022494d9f3dab8e8955cedfdb1fecb645926d841488965605deafc394884f056"} pod="openshift-machine-config-operator/machine-config-daemon-djf67" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 14:27:35 crc kubenswrapper[4751]: I1203 14:27:35.822133 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" containerID="cri-o://022494d9f3dab8e8955cedfdb1fecb645926d841488965605deafc394884f056" gracePeriod=600 Dec 03 14:27:35 crc kubenswrapper[4751]: I1203 14:27:35.941038 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct" event={"ID":"9a93d622-3d27-473f-92bb-ffe7b9ec4239","Type":"ContainerStarted","Data":"93e9514e5bf206c48837c13689741eb2a80f6e5e1d915f19758e9325a8ff08b4"} Dec 03 14:27:36 crc kubenswrapper[4751]: I1203 14:27:36.949609 4751 generic.go:334] "Generic (PLEG): container finished" podID="385620eb-744d-423e-b02b-1274f3075689" 
containerID="022494d9f3dab8e8955cedfdb1fecb645926d841488965605deafc394884f056" exitCode=0 Dec 03 14:27:36 crc kubenswrapper[4751]: I1203 14:27:36.950772 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerDied","Data":"022494d9f3dab8e8955cedfdb1fecb645926d841488965605deafc394884f056"} Dec 03 14:27:36 crc kubenswrapper[4751]: I1203 14:27:36.950820 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerStarted","Data":"6554aa9d5e7898bf5e07fe04c4800b61a41046f1b56d94f87ff1d09b45063fa3"} Dec 03 14:27:36 crc kubenswrapper[4751]: I1203 14:27:36.950839 4751 scope.go:117] "RemoveContainer" containerID="4b56cbf4f0b4c2218f219ed756acd543426f8d25cb023bb8168954d5d1f9f3a6" Dec 03 14:27:36 crc kubenswrapper[4751]: I1203 14:27:36.960074 4751 generic.go:334] "Generic (PLEG): container finished" podID="9a93d622-3d27-473f-92bb-ffe7b9ec4239" containerID="93e9514e5bf206c48837c13689741eb2a80f6e5e1d915f19758e9325a8ff08b4" exitCode=0 Dec 03 14:27:36 crc kubenswrapper[4751]: I1203 14:27:36.960111 4751 generic.go:334] "Generic (PLEG): container finished" podID="9a93d622-3d27-473f-92bb-ffe7b9ec4239" containerID="c98c105ea750eba248271a58a303822445417d04fbbfa37df318bc4bd5322e97" exitCode=0 Dec 03 14:27:36 crc kubenswrapper[4751]: I1203 14:27:36.960132 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct" event={"ID":"9a93d622-3d27-473f-92bb-ffe7b9ec4239","Type":"ContainerDied","Data":"93e9514e5bf206c48837c13689741eb2a80f6e5e1d915f19758e9325a8ff08b4"} Dec 03 14:27:36 crc kubenswrapper[4751]: I1203 14:27:36.960158 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct" event={"ID":"9a93d622-3d27-473f-92bb-ffe7b9ec4239","Type":"ContainerDied","Data":"c98c105ea750eba248271a58a303822445417d04fbbfa37df318bc4bd5322e97"} Dec 03 14:27:41 crc kubenswrapper[4751]: I1203 14:27:41.074735 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6z6ch" Dec 03 14:27:41 crc kubenswrapper[4751]: I1203 14:27:41.138074 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6z6ch" Dec 03 14:27:41 crc kubenswrapper[4751]: I1203 14:27:41.306710 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6z6ch"] Dec 03 14:27:42 crc kubenswrapper[4751]: I1203 14:27:42.511694 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct" Dec 03 14:27:42 crc kubenswrapper[4751]: I1203 14:27:42.571684 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a93d622-3d27-473f-92bb-ffe7b9ec4239-bundle\") pod \"9a93d622-3d27-473f-92bb-ffe7b9ec4239\" (UID: \"9a93d622-3d27-473f-92bb-ffe7b9ec4239\") " Dec 03 14:27:42 crc kubenswrapper[4751]: I1203 14:27:42.571785 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlvpj\" (UniqueName: \"kubernetes.io/projected/9a93d622-3d27-473f-92bb-ffe7b9ec4239-kube-api-access-mlvpj\") pod \"9a93d622-3d27-473f-92bb-ffe7b9ec4239\" (UID: \"9a93d622-3d27-473f-92bb-ffe7b9ec4239\") " Dec 03 14:27:42 crc kubenswrapper[4751]: I1203 14:27:42.571946 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a93d622-3d27-473f-92bb-ffe7b9ec4239-util\") pod 
\"9a93d622-3d27-473f-92bb-ffe7b9ec4239\" (UID: \"9a93d622-3d27-473f-92bb-ffe7b9ec4239\") " Dec 03 14:27:42 crc kubenswrapper[4751]: I1203 14:27:42.573127 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a93d622-3d27-473f-92bb-ffe7b9ec4239-bundle" (OuterVolumeSpecName: "bundle") pod "9a93d622-3d27-473f-92bb-ffe7b9ec4239" (UID: "9a93d622-3d27-473f-92bb-ffe7b9ec4239"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:27:42 crc kubenswrapper[4751]: I1203 14:27:42.576818 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a93d622-3d27-473f-92bb-ffe7b9ec4239-kube-api-access-mlvpj" (OuterVolumeSpecName: "kube-api-access-mlvpj") pod "9a93d622-3d27-473f-92bb-ffe7b9ec4239" (UID: "9a93d622-3d27-473f-92bb-ffe7b9ec4239"). InnerVolumeSpecName "kube-api-access-mlvpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:27:42 crc kubenswrapper[4751]: I1203 14:27:42.582147 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a93d622-3d27-473f-92bb-ffe7b9ec4239-util" (OuterVolumeSpecName: "util") pod "9a93d622-3d27-473f-92bb-ffe7b9ec4239" (UID: "9a93d622-3d27-473f-92bb-ffe7b9ec4239"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:27:42 crc kubenswrapper[4751]: I1203 14:27:42.677752 4751 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a93d622-3d27-473f-92bb-ffe7b9ec4239-util\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:42 crc kubenswrapper[4751]: I1203 14:27:42.678121 4751 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a93d622-3d27-473f-92bb-ffe7b9ec4239-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:42 crc kubenswrapper[4751]: I1203 14:27:42.678136 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlvpj\" (UniqueName: \"kubernetes.io/projected/9a93d622-3d27-473f-92bb-ffe7b9ec4239-kube-api-access-mlvpj\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:43 crc kubenswrapper[4751]: I1203 14:27:42.999989 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct" event={"ID":"9a93d622-3d27-473f-92bb-ffe7b9ec4239","Type":"ContainerDied","Data":"29133ff0931d2ff82bcd2160c24d5ce39c92d4c1bffd4563c470632fd9ae41cd"} Dec 03 14:27:43 crc kubenswrapper[4751]: I1203 14:27:43.000033 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29133ff0931d2ff82bcd2160c24d5ce39c92d4c1bffd4563c470632fd9ae41cd" Dec 03 14:27:43 crc kubenswrapper[4751]: I1203 14:27:43.000070 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct" Dec 03 14:27:43 crc kubenswrapper[4751]: I1203 14:27:43.001691 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-766794d8b8-zzghr" event={"ID":"b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238","Type":"ContainerStarted","Data":"f6815262169e14760e296fcb3eaf51453f58179934c5146aaa334d4deb685928"} Dec 03 14:27:43 crc kubenswrapper[4751]: I1203 14:27:43.001888 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6z6ch" podUID="400c3408-3589-452a-ab78-5442f2e38ac5" containerName="registry-server" containerID="cri-o://0cc3e0cc94ab8c6547f7d54bb90a3e24b09fe49598f6f28f3f6da56bbae43b4e" gracePeriod=2 Dec 03 14:27:43 crc kubenswrapper[4751]: I1203 14:27:43.305048 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6z6ch" Dec 03 14:27:43 crc kubenswrapper[4751]: I1203 14:27:43.385754 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcxn9\" (UniqueName: \"kubernetes.io/projected/400c3408-3589-452a-ab78-5442f2e38ac5-kube-api-access-lcxn9\") pod \"400c3408-3589-452a-ab78-5442f2e38ac5\" (UID: \"400c3408-3589-452a-ab78-5442f2e38ac5\") " Dec 03 14:27:43 crc kubenswrapper[4751]: I1203 14:27:43.385864 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/400c3408-3589-452a-ab78-5442f2e38ac5-utilities\") pod \"400c3408-3589-452a-ab78-5442f2e38ac5\" (UID: \"400c3408-3589-452a-ab78-5442f2e38ac5\") " Dec 03 14:27:43 crc kubenswrapper[4751]: I1203 14:27:43.385922 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/400c3408-3589-452a-ab78-5442f2e38ac5-catalog-content\") pod \"400c3408-3589-452a-ab78-5442f2e38ac5\" (UID: \"400c3408-3589-452a-ab78-5442f2e38ac5\") " Dec 03 14:27:43 crc kubenswrapper[4751]: I1203 14:27:43.386839 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/400c3408-3589-452a-ab78-5442f2e38ac5-utilities" (OuterVolumeSpecName: "utilities") pod "400c3408-3589-452a-ab78-5442f2e38ac5" (UID: "400c3408-3589-452a-ab78-5442f2e38ac5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:27:43 crc kubenswrapper[4751]: I1203 14:27:43.388125 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/400c3408-3589-452a-ab78-5442f2e38ac5-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:43 crc kubenswrapper[4751]: I1203 14:27:43.391346 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/400c3408-3589-452a-ab78-5442f2e38ac5-kube-api-access-lcxn9" (OuterVolumeSpecName: "kube-api-access-lcxn9") pod "400c3408-3589-452a-ab78-5442f2e38ac5" (UID: "400c3408-3589-452a-ab78-5442f2e38ac5"). InnerVolumeSpecName "kube-api-access-lcxn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:27:43 crc kubenswrapper[4751]: I1203 14:27:43.490211 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcxn9\" (UniqueName: \"kubernetes.io/projected/400c3408-3589-452a-ab78-5442f2e38ac5-kube-api-access-lcxn9\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:43 crc kubenswrapper[4751]: I1203 14:27:43.504164 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/400c3408-3589-452a-ab78-5442f2e38ac5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "400c3408-3589-452a-ab78-5442f2e38ac5" (UID: "400c3408-3589-452a-ab78-5442f2e38ac5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:27:43 crc kubenswrapper[4751]: I1203 14:27:43.592551 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/400c3408-3589-452a-ab78-5442f2e38ac5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:44 crc kubenswrapper[4751]: I1203 14:27:44.010544 4751 generic.go:334] "Generic (PLEG): container finished" podID="400c3408-3589-452a-ab78-5442f2e38ac5" containerID="0cc3e0cc94ab8c6547f7d54bb90a3e24b09fe49598f6f28f3f6da56bbae43b4e" exitCode=0 Dec 03 14:27:44 crc kubenswrapper[4751]: I1203 14:27:44.010593 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6z6ch" event={"ID":"400c3408-3589-452a-ab78-5442f2e38ac5","Type":"ContainerDied","Data":"0cc3e0cc94ab8c6547f7d54bb90a3e24b09fe49598f6f28f3f6da56bbae43b4e"} Dec 03 14:27:44 crc kubenswrapper[4751]: I1203 14:27:44.010629 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6z6ch" event={"ID":"400c3408-3589-452a-ab78-5442f2e38ac5","Type":"ContainerDied","Data":"d64764b534e1b34f04c08d727b76d81e3307c2049dd46169588474dff748e970"} Dec 03 14:27:44 crc kubenswrapper[4751]: I1203 14:27:44.010653 4751 scope.go:117] "RemoveContainer" containerID="0cc3e0cc94ab8c6547f7d54bb90a3e24b09fe49598f6f28f3f6da56bbae43b4e" Dec 03 14:27:44 crc kubenswrapper[4751]: I1203 14:27:44.010666 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6z6ch" Dec 03 14:27:44 crc kubenswrapper[4751]: I1203 14:27:44.028348 4751 scope.go:117] "RemoveContainer" containerID="074748c57093c43a3854c5c8b5df8784f80bc3f626855937898a9102b45dbfb3" Dec 03 14:27:44 crc kubenswrapper[4751]: I1203 14:27:44.042427 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6z6ch"] Dec 03 14:27:44 crc kubenswrapper[4751]: I1203 14:27:44.047948 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6z6ch"] Dec 03 14:27:44 crc kubenswrapper[4751]: I1203 14:27:44.051775 4751 scope.go:117] "RemoveContainer" containerID="3d7a088b0e95698c0d29866cc7348ca1482c81c53d0d573dbb4cdae8fb9495ce" Dec 03 14:27:44 crc kubenswrapper[4751]: I1203 14:27:44.074638 4751 scope.go:117] "RemoveContainer" containerID="0cc3e0cc94ab8c6547f7d54bb90a3e24b09fe49598f6f28f3f6da56bbae43b4e" Dec 03 14:27:44 crc kubenswrapper[4751]: E1203 14:27:44.075112 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cc3e0cc94ab8c6547f7d54bb90a3e24b09fe49598f6f28f3f6da56bbae43b4e\": container with ID starting with 0cc3e0cc94ab8c6547f7d54bb90a3e24b09fe49598f6f28f3f6da56bbae43b4e not found: ID does not exist" containerID="0cc3e0cc94ab8c6547f7d54bb90a3e24b09fe49598f6f28f3f6da56bbae43b4e" Dec 03 14:27:44 crc kubenswrapper[4751]: I1203 14:27:44.075148 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cc3e0cc94ab8c6547f7d54bb90a3e24b09fe49598f6f28f3f6da56bbae43b4e"} err="failed to get container status \"0cc3e0cc94ab8c6547f7d54bb90a3e24b09fe49598f6f28f3f6da56bbae43b4e\": rpc error: code = NotFound desc = could not find container \"0cc3e0cc94ab8c6547f7d54bb90a3e24b09fe49598f6f28f3f6da56bbae43b4e\": container with ID starting with 0cc3e0cc94ab8c6547f7d54bb90a3e24b09fe49598f6f28f3f6da56bbae43b4e not found: ID does 
not exist" Dec 03 14:27:44 crc kubenswrapper[4751]: I1203 14:27:44.075173 4751 scope.go:117] "RemoveContainer" containerID="074748c57093c43a3854c5c8b5df8784f80bc3f626855937898a9102b45dbfb3" Dec 03 14:27:44 crc kubenswrapper[4751]: E1203 14:27:44.076438 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"074748c57093c43a3854c5c8b5df8784f80bc3f626855937898a9102b45dbfb3\": container with ID starting with 074748c57093c43a3854c5c8b5df8784f80bc3f626855937898a9102b45dbfb3 not found: ID does not exist" containerID="074748c57093c43a3854c5c8b5df8784f80bc3f626855937898a9102b45dbfb3" Dec 03 14:27:44 crc kubenswrapper[4751]: I1203 14:27:44.076505 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"074748c57093c43a3854c5c8b5df8784f80bc3f626855937898a9102b45dbfb3"} err="failed to get container status \"074748c57093c43a3854c5c8b5df8784f80bc3f626855937898a9102b45dbfb3\": rpc error: code = NotFound desc = could not find container \"074748c57093c43a3854c5c8b5df8784f80bc3f626855937898a9102b45dbfb3\": container with ID starting with 074748c57093c43a3854c5c8b5df8784f80bc3f626855937898a9102b45dbfb3 not found: ID does not exist" Dec 03 14:27:44 crc kubenswrapper[4751]: I1203 14:27:44.076526 4751 scope.go:117] "RemoveContainer" containerID="3d7a088b0e95698c0d29866cc7348ca1482c81c53d0d573dbb4cdae8fb9495ce" Dec 03 14:27:44 crc kubenswrapper[4751]: E1203 14:27:44.076878 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d7a088b0e95698c0d29866cc7348ca1482c81c53d0d573dbb4cdae8fb9495ce\": container with ID starting with 3d7a088b0e95698c0d29866cc7348ca1482c81c53d0d573dbb4cdae8fb9495ce not found: ID does not exist" containerID="3d7a088b0e95698c0d29866cc7348ca1482c81c53d0d573dbb4cdae8fb9495ce" Dec 03 14:27:44 crc kubenswrapper[4751]: I1203 14:27:44.076908 4751 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d7a088b0e95698c0d29866cc7348ca1482c81c53d0d573dbb4cdae8fb9495ce"} err="failed to get container status \"3d7a088b0e95698c0d29866cc7348ca1482c81c53d0d573dbb4cdae8fb9495ce\": rpc error: code = NotFound desc = could not find container \"3d7a088b0e95698c0d29866cc7348ca1482c81c53d0d573dbb4cdae8fb9495ce\": container with ID starting with 3d7a088b0e95698c0d29866cc7348ca1482c81c53d0d573dbb4cdae8fb9495ce not found: ID does not exist" Dec 03 14:27:45 crc kubenswrapper[4751]: I1203 14:27:45.323571 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="400c3408-3589-452a-ab78-5442f2e38ac5" path="/var/lib/kubelet/pods/400c3408-3589-452a-ab78-5442f2e38ac5/volumes" Dec 03 14:27:49 crc kubenswrapper[4751]: I1203 14:27:49.044534 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-766794d8b8-zzghr" event={"ID":"b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238","Type":"ContainerStarted","Data":"43b3ba6908ca7a360d18fbe1d4c27a2fc873c6b660187c40dd69949241837455"} Dec 03 14:27:49 crc kubenswrapper[4751]: I1203 14:27:49.045119 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-766794d8b8-zzghr" Dec 03 14:27:49 crc kubenswrapper[4751]: I1203 14:27:49.047710 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-766794d8b8-zzghr" Dec 03 14:27:49 crc kubenswrapper[4751]: I1203 14:27:49.076879 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-766794d8b8-zzghr" podStartSLOduration=1.364450474 podStartE2EDuration="16.07684917s" podCreationTimestamp="2025-12-03 14:27:33 +0000 UTC" firstStartedPulling="2025-12-03 14:27:33.666917406 +0000 UTC m=+860.655272623" lastFinishedPulling="2025-12-03 
14:27:48.379316102 +0000 UTC m=+875.367671319" observedRunningTime="2025-12-03 14:27:49.068582766 +0000 UTC m=+876.056938023" watchObservedRunningTime="2025-12-03 14:27:49.07684917 +0000 UTC m=+876.065204437" Dec 03 14:28:19 crc kubenswrapper[4751]: I1203 14:28:19.226472 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9"] Dec 03 14:28:19 crc kubenswrapper[4751]: E1203 14:28:19.227064 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a93d622-3d27-473f-92bb-ffe7b9ec4239" containerName="pull" Dec 03 14:28:19 crc kubenswrapper[4751]: I1203 14:28:19.227075 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a93d622-3d27-473f-92bb-ffe7b9ec4239" containerName="pull" Dec 03 14:28:19 crc kubenswrapper[4751]: E1203 14:28:19.227087 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a93d622-3d27-473f-92bb-ffe7b9ec4239" containerName="extract" Dec 03 14:28:19 crc kubenswrapper[4751]: I1203 14:28:19.227093 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a93d622-3d27-473f-92bb-ffe7b9ec4239" containerName="extract" Dec 03 14:28:19 crc kubenswrapper[4751]: E1203 14:28:19.227105 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="400c3408-3589-452a-ab78-5442f2e38ac5" containerName="extract-content" Dec 03 14:28:19 crc kubenswrapper[4751]: I1203 14:28:19.227111 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="400c3408-3589-452a-ab78-5442f2e38ac5" containerName="extract-content" Dec 03 14:28:19 crc kubenswrapper[4751]: E1203 14:28:19.227121 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="400c3408-3589-452a-ab78-5442f2e38ac5" containerName="registry-server" Dec 03 14:28:19 crc kubenswrapper[4751]: I1203 14:28:19.227126 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="400c3408-3589-452a-ab78-5442f2e38ac5" containerName="registry-server" Dec 03 14:28:19 crc 
kubenswrapper[4751]: E1203 14:28:19.227133 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a93d622-3d27-473f-92bb-ffe7b9ec4239" containerName="util" Dec 03 14:28:19 crc kubenswrapper[4751]: I1203 14:28:19.227138 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a93d622-3d27-473f-92bb-ffe7b9ec4239" containerName="util" Dec 03 14:28:19 crc kubenswrapper[4751]: E1203 14:28:19.227149 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="400c3408-3589-452a-ab78-5442f2e38ac5" containerName="extract-utilities" Dec 03 14:28:19 crc kubenswrapper[4751]: I1203 14:28:19.227155 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="400c3408-3589-452a-ab78-5442f2e38ac5" containerName="extract-utilities" Dec 03 14:28:19 crc kubenswrapper[4751]: I1203 14:28:19.227247 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a93d622-3d27-473f-92bb-ffe7b9ec4239" containerName="extract" Dec 03 14:28:19 crc kubenswrapper[4751]: I1203 14:28:19.227255 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="400c3408-3589-452a-ab78-5442f2e38ac5" containerName="registry-server" Dec 03 14:28:19 crc kubenswrapper[4751]: I1203 14:28:19.228195 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9" Dec 03 14:28:19 crc kubenswrapper[4751]: I1203 14:28:19.231155 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 14:28:19 crc kubenswrapper[4751]: I1203 14:28:19.239510 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9"] Dec 03 14:28:19 crc kubenswrapper[4751]: I1203 14:28:19.384221 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b30158de-69d4-4a93-9952-9b61fd08e5cd-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9\" (UID: \"b30158de-69d4-4a93-9952-9b61fd08e5cd\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9" Dec 03 14:28:19 crc kubenswrapper[4751]: I1203 14:28:19.384359 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b30158de-69d4-4a93-9952-9b61fd08e5cd-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9\" (UID: \"b30158de-69d4-4a93-9952-9b61fd08e5cd\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9" Dec 03 14:28:19 crc kubenswrapper[4751]: I1203 14:28:19.384380 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwdbb\" (UniqueName: \"kubernetes.io/projected/b30158de-69d4-4a93-9952-9b61fd08e5cd-kube-api-access-wwdbb\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9\" (UID: \"b30158de-69d4-4a93-9952-9b61fd08e5cd\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9" Dec 03 14:28:19 crc kubenswrapper[4751]: 
I1203 14:28:19.485300 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b30158de-69d4-4a93-9952-9b61fd08e5cd-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9\" (UID: \"b30158de-69d4-4a93-9952-9b61fd08e5cd\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9" Dec 03 14:28:19 crc kubenswrapper[4751]: I1203 14:28:19.485584 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b30158de-69d4-4a93-9952-9b61fd08e5cd-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9\" (UID: \"b30158de-69d4-4a93-9952-9b61fd08e5cd\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9" Dec 03 14:28:19 crc kubenswrapper[4751]: I1203 14:28:19.485614 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwdbb\" (UniqueName: \"kubernetes.io/projected/b30158de-69d4-4a93-9952-9b61fd08e5cd-kube-api-access-wwdbb\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9\" (UID: \"b30158de-69d4-4a93-9952-9b61fd08e5cd\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9" Dec 03 14:28:19 crc kubenswrapper[4751]: I1203 14:28:19.487192 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b30158de-69d4-4a93-9952-9b61fd08e5cd-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9\" (UID: \"b30158de-69d4-4a93-9952-9b61fd08e5cd\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9" Dec 03 14:28:19 crc kubenswrapper[4751]: I1203 14:28:19.487947 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/b30158de-69d4-4a93-9952-9b61fd08e5cd-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9\" (UID: \"b30158de-69d4-4a93-9952-9b61fd08e5cd\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9" Dec 03 14:28:19 crc kubenswrapper[4751]: I1203 14:28:19.504767 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwdbb\" (UniqueName: \"kubernetes.io/projected/b30158de-69d4-4a93-9952-9b61fd08e5cd-kube-api-access-wwdbb\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9\" (UID: \"b30158de-69d4-4a93-9952-9b61fd08e5cd\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9" Dec 03 14:28:19 crc kubenswrapper[4751]: I1203 14:28:19.545444 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9" Dec 03 14:28:19 crc kubenswrapper[4751]: I1203 14:28:19.732178 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9"] Dec 03 14:28:20 crc kubenswrapper[4751]: I1203 14:28:20.453536 4751 generic.go:334] "Generic (PLEG): container finished" podID="b30158de-69d4-4a93-9952-9b61fd08e5cd" containerID="a48cc24b0529a24c8021909611c668220773e209bdea733a0387f81da02280d2" exitCode=0 Dec 03 14:28:20 crc kubenswrapper[4751]: I1203 14:28:20.453587 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9" event={"ID":"b30158de-69d4-4a93-9952-9b61fd08e5cd","Type":"ContainerDied","Data":"a48cc24b0529a24c8021909611c668220773e209bdea733a0387f81da02280d2"} Dec 03 14:28:20 crc kubenswrapper[4751]: I1203 14:28:20.453631 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9" event={"ID":"b30158de-69d4-4a93-9952-9b61fd08e5cd","Type":"ContainerStarted","Data":"5aba0a62a9f08ad6f79ce80f3c605e5f8ac5fc645c6b293ef825aa9a3c6dff9a"} Dec 03 14:28:23 crc kubenswrapper[4751]: I1203 14:28:23.481397 4751 generic.go:334] "Generic (PLEG): container finished" podID="b30158de-69d4-4a93-9952-9b61fd08e5cd" containerID="5965fcb877361af6eb9504deb3c849e950a2f47cf671ac478f810bdaa9ddabfd" exitCode=0 Dec 03 14:28:23 crc kubenswrapper[4751]: I1203 14:28:23.481474 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9" event={"ID":"b30158de-69d4-4a93-9952-9b61fd08e5cd","Type":"ContainerDied","Data":"5965fcb877361af6eb9504deb3c849e950a2f47cf671ac478f810bdaa9ddabfd"} Dec 03 14:28:24 crc kubenswrapper[4751]: I1203 14:28:24.498677 4751 generic.go:334] "Generic (PLEG): container finished" podID="b30158de-69d4-4a93-9952-9b61fd08e5cd" containerID="3a058ee96818f849314d27265a47c6c03285de78e0a25f9b799c331b83d881c0" exitCode=0 Dec 03 14:28:24 crc kubenswrapper[4751]: I1203 14:28:24.498728 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9" event={"ID":"b30158de-69d4-4a93-9952-9b61fd08e5cd","Type":"ContainerDied","Data":"3a058ee96818f849314d27265a47c6c03285de78e0a25f9b799c331b83d881c0"} Dec 03 14:28:25 crc kubenswrapper[4751]: I1203 14:28:25.765089 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9" Dec 03 14:28:25 crc kubenswrapper[4751]: I1203 14:28:25.864115 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b30158de-69d4-4a93-9952-9b61fd08e5cd-util\") pod \"b30158de-69d4-4a93-9952-9b61fd08e5cd\" (UID: \"b30158de-69d4-4a93-9952-9b61fd08e5cd\") " Dec 03 14:28:25 crc kubenswrapper[4751]: I1203 14:28:25.864520 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwdbb\" (UniqueName: \"kubernetes.io/projected/b30158de-69d4-4a93-9952-9b61fd08e5cd-kube-api-access-wwdbb\") pod \"b30158de-69d4-4a93-9952-9b61fd08e5cd\" (UID: \"b30158de-69d4-4a93-9952-9b61fd08e5cd\") " Dec 03 14:28:25 crc kubenswrapper[4751]: I1203 14:28:25.865344 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b30158de-69d4-4a93-9952-9b61fd08e5cd-bundle\") pod \"b30158de-69d4-4a93-9952-9b61fd08e5cd\" (UID: \"b30158de-69d4-4a93-9952-9b61fd08e5cd\") " Dec 03 14:28:25 crc kubenswrapper[4751]: I1203 14:28:25.865826 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b30158de-69d4-4a93-9952-9b61fd08e5cd-bundle" (OuterVolumeSpecName: "bundle") pod "b30158de-69d4-4a93-9952-9b61fd08e5cd" (UID: "b30158de-69d4-4a93-9952-9b61fd08e5cd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:28:25 crc kubenswrapper[4751]: I1203 14:28:25.870115 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b30158de-69d4-4a93-9952-9b61fd08e5cd-kube-api-access-wwdbb" (OuterVolumeSpecName: "kube-api-access-wwdbb") pod "b30158de-69d4-4a93-9952-9b61fd08e5cd" (UID: "b30158de-69d4-4a93-9952-9b61fd08e5cd"). InnerVolumeSpecName "kube-api-access-wwdbb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:28:25 crc kubenswrapper[4751]: I1203 14:28:25.875572 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b30158de-69d4-4a93-9952-9b61fd08e5cd-util" (OuterVolumeSpecName: "util") pod "b30158de-69d4-4a93-9952-9b61fd08e5cd" (UID: "b30158de-69d4-4a93-9952-9b61fd08e5cd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:28:25 crc kubenswrapper[4751]: I1203 14:28:25.966875 4751 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b30158de-69d4-4a93-9952-9b61fd08e5cd-util\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:25 crc kubenswrapper[4751]: I1203 14:28:25.966914 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwdbb\" (UniqueName: \"kubernetes.io/projected/b30158de-69d4-4a93-9952-9b61fd08e5cd-kube-api-access-wwdbb\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:25 crc kubenswrapper[4751]: I1203 14:28:25.966930 4751 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b30158de-69d4-4a93-9952-9b61fd08e5cd-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:26 crc kubenswrapper[4751]: I1203 14:28:26.511301 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9" event={"ID":"b30158de-69d4-4a93-9952-9b61fd08e5cd","Type":"ContainerDied","Data":"5aba0a62a9f08ad6f79ce80f3c605e5f8ac5fc645c6b293ef825aa9a3c6dff9a"} Dec 03 14:28:26 crc kubenswrapper[4751]: I1203 14:28:26.511408 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5aba0a62a9f08ad6f79ce80f3c605e5f8ac5fc645c6b293ef825aa9a3c6dff9a" Dec 03 14:28:26 crc kubenswrapper[4751]: I1203 14:28:26.511417 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9" Dec 03 14:28:28 crc kubenswrapper[4751]: I1203 14:28:28.391508 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-4xzfk"] Dec 03 14:28:28 crc kubenswrapper[4751]: E1203 14:28:28.391784 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b30158de-69d4-4a93-9952-9b61fd08e5cd" containerName="extract" Dec 03 14:28:28 crc kubenswrapper[4751]: I1203 14:28:28.391802 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="b30158de-69d4-4a93-9952-9b61fd08e5cd" containerName="extract" Dec 03 14:28:28 crc kubenswrapper[4751]: E1203 14:28:28.391823 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b30158de-69d4-4a93-9952-9b61fd08e5cd" containerName="util" Dec 03 14:28:28 crc kubenswrapper[4751]: I1203 14:28:28.391832 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="b30158de-69d4-4a93-9952-9b61fd08e5cd" containerName="util" Dec 03 14:28:28 crc kubenswrapper[4751]: E1203 14:28:28.391849 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b30158de-69d4-4a93-9952-9b61fd08e5cd" containerName="pull" Dec 03 14:28:28 crc kubenswrapper[4751]: I1203 14:28:28.391857 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="b30158de-69d4-4a93-9952-9b61fd08e5cd" containerName="pull" Dec 03 14:28:28 crc kubenswrapper[4751]: I1203 14:28:28.391985 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="b30158de-69d4-4a93-9952-9b61fd08e5cd" containerName="extract" Dec 03 14:28:28 crc kubenswrapper[4751]: I1203 14:28:28.392502 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-4xzfk" Dec 03 14:28:28 crc kubenswrapper[4751]: I1203 14:28:28.395182 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-2qpr6" Dec 03 14:28:28 crc kubenswrapper[4751]: I1203 14:28:28.395302 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 03 14:28:28 crc kubenswrapper[4751]: I1203 14:28:28.395778 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 03 14:28:28 crc kubenswrapper[4751]: I1203 14:28:28.408646 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-4xzfk"] Dec 03 14:28:28 crc kubenswrapper[4751]: I1203 14:28:28.496996 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9qhp\" (UniqueName: \"kubernetes.io/projected/65d126c6-b570-490c-bae2-71a3a7fa0832-kube-api-access-z9qhp\") pod \"nmstate-operator-5b5b58f5c8-4xzfk\" (UID: \"65d126c6-b570-490c-bae2-71a3a7fa0832\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-4xzfk" Dec 03 14:28:28 crc kubenswrapper[4751]: I1203 14:28:28.599099 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9qhp\" (UniqueName: \"kubernetes.io/projected/65d126c6-b570-490c-bae2-71a3a7fa0832-kube-api-access-z9qhp\") pod \"nmstate-operator-5b5b58f5c8-4xzfk\" (UID: \"65d126c6-b570-490c-bae2-71a3a7fa0832\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-4xzfk" Dec 03 14:28:28 crc kubenswrapper[4751]: I1203 14:28:28.622642 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9qhp\" (UniqueName: \"kubernetes.io/projected/65d126c6-b570-490c-bae2-71a3a7fa0832-kube-api-access-z9qhp\") pod \"nmstate-operator-5b5b58f5c8-4xzfk\" (UID: 
\"65d126c6-b570-490c-bae2-71a3a7fa0832\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-4xzfk" Dec 03 14:28:28 crc kubenswrapper[4751]: I1203 14:28:28.709970 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-4xzfk" Dec 03 14:28:29 crc kubenswrapper[4751]: I1203 14:28:29.154406 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-4xzfk"] Dec 03 14:28:29 crc kubenswrapper[4751]: I1203 14:28:29.526357 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-4xzfk" event={"ID":"65d126c6-b570-490c-bae2-71a3a7fa0832","Type":"ContainerStarted","Data":"af1f9b9cc54ff8575b9410717fc91efe166c94c0aa9a50ac1bf92c76d15b93b1"} Dec 03 14:28:31 crc kubenswrapper[4751]: I1203 14:28:31.555675 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-4xzfk" event={"ID":"65d126c6-b570-490c-bae2-71a3a7fa0832","Type":"ContainerStarted","Data":"17590322582e7763cf90bd98683d6cd1424949786e49f54a715c177813fb42c6"} Dec 03 14:28:31 crc kubenswrapper[4751]: I1203 14:28:31.575681 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-4xzfk" podStartSLOduration=1.476137147 podStartE2EDuration="3.575664791s" podCreationTimestamp="2025-12-03 14:28:28 +0000 UTC" firstStartedPulling="2025-12-03 14:28:29.165389139 +0000 UTC m=+916.153744356" lastFinishedPulling="2025-12-03 14:28:31.264916783 +0000 UTC m=+918.253272000" observedRunningTime="2025-12-03 14:28:31.571847707 +0000 UTC m=+918.560202934" watchObservedRunningTime="2025-12-03 14:28:31.575664791 +0000 UTC m=+918.564020008" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.362618 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-zmfgm"] Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 
14:28:32.363504 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-zmfgm" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.365973 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-9xjlj" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.385397 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7hdqr"] Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.386295 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7hdqr" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.396302 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.403995 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-znb2g"] Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.405567 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-znb2g" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.434636 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-zmfgm"] Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.451587 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnk8l\" (UniqueName: \"kubernetes.io/projected/2dfea938-9795-4c3d-a42c-f4c7cbe57dae-kube-api-access-fnk8l\") pod \"nmstate-metrics-7f946cbc9-zmfgm\" (UID: \"2dfea938-9795-4c3d-a42c-f4c7cbe57dae\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-zmfgm" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.471536 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7hdqr"] Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.553101 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f41e25fd-38de-45a7-95dd-d0172caa1353-nmstate-lock\") pod \"nmstate-handler-znb2g\" (UID: \"f41e25fd-38de-45a7-95dd-d0172caa1353\") " pod="openshift-nmstate/nmstate-handler-znb2g" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.553154 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d0b754bd-d0ac-42e5-87a1-6f4132d926a9-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-7hdqr\" (UID: \"d0b754bd-d0ac-42e5-87a1-6f4132d926a9\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7hdqr" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.553185 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f41e25fd-38de-45a7-95dd-d0172caa1353-ovs-socket\") pod \"nmstate-handler-znb2g\" (UID: 
\"f41e25fd-38de-45a7-95dd-d0172caa1353\") " pod="openshift-nmstate/nmstate-handler-znb2g" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.553444 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnk8l\" (UniqueName: \"kubernetes.io/projected/2dfea938-9795-4c3d-a42c-f4c7cbe57dae-kube-api-access-fnk8l\") pod \"nmstate-metrics-7f946cbc9-zmfgm\" (UID: \"2dfea938-9795-4c3d-a42c-f4c7cbe57dae\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-zmfgm" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.553538 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drwls\" (UniqueName: \"kubernetes.io/projected/d0b754bd-d0ac-42e5-87a1-6f4132d926a9-kube-api-access-drwls\") pod \"nmstate-webhook-5f6d4c5ccb-7hdqr\" (UID: \"d0b754bd-d0ac-42e5-87a1-6f4132d926a9\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7hdqr" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.553566 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f41e25fd-38de-45a7-95dd-d0172caa1353-dbus-socket\") pod \"nmstate-handler-znb2g\" (UID: \"f41e25fd-38de-45a7-95dd-d0172caa1353\") " pod="openshift-nmstate/nmstate-handler-znb2g" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.553615 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcf9n\" (UniqueName: \"kubernetes.io/projected/f41e25fd-38de-45a7-95dd-d0172caa1353-kube-api-access-vcf9n\") pod \"nmstate-handler-znb2g\" (UID: \"f41e25fd-38de-45a7-95dd-d0172caa1353\") " pod="openshift-nmstate/nmstate-handler-znb2g" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.580022 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bjml6"] Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.580932 
4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bjml6" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.585591 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.587391 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.587525 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-vszpw" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.589498 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnk8l\" (UniqueName: \"kubernetes.io/projected/2dfea938-9795-4c3d-a42c-f4c7cbe57dae-kube-api-access-fnk8l\") pod \"nmstate-metrics-7f946cbc9-zmfgm\" (UID: \"2dfea938-9795-4c3d-a42c-f4c7cbe57dae\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-zmfgm" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.654707 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f41e25fd-38de-45a7-95dd-d0172caa1353-ovs-socket\") pod \"nmstate-handler-znb2g\" (UID: \"f41e25fd-38de-45a7-95dd-d0172caa1353\") " pod="openshift-nmstate/nmstate-handler-znb2g" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.654826 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drwls\" (UniqueName: \"kubernetes.io/projected/d0b754bd-d0ac-42e5-87a1-6f4132d926a9-kube-api-access-drwls\") pod \"nmstate-webhook-5f6d4c5ccb-7hdqr\" (UID: \"d0b754bd-d0ac-42e5-87a1-6f4132d926a9\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7hdqr" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.654836 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f41e25fd-38de-45a7-95dd-d0172caa1353-ovs-socket\") pod \"nmstate-handler-znb2g\" (UID: \"f41e25fd-38de-45a7-95dd-d0172caa1353\") " pod="openshift-nmstate/nmstate-handler-znb2g" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.654850 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f41e25fd-38de-45a7-95dd-d0172caa1353-dbus-socket\") pod \"nmstate-handler-znb2g\" (UID: \"f41e25fd-38de-45a7-95dd-d0172caa1353\") " pod="openshift-nmstate/nmstate-handler-znb2g" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.655066 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcf9n\" (UniqueName: \"kubernetes.io/projected/f41e25fd-38de-45a7-95dd-d0172caa1353-kube-api-access-vcf9n\") pod \"nmstate-handler-znb2g\" (UID: \"f41e25fd-38de-45a7-95dd-d0172caa1353\") " pod="openshift-nmstate/nmstate-handler-znb2g" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.655095 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f41e25fd-38de-45a7-95dd-d0172caa1353-nmstate-lock\") pod \"nmstate-handler-znb2g\" (UID: \"f41e25fd-38de-45a7-95dd-d0172caa1353\") " pod="openshift-nmstate/nmstate-handler-znb2g" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.655117 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f41e25fd-38de-45a7-95dd-d0172caa1353-dbus-socket\") pod \"nmstate-handler-znb2g\" (UID: \"f41e25fd-38de-45a7-95dd-d0172caa1353\") " pod="openshift-nmstate/nmstate-handler-znb2g" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.655155 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f41e25fd-38de-45a7-95dd-d0172caa1353-nmstate-lock\") 
pod \"nmstate-handler-znb2g\" (UID: \"f41e25fd-38de-45a7-95dd-d0172caa1353\") " pod="openshift-nmstate/nmstate-handler-znb2g" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.655201 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d0b754bd-d0ac-42e5-87a1-6f4132d926a9-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-7hdqr\" (UID: \"d0b754bd-d0ac-42e5-87a1-6f4132d926a9\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7hdqr" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.659856 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d0b754bd-d0ac-42e5-87a1-6f4132d926a9-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-7hdqr\" (UID: \"d0b754bd-d0ac-42e5-87a1-6f4132d926a9\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7hdqr" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.669831 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bjml6"] Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.676951 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drwls\" (UniqueName: \"kubernetes.io/projected/d0b754bd-d0ac-42e5-87a1-6f4132d926a9-kube-api-access-drwls\") pod \"nmstate-webhook-5f6d4c5ccb-7hdqr\" (UID: \"d0b754bd-d0ac-42e5-87a1-6f4132d926a9\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7hdqr" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.678846 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-zmfgm" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.705713 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7hdqr" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.721044 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcf9n\" (UniqueName: \"kubernetes.io/projected/f41e25fd-38de-45a7-95dd-d0172caa1353-kube-api-access-vcf9n\") pod \"nmstate-handler-znb2g\" (UID: \"f41e25fd-38de-45a7-95dd-d0172caa1353\") " pod="openshift-nmstate/nmstate-handler-znb2g" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.755862 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/40a56b56-06b1-4640-b817-4b22a08cdfea-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-bjml6\" (UID: \"40a56b56-06b1-4640-b817-4b22a08cdfea\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bjml6" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.755910 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpbtz\" (UniqueName: \"kubernetes.io/projected/40a56b56-06b1-4640-b817-4b22a08cdfea-kube-api-access-qpbtz\") pod \"nmstate-console-plugin-7fbb5f6569-bjml6\" (UID: \"40a56b56-06b1-4640-b817-4b22a08cdfea\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bjml6" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.756092 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/40a56b56-06b1-4640-b817-4b22a08cdfea-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-bjml6\" (UID: \"40a56b56-06b1-4640-b817-4b22a08cdfea\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bjml6" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.816057 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6fd456cc64-295n5"] Dec 03 14:28:32 crc 
kubenswrapper[4751]: I1203 14:28:32.817119 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.828469 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fd456cc64-295n5"] Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.858801 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/40a56b56-06b1-4640-b817-4b22a08cdfea-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-bjml6\" (UID: \"40a56b56-06b1-4640-b817-4b22a08cdfea\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bjml6" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.858839 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpbtz\" (UniqueName: \"kubernetes.io/projected/40a56b56-06b1-4640-b817-4b22a08cdfea-kube-api-access-qpbtz\") pod \"nmstate-console-plugin-7fbb5f6569-bjml6\" (UID: \"40a56b56-06b1-4640-b817-4b22a08cdfea\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bjml6" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.858873 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/40a56b56-06b1-4640-b817-4b22a08cdfea-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-bjml6\" (UID: \"40a56b56-06b1-4640-b817-4b22a08cdfea\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bjml6" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.859678 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/40a56b56-06b1-4640-b817-4b22a08cdfea-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-bjml6\" (UID: \"40a56b56-06b1-4640-b817-4b22a08cdfea\") " 
pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bjml6" Dec 03 14:28:32 crc kubenswrapper[4751]: E1203 14:28:32.859747 4751 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Dec 03 14:28:32 crc kubenswrapper[4751]: E1203 14:28:32.859786 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40a56b56-06b1-4640-b817-4b22a08cdfea-plugin-serving-cert podName:40a56b56-06b1-4640-b817-4b22a08cdfea nodeName:}" failed. No retries permitted until 2025-12-03 14:28:33.359772391 +0000 UTC m=+920.348127608 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/40a56b56-06b1-4640-b817-4b22a08cdfea-plugin-serving-cert") pod "nmstate-console-plugin-7fbb5f6569-bjml6" (UID: "40a56b56-06b1-4640-b817-4b22a08cdfea") : secret "plugin-serving-cert" not found Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.881758 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpbtz\" (UniqueName: \"kubernetes.io/projected/40a56b56-06b1-4640-b817-4b22a08cdfea-kube-api-access-qpbtz\") pod \"nmstate-console-plugin-7fbb5f6569-bjml6\" (UID: \"40a56b56-06b1-4640-b817-4b22a08cdfea\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bjml6" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.960470 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1fd4536f-efa6-40a7-b329-cebf60907eb2-oauth-serving-cert\") pod \"console-6fd456cc64-295n5\" (UID: \"1fd4536f-efa6-40a7-b329-cebf60907eb2\") " pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.960529 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/1fd4536f-efa6-40a7-b329-cebf60907eb2-console-config\") pod \"console-6fd456cc64-295n5\" (UID: \"1fd4536f-efa6-40a7-b329-cebf60907eb2\") " pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.960567 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1fd4536f-efa6-40a7-b329-cebf60907eb2-service-ca\") pod \"console-6fd456cc64-295n5\" (UID: \"1fd4536f-efa6-40a7-b329-cebf60907eb2\") " pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.960603 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1fd4536f-efa6-40a7-b329-cebf60907eb2-trusted-ca-bundle\") pod \"console-6fd456cc64-295n5\" (UID: \"1fd4536f-efa6-40a7-b329-cebf60907eb2\") " pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.960624 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn9x2\" (UniqueName: \"kubernetes.io/projected/1fd4536f-efa6-40a7-b329-cebf60907eb2-kube-api-access-nn9x2\") pod \"console-6fd456cc64-295n5\" (UID: \"1fd4536f-efa6-40a7-b329-cebf60907eb2\") " pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.960646 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1fd4536f-efa6-40a7-b329-cebf60907eb2-console-serving-cert\") pod \"console-6fd456cc64-295n5\" (UID: \"1fd4536f-efa6-40a7-b329-cebf60907eb2\") " pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:28:32 crc kubenswrapper[4751]: I1203 14:28:32.960786 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1fd4536f-efa6-40a7-b329-cebf60907eb2-console-oauth-config\") pod \"console-6fd456cc64-295n5\" (UID: \"1fd4536f-efa6-40a7-b329-cebf60907eb2\") " pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:28:33 crc kubenswrapper[4751]: I1203 14:28:33.017853 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-znb2g" Dec 03 14:28:33 crc kubenswrapper[4751]: W1203 14:28:33.040031 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf41e25fd_38de_45a7_95dd_d0172caa1353.slice/crio-2b99b5341aa1c07de1b0ad049ad056654d1fdb830e4092dcff7bdddb74aadc0b WatchSource:0}: Error finding container 2b99b5341aa1c07de1b0ad049ad056654d1fdb830e4092dcff7bdddb74aadc0b: Status 404 returned error can't find the container with id 2b99b5341aa1c07de1b0ad049ad056654d1fdb830e4092dcff7bdddb74aadc0b Dec 03 14:28:33 crc kubenswrapper[4751]: I1203 14:28:33.062918 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1fd4536f-efa6-40a7-b329-cebf60907eb2-service-ca\") pod \"console-6fd456cc64-295n5\" (UID: \"1fd4536f-efa6-40a7-b329-cebf60907eb2\") " pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:28:33 crc kubenswrapper[4751]: I1203 14:28:33.062979 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1fd4536f-efa6-40a7-b329-cebf60907eb2-trusted-ca-bundle\") pod \"console-6fd456cc64-295n5\" (UID: \"1fd4536f-efa6-40a7-b329-cebf60907eb2\") " pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:28:33 crc kubenswrapper[4751]: I1203 14:28:33.063001 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn9x2\" 
(UniqueName: \"kubernetes.io/projected/1fd4536f-efa6-40a7-b329-cebf60907eb2-kube-api-access-nn9x2\") pod \"console-6fd456cc64-295n5\" (UID: \"1fd4536f-efa6-40a7-b329-cebf60907eb2\") " pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:28:33 crc kubenswrapper[4751]: I1203 14:28:33.063025 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1fd4536f-efa6-40a7-b329-cebf60907eb2-console-serving-cert\") pod \"console-6fd456cc64-295n5\" (UID: \"1fd4536f-efa6-40a7-b329-cebf60907eb2\") " pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:28:33 crc kubenswrapper[4751]: I1203 14:28:33.063044 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1fd4536f-efa6-40a7-b329-cebf60907eb2-console-oauth-config\") pod \"console-6fd456cc64-295n5\" (UID: \"1fd4536f-efa6-40a7-b329-cebf60907eb2\") " pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:28:33 crc kubenswrapper[4751]: I1203 14:28:33.063078 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1fd4536f-efa6-40a7-b329-cebf60907eb2-oauth-serving-cert\") pod \"console-6fd456cc64-295n5\" (UID: \"1fd4536f-efa6-40a7-b329-cebf60907eb2\") " pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:28:33 crc kubenswrapper[4751]: I1203 14:28:33.063105 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1fd4536f-efa6-40a7-b329-cebf60907eb2-console-config\") pod \"console-6fd456cc64-295n5\" (UID: \"1fd4536f-efa6-40a7-b329-cebf60907eb2\") " pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:28:33 crc kubenswrapper[4751]: I1203 14:28:33.064248 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/1fd4536f-efa6-40a7-b329-cebf60907eb2-service-ca\") pod \"console-6fd456cc64-295n5\" (UID: \"1fd4536f-efa6-40a7-b329-cebf60907eb2\") " pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:28:33 crc kubenswrapper[4751]: I1203 14:28:33.064887 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1fd4536f-efa6-40a7-b329-cebf60907eb2-trusted-ca-bundle\") pod \"console-6fd456cc64-295n5\" (UID: \"1fd4536f-efa6-40a7-b329-cebf60907eb2\") " pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:28:33 crc kubenswrapper[4751]: I1203 14:28:33.064926 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1fd4536f-efa6-40a7-b329-cebf60907eb2-console-config\") pod \"console-6fd456cc64-295n5\" (UID: \"1fd4536f-efa6-40a7-b329-cebf60907eb2\") " pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:28:33 crc kubenswrapper[4751]: I1203 14:28:33.065212 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1fd4536f-efa6-40a7-b329-cebf60907eb2-oauth-serving-cert\") pod \"console-6fd456cc64-295n5\" (UID: \"1fd4536f-efa6-40a7-b329-cebf60907eb2\") " pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:28:33 crc kubenswrapper[4751]: I1203 14:28:33.066804 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-zmfgm"] Dec 03 14:28:33 crc kubenswrapper[4751]: I1203 14:28:33.068316 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1fd4536f-efa6-40a7-b329-cebf60907eb2-console-oauth-config\") pod \"console-6fd456cc64-295n5\" (UID: \"1fd4536f-efa6-40a7-b329-cebf60907eb2\") " pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:28:33 crc kubenswrapper[4751]: I1203 14:28:33.069682 4751 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1fd4536f-efa6-40a7-b329-cebf60907eb2-console-serving-cert\") pod \"console-6fd456cc64-295n5\" (UID: \"1fd4536f-efa6-40a7-b329-cebf60907eb2\") " pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:28:33 crc kubenswrapper[4751]: W1203 14:28:33.073807 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dfea938_9795_4c3d_a42c_f4c7cbe57dae.slice/crio-d44ed0b15acce860d41b4ee744bcefb90c3654a25dff99f3703c038700400437 WatchSource:0}: Error finding container d44ed0b15acce860d41b4ee744bcefb90c3654a25dff99f3703c038700400437: Status 404 returned error can't find the container with id d44ed0b15acce860d41b4ee744bcefb90c3654a25dff99f3703c038700400437 Dec 03 14:28:33 crc kubenswrapper[4751]: I1203 14:28:33.079706 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn9x2\" (UniqueName: \"kubernetes.io/projected/1fd4536f-efa6-40a7-b329-cebf60907eb2-kube-api-access-nn9x2\") pod \"console-6fd456cc64-295n5\" (UID: \"1fd4536f-efa6-40a7-b329-cebf60907eb2\") " pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:28:33 crc kubenswrapper[4751]: I1203 14:28:33.141033 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:28:33 crc kubenswrapper[4751]: I1203 14:28:33.219074 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7hdqr"] Dec 03 14:28:33 crc kubenswrapper[4751]: I1203 14:28:33.346202 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fd456cc64-295n5"] Dec 03 14:28:33 crc kubenswrapper[4751]: I1203 14:28:33.366753 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/40a56b56-06b1-4640-b817-4b22a08cdfea-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-bjml6\" (UID: \"40a56b56-06b1-4640-b817-4b22a08cdfea\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bjml6" Dec 03 14:28:33 crc kubenswrapper[4751]: I1203 14:28:33.371148 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/40a56b56-06b1-4640-b817-4b22a08cdfea-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-bjml6\" (UID: \"40a56b56-06b1-4640-b817-4b22a08cdfea\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bjml6" Dec 03 14:28:33 crc kubenswrapper[4751]: I1203 14:28:33.521860 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bjml6" Dec 03 14:28:33 crc kubenswrapper[4751]: I1203 14:28:33.568056 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fd456cc64-295n5" event={"ID":"1fd4536f-efa6-40a7-b329-cebf60907eb2","Type":"ContainerStarted","Data":"f4e60a608c6adb657404cd721cbce353e208b2fdb9c45005ffea4a911b9ac844"} Dec 03 14:28:33 crc kubenswrapper[4751]: I1203 14:28:33.568107 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fd456cc64-295n5" event={"ID":"1fd4536f-efa6-40a7-b329-cebf60907eb2","Type":"ContainerStarted","Data":"cfe06d9fc4916fb922ce9b9b826c87ac14943d2e3c4905b7f5a8e5950155b8e1"} Dec 03 14:28:33 crc kubenswrapper[4751]: I1203 14:28:33.569983 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-zmfgm" event={"ID":"2dfea938-9795-4c3d-a42c-f4c7cbe57dae","Type":"ContainerStarted","Data":"d44ed0b15acce860d41b4ee744bcefb90c3654a25dff99f3703c038700400437"} Dec 03 14:28:33 crc kubenswrapper[4751]: I1203 14:28:33.571691 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-znb2g" event={"ID":"f41e25fd-38de-45a7-95dd-d0172caa1353","Type":"ContainerStarted","Data":"2b99b5341aa1c07de1b0ad049ad056654d1fdb830e4092dcff7bdddb74aadc0b"} Dec 03 14:28:33 crc kubenswrapper[4751]: I1203 14:28:33.573308 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7hdqr" event={"ID":"d0b754bd-d0ac-42e5-87a1-6f4132d926a9","Type":"ContainerStarted","Data":"fb77bfb6242ff617bbd5712238f4ac4b0d047f51171ed1c086182d5037d2d60c"} Dec 03 14:28:33 crc kubenswrapper[4751]: I1203 14:28:33.589387 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6fd456cc64-295n5" podStartSLOduration=1.589361211 podStartE2EDuration="1.589361211s" podCreationTimestamp="2025-12-03 14:28:32 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:28:33.585380963 +0000 UTC m=+920.573736180" watchObservedRunningTime="2025-12-03 14:28:33.589361211 +0000 UTC m=+920.577716428" Dec 03 14:28:33 crc kubenswrapper[4751]: I1203 14:28:33.945989 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bjml6"] Dec 03 14:28:33 crc kubenswrapper[4751]: W1203 14:28:33.953520 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40a56b56_06b1_4640_b817_4b22a08cdfea.slice/crio-53f9328fb99219f1c705271ca5e16436f7ae95c765b69c60ef75ec6567e88aad WatchSource:0}: Error finding container 53f9328fb99219f1c705271ca5e16436f7ae95c765b69c60ef75ec6567e88aad: Status 404 returned error can't find the container with id 53f9328fb99219f1c705271ca5e16436f7ae95c765b69c60ef75ec6567e88aad Dec 03 14:28:34 crc kubenswrapper[4751]: I1203 14:28:34.582041 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bjml6" event={"ID":"40a56b56-06b1-4640-b817-4b22a08cdfea","Type":"ContainerStarted","Data":"53f9328fb99219f1c705271ca5e16436f7ae95c765b69c60ef75ec6567e88aad"} Dec 03 14:28:36 crc kubenswrapper[4751]: I1203 14:28:36.595709 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-zmfgm" event={"ID":"2dfea938-9795-4c3d-a42c-f4c7cbe57dae","Type":"ContainerStarted","Data":"584b65cc071d59fadf6253390996d50421efde5577318f069ef2a36b7b819c56"} Dec 03 14:28:36 crc kubenswrapper[4751]: I1203 14:28:36.598968 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-znb2g" event={"ID":"f41e25fd-38de-45a7-95dd-d0172caa1353","Type":"ContainerStarted","Data":"49153a681a4a9c8295f589524fcc82fd92986489d8215362e1a0f6b825b7d21f"} Dec 03 14:28:36 crc 
kubenswrapper[4751]: I1203 14:28:36.599776 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-znb2g" Dec 03 14:28:36 crc kubenswrapper[4751]: I1203 14:28:36.604036 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7hdqr" event={"ID":"d0b754bd-d0ac-42e5-87a1-6f4132d926a9","Type":"ContainerStarted","Data":"de5b9555c8751d32a73f1947c172b583ef712115761d9bb0eb1ef002572f0cc1"} Dec 03 14:28:36 crc kubenswrapper[4751]: I1203 14:28:36.604366 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7hdqr" Dec 03 14:28:36 crc kubenswrapper[4751]: I1203 14:28:36.617125 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-znb2g" podStartSLOduration=2.073188209 podStartE2EDuration="4.617104742s" podCreationTimestamp="2025-12-03 14:28:32 +0000 UTC" firstStartedPulling="2025-12-03 14:28:33.042704578 +0000 UTC m=+920.031059795" lastFinishedPulling="2025-12-03 14:28:35.586621111 +0000 UTC m=+922.574976328" observedRunningTime="2025-12-03 14:28:36.615876628 +0000 UTC m=+923.604231865" watchObservedRunningTime="2025-12-03 14:28:36.617104742 +0000 UTC m=+923.605459959" Dec 03 14:28:36 crc kubenswrapper[4751]: I1203 14:28:36.631417 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7hdqr" podStartSLOduration=2.256237489 podStartE2EDuration="4.631398082s" podCreationTimestamp="2025-12-03 14:28:32 +0000 UTC" firstStartedPulling="2025-12-03 14:28:33.228472973 +0000 UTC m=+920.216828190" lastFinishedPulling="2025-12-03 14:28:35.603633566 +0000 UTC m=+922.591988783" observedRunningTime="2025-12-03 14:28:36.631192827 +0000 UTC m=+923.619548054" watchObservedRunningTime="2025-12-03 14:28:36.631398082 +0000 UTC m=+923.619753319" Dec 03 14:28:37 crc kubenswrapper[4751]: I1203 14:28:37.629960 4751 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bjml6" event={"ID":"40a56b56-06b1-4640-b817-4b22a08cdfea","Type":"ContainerStarted","Data":"c11d8aaff0c2a1854565ccdfbde6c3a6e9f88c38fe2cf57185d78bd824a7c91a"} Dec 03 14:28:37 crc kubenswrapper[4751]: I1203 14:28:37.675482 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bjml6" podStartSLOduration=2.83941249 podStartE2EDuration="5.675462704s" podCreationTimestamp="2025-12-03 14:28:32 +0000 UTC" firstStartedPulling="2025-12-03 14:28:33.956295205 +0000 UTC m=+920.944650422" lastFinishedPulling="2025-12-03 14:28:36.792345409 +0000 UTC m=+923.780700636" observedRunningTime="2025-12-03 14:28:37.670993492 +0000 UTC m=+924.659348709" watchObservedRunningTime="2025-12-03 14:28:37.675462704 +0000 UTC m=+924.663817921" Dec 03 14:28:38 crc kubenswrapper[4751]: I1203 14:28:38.635194 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-zmfgm" event={"ID":"2dfea938-9795-4c3d-a42c-f4c7cbe57dae","Type":"ContainerStarted","Data":"7025f818eeacbce4efb00aff8a125419728a30a78a48cba325cfe9cd0adb932f"} Dec 03 14:28:38 crc kubenswrapper[4751]: I1203 14:28:38.654536 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-zmfgm" podStartSLOduration=1.8893148549999998 podStartE2EDuration="6.654516299s" podCreationTimestamp="2025-12-03 14:28:32 +0000 UTC" firstStartedPulling="2025-12-03 14:28:33.0764515 +0000 UTC m=+920.064806717" lastFinishedPulling="2025-12-03 14:28:37.841652944 +0000 UTC m=+924.830008161" observedRunningTime="2025-12-03 14:28:38.651975189 +0000 UTC m=+925.640330406" watchObservedRunningTime="2025-12-03 14:28:38.654516299 +0000 UTC m=+925.642871536" Dec 03 14:28:43 crc kubenswrapper[4751]: I1203 14:28:43.041785 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-nmstate/nmstate-handler-znb2g" Dec 03 14:28:43 crc kubenswrapper[4751]: I1203 14:28:43.142092 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:28:43 crc kubenswrapper[4751]: I1203 14:28:43.142165 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:28:43 crc kubenswrapper[4751]: I1203 14:28:43.149889 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:28:43 crc kubenswrapper[4751]: I1203 14:28:43.672969 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:28:43 crc kubenswrapper[4751]: I1203 14:28:43.749686 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-djkdm"] Dec 03 14:28:52 crc kubenswrapper[4751]: I1203 14:28:52.710949 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7hdqr" Dec 03 14:29:07 crc kubenswrapper[4751]: I1203 14:29:07.164855 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9"] Dec 03 14:29:07 crc kubenswrapper[4751]: I1203 14:29:07.166718 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9" Dec 03 14:29:07 crc kubenswrapper[4751]: I1203 14:29:07.168274 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 14:29:07 crc kubenswrapper[4751]: I1203 14:29:07.181077 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9"] Dec 03 14:29:07 crc kubenswrapper[4751]: I1203 14:29:07.316911 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/abecd931-d6b1-4ee6-83dc-eb78d75c076c-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9\" (UID: \"abecd931-d6b1-4ee6-83dc-eb78d75c076c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9" Dec 03 14:29:07 crc kubenswrapper[4751]: I1203 14:29:07.316979 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p5sv\" (UniqueName: \"kubernetes.io/projected/abecd931-d6b1-4ee6-83dc-eb78d75c076c-kube-api-access-6p5sv\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9\" (UID: \"abecd931-d6b1-4ee6-83dc-eb78d75c076c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9" Dec 03 14:29:07 crc kubenswrapper[4751]: I1203 14:29:07.316997 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/abecd931-d6b1-4ee6-83dc-eb78d75c076c-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9\" (UID: \"abecd931-d6b1-4ee6-83dc-eb78d75c076c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9" Dec 03 14:29:07 crc kubenswrapper[4751]: 
I1203 14:29:07.417978 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p5sv\" (UniqueName: \"kubernetes.io/projected/abecd931-d6b1-4ee6-83dc-eb78d75c076c-kube-api-access-6p5sv\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9\" (UID: \"abecd931-d6b1-4ee6-83dc-eb78d75c076c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9" Dec 03 14:29:07 crc kubenswrapper[4751]: I1203 14:29:07.418273 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/abecd931-d6b1-4ee6-83dc-eb78d75c076c-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9\" (UID: \"abecd931-d6b1-4ee6-83dc-eb78d75c076c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9" Dec 03 14:29:07 crc kubenswrapper[4751]: I1203 14:29:07.418443 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/abecd931-d6b1-4ee6-83dc-eb78d75c076c-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9\" (UID: \"abecd931-d6b1-4ee6-83dc-eb78d75c076c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9" Dec 03 14:29:07 crc kubenswrapper[4751]: I1203 14:29:07.418869 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/abecd931-d6b1-4ee6-83dc-eb78d75c076c-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9\" (UID: \"abecd931-d6b1-4ee6-83dc-eb78d75c076c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9" Dec 03 14:29:07 crc kubenswrapper[4751]: I1203 14:29:07.418885 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/abecd931-d6b1-4ee6-83dc-eb78d75c076c-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9\" (UID: \"abecd931-d6b1-4ee6-83dc-eb78d75c076c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9" Dec 03 14:29:07 crc kubenswrapper[4751]: I1203 14:29:07.434666 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p5sv\" (UniqueName: \"kubernetes.io/projected/abecd931-d6b1-4ee6-83dc-eb78d75c076c-kube-api-access-6p5sv\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9\" (UID: \"abecd931-d6b1-4ee6-83dc-eb78d75c076c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9" Dec 03 14:29:07 crc kubenswrapper[4751]: I1203 14:29:07.499048 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9" Dec 03 14:29:07 crc kubenswrapper[4751]: I1203 14:29:07.707482 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9"] Dec 03 14:29:07 crc kubenswrapper[4751]: I1203 14:29:07.827228 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9" event={"ID":"abecd931-d6b1-4ee6-83dc-eb78d75c076c","Type":"ContainerStarted","Data":"515d8af939fb0315d66a84a0fa48be536902a2107c5aa27eb948e90236874a7f"} Dec 03 14:29:08 crc kubenswrapper[4751]: I1203 14:29:08.801627 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-djkdm" podUID="a112208b-e069-48e0-8bc4-d6c4e79052fc" containerName="console" containerID="cri-o://0524f4c77b801ec459178348a0bff6e2c35fd49936c4815280647398d22f665c" gracePeriod=15 Dec 03 14:29:08 crc kubenswrapper[4751]: I1203 14:29:08.835166 4751 
generic.go:334] "Generic (PLEG): container finished" podID="abecd931-d6b1-4ee6-83dc-eb78d75c076c" containerID="830b9ef0300d4ba1e86efb4841c620122fd65dbaee3adb05ebdf0bab3c574220" exitCode=0 Dec 03 14:29:08 crc kubenswrapper[4751]: I1203 14:29:08.835220 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9" event={"ID":"abecd931-d6b1-4ee6-83dc-eb78d75c076c","Type":"ContainerDied","Data":"830b9ef0300d4ba1e86efb4841c620122fd65dbaee3adb05ebdf0bab3c574220"} Dec 03 14:29:09 crc kubenswrapper[4751]: I1203 14:29:09.172241 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-djkdm_a112208b-e069-48e0-8bc4-d6c4e79052fc/console/0.log" Dec 03 14:29:09 crc kubenswrapper[4751]: I1203 14:29:09.172313 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-djkdm" Dec 03 14:29:09 crc kubenswrapper[4751]: I1203 14:29:09.246834 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a112208b-e069-48e0-8bc4-d6c4e79052fc-console-config\") pod \"a112208b-e069-48e0-8bc4-d6c4e79052fc\" (UID: \"a112208b-e069-48e0-8bc4-d6c4e79052fc\") " Dec 03 14:29:09 crc kubenswrapper[4751]: I1203 14:29:09.246887 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh45m\" (UniqueName: \"kubernetes.io/projected/a112208b-e069-48e0-8bc4-d6c4e79052fc-kube-api-access-nh45m\") pod \"a112208b-e069-48e0-8bc4-d6c4e79052fc\" (UID: \"a112208b-e069-48e0-8bc4-d6c4e79052fc\") " Dec 03 14:29:09 crc kubenswrapper[4751]: I1203 14:29:09.246909 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a112208b-e069-48e0-8bc4-d6c4e79052fc-oauth-serving-cert\") pod \"a112208b-e069-48e0-8bc4-d6c4e79052fc\" 
(UID: \"a112208b-e069-48e0-8bc4-d6c4e79052fc\") " Dec 03 14:29:09 crc kubenswrapper[4751]: I1203 14:29:09.246997 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a112208b-e069-48e0-8bc4-d6c4e79052fc-console-oauth-config\") pod \"a112208b-e069-48e0-8bc4-d6c4e79052fc\" (UID: \"a112208b-e069-48e0-8bc4-d6c4e79052fc\") " Dec 03 14:29:09 crc kubenswrapper[4751]: I1203 14:29:09.247032 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a112208b-e069-48e0-8bc4-d6c4e79052fc-service-ca\") pod \"a112208b-e069-48e0-8bc4-d6c4e79052fc\" (UID: \"a112208b-e069-48e0-8bc4-d6c4e79052fc\") " Dec 03 14:29:09 crc kubenswrapper[4751]: I1203 14:29:09.247061 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a112208b-e069-48e0-8bc4-d6c4e79052fc-trusted-ca-bundle\") pod \"a112208b-e069-48e0-8bc4-d6c4e79052fc\" (UID: \"a112208b-e069-48e0-8bc4-d6c4e79052fc\") " Dec 03 14:29:09 crc kubenswrapper[4751]: I1203 14:29:09.247102 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a112208b-e069-48e0-8bc4-d6c4e79052fc-console-serving-cert\") pod \"a112208b-e069-48e0-8bc4-d6c4e79052fc\" (UID: \"a112208b-e069-48e0-8bc4-d6c4e79052fc\") " Dec 03 14:29:09 crc kubenswrapper[4751]: I1203 14:29:09.248191 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a112208b-e069-48e0-8bc4-d6c4e79052fc-console-config" (OuterVolumeSpecName: "console-config") pod "a112208b-e069-48e0-8bc4-d6c4e79052fc" (UID: "a112208b-e069-48e0-8bc4-d6c4e79052fc"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:29:09 crc kubenswrapper[4751]: I1203 14:29:09.248306 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a112208b-e069-48e0-8bc4-d6c4e79052fc-service-ca" (OuterVolumeSpecName: "service-ca") pod "a112208b-e069-48e0-8bc4-d6c4e79052fc" (UID: "a112208b-e069-48e0-8bc4-d6c4e79052fc"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:29:09 crc kubenswrapper[4751]: I1203 14:29:09.248395 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a112208b-e069-48e0-8bc4-d6c4e79052fc-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a112208b-e069-48e0-8bc4-d6c4e79052fc" (UID: "a112208b-e069-48e0-8bc4-d6c4e79052fc"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:29:09 crc kubenswrapper[4751]: I1203 14:29:09.248521 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a112208b-e069-48e0-8bc4-d6c4e79052fc-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a112208b-e069-48e0-8bc4-d6c4e79052fc" (UID: "a112208b-e069-48e0-8bc4-d6c4e79052fc"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:29:09 crc kubenswrapper[4751]: I1203 14:29:09.252628 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a112208b-e069-48e0-8bc4-d6c4e79052fc-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a112208b-e069-48e0-8bc4-d6c4e79052fc" (UID: "a112208b-e069-48e0-8bc4-d6c4e79052fc"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:29:09 crc kubenswrapper[4751]: I1203 14:29:09.252943 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a112208b-e069-48e0-8bc4-d6c4e79052fc-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a112208b-e069-48e0-8bc4-d6c4e79052fc" (UID: "a112208b-e069-48e0-8bc4-d6c4e79052fc"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:29:09 crc kubenswrapper[4751]: I1203 14:29:09.253186 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a112208b-e069-48e0-8bc4-d6c4e79052fc-kube-api-access-nh45m" (OuterVolumeSpecName: "kube-api-access-nh45m") pod "a112208b-e069-48e0-8bc4-d6c4e79052fc" (UID: "a112208b-e069-48e0-8bc4-d6c4e79052fc"). InnerVolumeSpecName "kube-api-access-nh45m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:29:09 crc kubenswrapper[4751]: I1203 14:29:09.348137 4751 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a112208b-e069-48e0-8bc4-d6c4e79052fc-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:09 crc kubenswrapper[4751]: I1203 14:29:09.348344 4751 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a112208b-e069-48e0-8bc4-d6c4e79052fc-console-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:09 crc kubenswrapper[4751]: I1203 14:29:09.348435 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh45m\" (UniqueName: \"kubernetes.io/projected/a112208b-e069-48e0-8bc4-d6c4e79052fc-kube-api-access-nh45m\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:09 crc kubenswrapper[4751]: I1203 14:29:09.348494 4751 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/a112208b-e069-48e0-8bc4-d6c4e79052fc-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:09 crc kubenswrapper[4751]: I1203 14:29:09.348546 4751 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a112208b-e069-48e0-8bc4-d6c4e79052fc-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:09 crc kubenswrapper[4751]: I1203 14:29:09.348605 4751 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a112208b-e069-48e0-8bc4-d6c4e79052fc-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:09 crc kubenswrapper[4751]: I1203 14:29:09.348658 4751 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a112208b-e069-48e0-8bc4-d6c4e79052fc-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:09 crc kubenswrapper[4751]: I1203 14:29:09.845490 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-djkdm_a112208b-e069-48e0-8bc4-d6c4e79052fc/console/0.log" Dec 03 14:29:09 crc kubenswrapper[4751]: I1203 14:29:09.845545 4751 generic.go:334] "Generic (PLEG): container finished" podID="a112208b-e069-48e0-8bc4-d6c4e79052fc" containerID="0524f4c77b801ec459178348a0bff6e2c35fd49936c4815280647398d22f665c" exitCode=2 Dec 03 14:29:09 crc kubenswrapper[4751]: I1203 14:29:09.845584 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-djkdm" event={"ID":"a112208b-e069-48e0-8bc4-d6c4e79052fc","Type":"ContainerDied","Data":"0524f4c77b801ec459178348a0bff6e2c35fd49936c4815280647398d22f665c"} Dec 03 14:29:09 crc kubenswrapper[4751]: I1203 14:29:09.845600 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-djkdm" Dec 03 14:29:09 crc kubenswrapper[4751]: I1203 14:29:09.845616 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-djkdm" event={"ID":"a112208b-e069-48e0-8bc4-d6c4e79052fc","Type":"ContainerDied","Data":"cb27c8c07a1adac88b4af2dbc548d9f692e228595028b294d10e56f342fe57f7"} Dec 03 14:29:09 crc kubenswrapper[4751]: I1203 14:29:09.845638 4751 scope.go:117] "RemoveContainer" containerID="0524f4c77b801ec459178348a0bff6e2c35fd49936c4815280647398d22f665c" Dec 03 14:29:09 crc kubenswrapper[4751]: I1203 14:29:09.872165 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-djkdm"] Dec 03 14:29:09 crc kubenswrapper[4751]: I1203 14:29:09.878679 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-djkdm"] Dec 03 14:29:09 crc kubenswrapper[4751]: I1203 14:29:09.879258 4751 scope.go:117] "RemoveContainer" containerID="0524f4c77b801ec459178348a0bff6e2c35fd49936c4815280647398d22f665c" Dec 03 14:29:09 crc kubenswrapper[4751]: E1203 14:29:09.879781 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0524f4c77b801ec459178348a0bff6e2c35fd49936c4815280647398d22f665c\": container with ID starting with 0524f4c77b801ec459178348a0bff6e2c35fd49936c4815280647398d22f665c not found: ID does not exist" containerID="0524f4c77b801ec459178348a0bff6e2c35fd49936c4815280647398d22f665c" Dec 03 14:29:09 crc kubenswrapper[4751]: I1203 14:29:09.879856 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0524f4c77b801ec459178348a0bff6e2c35fd49936c4815280647398d22f665c"} err="failed to get container status \"0524f4c77b801ec459178348a0bff6e2c35fd49936c4815280647398d22f665c\": rpc error: code = NotFound desc = could not find container 
\"0524f4c77b801ec459178348a0bff6e2c35fd49936c4815280647398d22f665c\": container with ID starting with 0524f4c77b801ec459178348a0bff6e2c35fd49936c4815280647398d22f665c not found: ID does not exist" Dec 03 14:29:10 crc kubenswrapper[4751]: I1203 14:29:10.856012 4751 generic.go:334] "Generic (PLEG): container finished" podID="abecd931-d6b1-4ee6-83dc-eb78d75c076c" containerID="2b645669b3365e6a3a387bb523445824660b2183570d01514a75b13061c8c92c" exitCode=0 Dec 03 14:29:10 crc kubenswrapper[4751]: I1203 14:29:10.856051 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9" event={"ID":"abecd931-d6b1-4ee6-83dc-eb78d75c076c","Type":"ContainerDied","Data":"2b645669b3365e6a3a387bb523445824660b2183570d01514a75b13061c8c92c"} Dec 03 14:29:11 crc kubenswrapper[4751]: I1203 14:29:11.327966 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a112208b-e069-48e0-8bc4-d6c4e79052fc" path="/var/lib/kubelet/pods/a112208b-e069-48e0-8bc4-d6c4e79052fc/volumes" Dec 03 14:29:11 crc kubenswrapper[4751]: I1203 14:29:11.863580 4751 generic.go:334] "Generic (PLEG): container finished" podID="abecd931-d6b1-4ee6-83dc-eb78d75c076c" containerID="7f5f882ecbf50e7f85e0c3ddd8fdc521681881ef538ceec8b76170d5d0e6731f" exitCode=0 Dec 03 14:29:11 crc kubenswrapper[4751]: I1203 14:29:11.863621 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9" event={"ID":"abecd931-d6b1-4ee6-83dc-eb78d75c076c","Type":"ContainerDied","Data":"7f5f882ecbf50e7f85e0c3ddd8fdc521681881ef538ceec8b76170d5d0e6731f"} Dec 03 14:29:13 crc kubenswrapper[4751]: I1203 14:29:13.173761 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9" Dec 03 14:29:13 crc kubenswrapper[4751]: I1203 14:29:13.300213 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/abecd931-d6b1-4ee6-83dc-eb78d75c076c-util\") pod \"abecd931-d6b1-4ee6-83dc-eb78d75c076c\" (UID: \"abecd931-d6b1-4ee6-83dc-eb78d75c076c\") " Dec 03 14:29:13 crc kubenswrapper[4751]: I1203 14:29:13.300263 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p5sv\" (UniqueName: \"kubernetes.io/projected/abecd931-d6b1-4ee6-83dc-eb78d75c076c-kube-api-access-6p5sv\") pod \"abecd931-d6b1-4ee6-83dc-eb78d75c076c\" (UID: \"abecd931-d6b1-4ee6-83dc-eb78d75c076c\") " Dec 03 14:29:13 crc kubenswrapper[4751]: I1203 14:29:13.300303 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/abecd931-d6b1-4ee6-83dc-eb78d75c076c-bundle\") pod \"abecd931-d6b1-4ee6-83dc-eb78d75c076c\" (UID: \"abecd931-d6b1-4ee6-83dc-eb78d75c076c\") " Dec 03 14:29:13 crc kubenswrapper[4751]: I1203 14:29:13.301266 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abecd931-d6b1-4ee6-83dc-eb78d75c076c-bundle" (OuterVolumeSpecName: "bundle") pod "abecd931-d6b1-4ee6-83dc-eb78d75c076c" (UID: "abecd931-d6b1-4ee6-83dc-eb78d75c076c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:29:13 crc kubenswrapper[4751]: I1203 14:29:13.307684 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abecd931-d6b1-4ee6-83dc-eb78d75c076c-kube-api-access-6p5sv" (OuterVolumeSpecName: "kube-api-access-6p5sv") pod "abecd931-d6b1-4ee6-83dc-eb78d75c076c" (UID: "abecd931-d6b1-4ee6-83dc-eb78d75c076c"). InnerVolumeSpecName "kube-api-access-6p5sv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:29:13 crc kubenswrapper[4751]: I1203 14:29:13.401534 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6p5sv\" (UniqueName: \"kubernetes.io/projected/abecd931-d6b1-4ee6-83dc-eb78d75c076c-kube-api-access-6p5sv\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:13 crc kubenswrapper[4751]: I1203 14:29:13.401568 4751 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/abecd931-d6b1-4ee6-83dc-eb78d75c076c-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:13 crc kubenswrapper[4751]: I1203 14:29:13.566486 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abecd931-d6b1-4ee6-83dc-eb78d75c076c-util" (OuterVolumeSpecName: "util") pod "abecd931-d6b1-4ee6-83dc-eb78d75c076c" (UID: "abecd931-d6b1-4ee6-83dc-eb78d75c076c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:29:13 crc kubenswrapper[4751]: I1203 14:29:13.604294 4751 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/abecd931-d6b1-4ee6-83dc-eb78d75c076c-util\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:13 crc kubenswrapper[4751]: I1203 14:29:13.879046 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9" event={"ID":"abecd931-d6b1-4ee6-83dc-eb78d75c076c","Type":"ContainerDied","Data":"515d8af939fb0315d66a84a0fa48be536902a2107c5aa27eb948e90236874a7f"} Dec 03 14:29:13 crc kubenswrapper[4751]: I1203 14:29:13.879425 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="515d8af939fb0315d66a84a0fa48be536902a2107c5aa27eb948e90236874a7f" Dec 03 14:29:13 crc kubenswrapper[4751]: I1203 14:29:13.879092 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9" Dec 03 14:29:15 crc kubenswrapper[4751]: E1203 14:29:15.361644 4751 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/NetworkManager-dispatcher.service\": RecentStats: unable to find data in memory cache]" Dec 03 14:29:21 crc kubenswrapper[4751]: I1203 14:29:21.884663 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-75cdb5998d-hbntt"] Dec 03 14:29:21 crc kubenswrapper[4751]: E1203 14:29:21.885472 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a112208b-e069-48e0-8bc4-d6c4e79052fc" containerName="console" Dec 03 14:29:21 crc kubenswrapper[4751]: I1203 14:29:21.885489 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a112208b-e069-48e0-8bc4-d6c4e79052fc" containerName="console" Dec 03 14:29:21 crc kubenswrapper[4751]: E1203 14:29:21.885505 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abecd931-d6b1-4ee6-83dc-eb78d75c076c" containerName="util" Dec 03 14:29:21 crc kubenswrapper[4751]: I1203 14:29:21.885512 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="abecd931-d6b1-4ee6-83dc-eb78d75c076c" containerName="util" Dec 03 14:29:21 crc kubenswrapper[4751]: E1203 14:29:21.885524 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abecd931-d6b1-4ee6-83dc-eb78d75c076c" containerName="pull" Dec 03 14:29:21 crc kubenswrapper[4751]: I1203 14:29:21.885532 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="abecd931-d6b1-4ee6-83dc-eb78d75c076c" containerName="pull" Dec 03 14:29:21 crc kubenswrapper[4751]: E1203 14:29:21.885555 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abecd931-d6b1-4ee6-83dc-eb78d75c076c" containerName="extract" Dec 03 14:29:21 crc kubenswrapper[4751]: I1203 14:29:21.885564 4751 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="abecd931-d6b1-4ee6-83dc-eb78d75c076c" containerName="extract" Dec 03 14:29:21 crc kubenswrapper[4751]: I1203 14:29:21.885686 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="a112208b-e069-48e0-8bc4-d6c4e79052fc" containerName="console" Dec 03 14:29:21 crc kubenswrapper[4751]: I1203 14:29:21.885704 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="abecd931-d6b1-4ee6-83dc-eb78d75c076c" containerName="extract" Dec 03 14:29:21 crc kubenswrapper[4751]: I1203 14:29:21.886208 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-75cdb5998d-hbntt" Dec 03 14:29:21 crc kubenswrapper[4751]: I1203 14:29:21.891039 4751 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 03 14:29:21 crc kubenswrapper[4751]: I1203 14:29:21.891223 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 03 14:29:21 crc kubenswrapper[4751]: I1203 14:29:21.891273 4751 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 03 14:29:21 crc kubenswrapper[4751]: I1203 14:29:21.891403 4751 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-7f2fx" Dec 03 14:29:21 crc kubenswrapper[4751]: I1203 14:29:21.891470 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 03 14:29:21 crc kubenswrapper[4751]: I1203 14:29:21.909947 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-75cdb5998d-hbntt"] Dec 03 14:29:22 crc kubenswrapper[4751]: I1203 14:29:22.015015 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" 
(UniqueName: \"kubernetes.io/secret/14960a87-3612-433e-bd1e-b548b0118a2c-apiservice-cert\") pod \"metallb-operator-controller-manager-75cdb5998d-hbntt\" (UID: \"14960a87-3612-433e-bd1e-b548b0118a2c\") " pod="metallb-system/metallb-operator-controller-manager-75cdb5998d-hbntt" Dec 03 14:29:22 crc kubenswrapper[4751]: I1203 14:29:22.015101 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2zl6\" (UniqueName: \"kubernetes.io/projected/14960a87-3612-433e-bd1e-b548b0118a2c-kube-api-access-j2zl6\") pod \"metallb-operator-controller-manager-75cdb5998d-hbntt\" (UID: \"14960a87-3612-433e-bd1e-b548b0118a2c\") " pod="metallb-system/metallb-operator-controller-manager-75cdb5998d-hbntt" Dec 03 14:29:22 crc kubenswrapper[4751]: I1203 14:29:22.015139 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14960a87-3612-433e-bd1e-b548b0118a2c-webhook-cert\") pod \"metallb-operator-controller-manager-75cdb5998d-hbntt\" (UID: \"14960a87-3612-433e-bd1e-b548b0118a2c\") " pod="metallb-system/metallb-operator-controller-manager-75cdb5998d-hbntt" Dec 03 14:29:22 crc kubenswrapper[4751]: I1203 14:29:22.117028 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14960a87-3612-433e-bd1e-b548b0118a2c-apiservice-cert\") pod \"metallb-operator-controller-manager-75cdb5998d-hbntt\" (UID: \"14960a87-3612-433e-bd1e-b548b0118a2c\") " pod="metallb-system/metallb-operator-controller-manager-75cdb5998d-hbntt" Dec 03 14:29:22 crc kubenswrapper[4751]: I1203 14:29:22.117099 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2zl6\" (UniqueName: \"kubernetes.io/projected/14960a87-3612-433e-bd1e-b548b0118a2c-kube-api-access-j2zl6\") pod \"metallb-operator-controller-manager-75cdb5998d-hbntt\" (UID: 
\"14960a87-3612-433e-bd1e-b548b0118a2c\") " pod="metallb-system/metallb-operator-controller-manager-75cdb5998d-hbntt" Dec 03 14:29:22 crc kubenswrapper[4751]: I1203 14:29:22.117125 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14960a87-3612-433e-bd1e-b548b0118a2c-webhook-cert\") pod \"metallb-operator-controller-manager-75cdb5998d-hbntt\" (UID: \"14960a87-3612-433e-bd1e-b548b0118a2c\") " pod="metallb-system/metallb-operator-controller-manager-75cdb5998d-hbntt" Dec 03 14:29:22 crc kubenswrapper[4751]: I1203 14:29:22.122678 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14960a87-3612-433e-bd1e-b548b0118a2c-apiservice-cert\") pod \"metallb-operator-controller-manager-75cdb5998d-hbntt\" (UID: \"14960a87-3612-433e-bd1e-b548b0118a2c\") " pod="metallb-system/metallb-operator-controller-manager-75cdb5998d-hbntt" Dec 03 14:29:22 crc kubenswrapper[4751]: I1203 14:29:22.122854 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14960a87-3612-433e-bd1e-b548b0118a2c-webhook-cert\") pod \"metallb-operator-controller-manager-75cdb5998d-hbntt\" (UID: \"14960a87-3612-433e-bd1e-b548b0118a2c\") " pod="metallb-system/metallb-operator-controller-manager-75cdb5998d-hbntt" Dec 03 14:29:22 crc kubenswrapper[4751]: I1203 14:29:22.135005 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2zl6\" (UniqueName: \"kubernetes.io/projected/14960a87-3612-433e-bd1e-b548b0118a2c-kube-api-access-j2zl6\") pod \"metallb-operator-controller-manager-75cdb5998d-hbntt\" (UID: \"14960a87-3612-433e-bd1e-b548b0118a2c\") " pod="metallb-system/metallb-operator-controller-manager-75cdb5998d-hbntt" Dec 03 14:29:22 crc kubenswrapper[4751]: I1203 14:29:22.205684 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-75cdb5998d-hbntt" Dec 03 14:29:22 crc kubenswrapper[4751]: I1203 14:29:22.254833 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7cf457fc-2z2zk"] Dec 03 14:29:22 crc kubenswrapper[4751]: I1203 14:29:22.256529 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7cf457fc-2z2zk" Dec 03 14:29:22 crc kubenswrapper[4751]: W1203 14:29:22.263990 4751 reflector.go:561] object-"metallb-system"/"metallb-operator-webhook-server-service-cert": failed to list *v1.Secret: secrets "metallb-operator-webhook-server-service-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Dec 03 14:29:22 crc kubenswrapper[4751]: E1203 14:29:22.264476 4751 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"metallb-operator-webhook-server-service-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"metallb-operator-webhook-server-service-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 14:29:22 crc kubenswrapper[4751]: W1203 14:29:22.264615 4751 reflector.go:561] object-"metallb-system"/"metallb-webhook-cert": failed to list *v1.Secret: secrets "metallb-webhook-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Dec 03 14:29:22 crc kubenswrapper[4751]: E1203 14:29:22.264707 4751 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"metallb-webhook-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: 
secrets \"metallb-webhook-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 14:29:22 crc kubenswrapper[4751]: W1203 14:29:22.264857 4751 reflector.go:561] object-"metallb-system"/"controller-dockercfg-758nx": failed to list *v1.Secret: secrets "controller-dockercfg-758nx" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Dec 03 14:29:22 crc kubenswrapper[4751]: E1203 14:29:22.264946 4751 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"controller-dockercfg-758nx\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"controller-dockercfg-758nx\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 14:29:22 crc kubenswrapper[4751]: I1203 14:29:22.378642 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7cf457fc-2z2zk"] Dec 03 14:29:22 crc kubenswrapper[4751]: I1203 14:29:22.423210 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8jsh\" (UniqueName: \"kubernetes.io/projected/ab9be016-ca60-474a-85d3-7c3ca149e87d-kube-api-access-c8jsh\") pod \"metallb-operator-webhook-server-7cf457fc-2z2zk\" (UID: \"ab9be016-ca60-474a-85d3-7c3ca149e87d\") " pod="metallb-system/metallb-operator-webhook-server-7cf457fc-2z2zk" Dec 03 14:29:22 crc kubenswrapper[4751]: I1203 14:29:22.423286 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/ab9be016-ca60-474a-85d3-7c3ca149e87d-webhook-cert\") pod \"metallb-operator-webhook-server-7cf457fc-2z2zk\" (UID: \"ab9be016-ca60-474a-85d3-7c3ca149e87d\") " pod="metallb-system/metallb-operator-webhook-server-7cf457fc-2z2zk" Dec 03 14:29:22 crc kubenswrapper[4751]: I1203 14:29:22.423312 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ab9be016-ca60-474a-85d3-7c3ca149e87d-apiservice-cert\") pod \"metallb-operator-webhook-server-7cf457fc-2z2zk\" (UID: \"ab9be016-ca60-474a-85d3-7c3ca149e87d\") " pod="metallb-system/metallb-operator-webhook-server-7cf457fc-2z2zk" Dec 03 14:29:22 crc kubenswrapper[4751]: I1203 14:29:22.525270 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8jsh\" (UniqueName: \"kubernetes.io/projected/ab9be016-ca60-474a-85d3-7c3ca149e87d-kube-api-access-c8jsh\") pod \"metallb-operator-webhook-server-7cf457fc-2z2zk\" (UID: \"ab9be016-ca60-474a-85d3-7c3ca149e87d\") " pod="metallb-system/metallb-operator-webhook-server-7cf457fc-2z2zk" Dec 03 14:29:22 crc kubenswrapper[4751]: I1203 14:29:22.525395 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ab9be016-ca60-474a-85d3-7c3ca149e87d-apiservice-cert\") pod \"metallb-operator-webhook-server-7cf457fc-2z2zk\" (UID: \"ab9be016-ca60-474a-85d3-7c3ca149e87d\") " pod="metallb-system/metallb-operator-webhook-server-7cf457fc-2z2zk" Dec 03 14:29:22 crc kubenswrapper[4751]: I1203 14:29:22.525502 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ab9be016-ca60-474a-85d3-7c3ca149e87d-webhook-cert\") pod \"metallb-operator-webhook-server-7cf457fc-2z2zk\" (UID: \"ab9be016-ca60-474a-85d3-7c3ca149e87d\") " pod="metallb-system/metallb-operator-webhook-server-7cf457fc-2z2zk" Dec 03 
14:29:22 crc kubenswrapper[4751]: I1203 14:29:22.544614 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8jsh\" (UniqueName: \"kubernetes.io/projected/ab9be016-ca60-474a-85d3-7c3ca149e87d-kube-api-access-c8jsh\") pod \"metallb-operator-webhook-server-7cf457fc-2z2zk\" (UID: \"ab9be016-ca60-474a-85d3-7c3ca149e87d\") " pod="metallb-system/metallb-operator-webhook-server-7cf457fc-2z2zk" Dec 03 14:29:22 crc kubenswrapper[4751]: I1203 14:29:22.827196 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-75cdb5998d-hbntt"] Dec 03 14:29:22 crc kubenswrapper[4751]: W1203 14:29:22.831851 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14960a87_3612_433e_bd1e_b548b0118a2c.slice/crio-d0177120a954fff7ed47252c57e86c0e3ae592b61b999c29ca44f060c3792018 WatchSource:0}: Error finding container d0177120a954fff7ed47252c57e86c0e3ae592b61b999c29ca44f060c3792018: Status 404 returned error can't find the container with id d0177120a954fff7ed47252c57e86c0e3ae592b61b999c29ca44f060c3792018 Dec 03 14:29:22 crc kubenswrapper[4751]: I1203 14:29:22.928022 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-75cdb5998d-hbntt" event={"ID":"14960a87-3612-433e-bd1e-b548b0118a2c","Type":"ContainerStarted","Data":"d0177120a954fff7ed47252c57e86c0e3ae592b61b999c29ca44f060c3792018"} Dec 03 14:29:23 crc kubenswrapper[4751]: I1203 14:29:23.254571 4751 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-758nx" Dec 03 14:29:23 crc kubenswrapper[4751]: I1203 14:29:23.357273 4751 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 03 14:29:23 crc kubenswrapper[4751]: I1203 14:29:23.370823 4751 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ab9be016-ca60-474a-85d3-7c3ca149e87d-webhook-cert\") pod \"metallb-operator-webhook-server-7cf457fc-2z2zk\" (UID: \"ab9be016-ca60-474a-85d3-7c3ca149e87d\") " pod="metallb-system/metallb-operator-webhook-server-7cf457fc-2z2zk" Dec 03 14:29:23 crc kubenswrapper[4751]: I1203 14:29:23.371086 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ab9be016-ca60-474a-85d3-7c3ca149e87d-apiservice-cert\") pod \"metallb-operator-webhook-server-7cf457fc-2z2zk\" (UID: \"ab9be016-ca60-474a-85d3-7c3ca149e87d\") " pod="metallb-system/metallb-operator-webhook-server-7cf457fc-2z2zk" Dec 03 14:29:23 crc kubenswrapper[4751]: I1203 14:29:23.385497 4751 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 03 14:29:23 crc kubenswrapper[4751]: I1203 14:29:23.503317 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7cf457fc-2z2zk" Dec 03 14:29:23 crc kubenswrapper[4751]: I1203 14:29:23.740052 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7cf457fc-2z2zk"] Dec 03 14:29:23 crc kubenswrapper[4751]: W1203 14:29:23.745964 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab9be016_ca60_474a_85d3_7c3ca149e87d.slice/crio-ddb3aa0d2c180a713c47c34e4ce12157b50d40bdb46c9c9431b3024966c7c0cc WatchSource:0}: Error finding container ddb3aa0d2c180a713c47c34e4ce12157b50d40bdb46c9c9431b3024966c7c0cc: Status 404 returned error can't find the container with id ddb3aa0d2c180a713c47c34e4ce12157b50d40bdb46c9c9431b3024966c7c0cc Dec 03 14:29:23 crc kubenswrapper[4751]: I1203 14:29:23.941684 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/metallb-operator-webhook-server-7cf457fc-2z2zk" event={"ID":"ab9be016-ca60-474a-85d3-7c3ca149e87d","Type":"ContainerStarted","Data":"ddb3aa0d2c180a713c47c34e4ce12157b50d40bdb46c9c9431b3024966c7c0cc"} Dec 03 14:29:26 crc kubenswrapper[4751]: I1203 14:29:26.989318 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-75cdb5998d-hbntt" event={"ID":"14960a87-3612-433e-bd1e-b548b0118a2c","Type":"ContainerStarted","Data":"f251c927cf63330a65f1869f0502f8232b6018f84a656d04fe2d0ed9983cacff"} Dec 03 14:29:26 crc kubenswrapper[4751]: I1203 14:29:26.990002 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-75cdb5998d-hbntt" Dec 03 14:29:27 crc kubenswrapper[4751]: I1203 14:29:27.009523 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-75cdb5998d-hbntt" podStartSLOduration=2.431038639 podStartE2EDuration="6.009502845s" podCreationTimestamp="2025-12-03 14:29:21 +0000 UTC" firstStartedPulling="2025-12-03 14:29:22.834159184 +0000 UTC m=+969.822514401" lastFinishedPulling="2025-12-03 14:29:26.41262339 +0000 UTC m=+973.400978607" observedRunningTime="2025-12-03 14:29:27.00747882 +0000 UTC m=+973.995834047" watchObservedRunningTime="2025-12-03 14:29:27.009502845 +0000 UTC m=+973.997858062" Dec 03 14:29:29 crc kubenswrapper[4751]: I1203 14:29:29.001694 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7cf457fc-2z2zk" event={"ID":"ab9be016-ca60-474a-85d3-7c3ca149e87d","Type":"ContainerStarted","Data":"05f3a9b8eff915fcbbd7c1db44df77c5ed2e5de29917ee8fa1e818a8868a5964"} Dec 03 14:29:29 crc kubenswrapper[4751]: I1203 14:29:29.001992 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7cf457fc-2z2zk" Dec 03 14:29:29 crc kubenswrapper[4751]: 
I1203 14:29:29.020938 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7cf457fc-2z2zk" podStartSLOduration=2.395196868 podStartE2EDuration="7.020916202s" podCreationTimestamp="2025-12-03 14:29:22 +0000 UTC" firstStartedPulling="2025-12-03 14:29:23.748949975 +0000 UTC m=+970.737305192" lastFinishedPulling="2025-12-03 14:29:28.374669309 +0000 UTC m=+975.363024526" observedRunningTime="2025-12-03 14:29:29.018473275 +0000 UTC m=+976.006828502" watchObservedRunningTime="2025-12-03 14:29:29.020916202 +0000 UTC m=+976.009271429" Dec 03 14:29:43 crc kubenswrapper[4751]: I1203 14:29:43.508692 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7cf457fc-2z2zk" Dec 03 14:30:00 crc kubenswrapper[4751]: I1203 14:30:00.175463 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412870-7n4fm"] Dec 03 14:30:00 crc kubenswrapper[4751]: I1203 14:30:00.176903 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-7n4fm" Dec 03 14:30:00 crc kubenswrapper[4751]: I1203 14:30:00.180308 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 14:30:00 crc kubenswrapper[4751]: I1203 14:30:00.180971 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 14:30:00 crc kubenswrapper[4751]: I1203 14:30:00.188123 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412870-7n4fm"] Dec 03 14:30:00 crc kubenswrapper[4751]: I1203 14:30:00.311454 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58kdm\" (UniqueName: \"kubernetes.io/projected/77fff4ad-5068-4b2b-a2ad-b700d3dc06e1-kube-api-access-58kdm\") pod \"collect-profiles-29412870-7n4fm\" (UID: \"77fff4ad-5068-4b2b-a2ad-b700d3dc06e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-7n4fm" Dec 03 14:30:00 crc kubenswrapper[4751]: I1203 14:30:00.311624 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77fff4ad-5068-4b2b-a2ad-b700d3dc06e1-secret-volume\") pod \"collect-profiles-29412870-7n4fm\" (UID: \"77fff4ad-5068-4b2b-a2ad-b700d3dc06e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-7n4fm" Dec 03 14:30:00 crc kubenswrapper[4751]: I1203 14:30:00.311676 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77fff4ad-5068-4b2b-a2ad-b700d3dc06e1-config-volume\") pod \"collect-profiles-29412870-7n4fm\" (UID: \"77fff4ad-5068-4b2b-a2ad-b700d3dc06e1\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-7n4fm" Dec 03 14:30:00 crc kubenswrapper[4751]: I1203 14:30:00.413121 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58kdm\" (UniqueName: \"kubernetes.io/projected/77fff4ad-5068-4b2b-a2ad-b700d3dc06e1-kube-api-access-58kdm\") pod \"collect-profiles-29412870-7n4fm\" (UID: \"77fff4ad-5068-4b2b-a2ad-b700d3dc06e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-7n4fm" Dec 03 14:30:00 crc kubenswrapper[4751]: I1203 14:30:00.413228 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77fff4ad-5068-4b2b-a2ad-b700d3dc06e1-secret-volume\") pod \"collect-profiles-29412870-7n4fm\" (UID: \"77fff4ad-5068-4b2b-a2ad-b700d3dc06e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-7n4fm" Dec 03 14:30:00 crc kubenswrapper[4751]: I1203 14:30:00.413257 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77fff4ad-5068-4b2b-a2ad-b700d3dc06e1-config-volume\") pod \"collect-profiles-29412870-7n4fm\" (UID: \"77fff4ad-5068-4b2b-a2ad-b700d3dc06e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-7n4fm" Dec 03 14:30:00 crc kubenswrapper[4751]: I1203 14:30:00.414757 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77fff4ad-5068-4b2b-a2ad-b700d3dc06e1-config-volume\") pod \"collect-profiles-29412870-7n4fm\" (UID: \"77fff4ad-5068-4b2b-a2ad-b700d3dc06e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-7n4fm" Dec 03 14:30:00 crc kubenswrapper[4751]: I1203 14:30:00.419176 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/77fff4ad-5068-4b2b-a2ad-b700d3dc06e1-secret-volume\") pod \"collect-profiles-29412870-7n4fm\" (UID: \"77fff4ad-5068-4b2b-a2ad-b700d3dc06e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-7n4fm" Dec 03 14:30:00 crc kubenswrapper[4751]: I1203 14:30:00.429885 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58kdm\" (UniqueName: \"kubernetes.io/projected/77fff4ad-5068-4b2b-a2ad-b700d3dc06e1-kube-api-access-58kdm\") pod \"collect-profiles-29412870-7n4fm\" (UID: \"77fff4ad-5068-4b2b-a2ad-b700d3dc06e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-7n4fm" Dec 03 14:30:00 crc kubenswrapper[4751]: I1203 14:30:00.508033 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-7n4fm" Dec 03 14:30:00 crc kubenswrapper[4751]: I1203 14:30:00.924807 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412870-7n4fm"] Dec 03 14:30:01 crc kubenswrapper[4751]: I1203 14:30:01.193546 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-7n4fm" event={"ID":"77fff4ad-5068-4b2b-a2ad-b700d3dc06e1","Type":"ContainerStarted","Data":"5b2d03adb8143132bb0e6465f53949ad884338332ac035d6d13370cbe72635d9"} Dec 03 14:30:01 crc kubenswrapper[4751]: I1203 14:30:01.193940 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-7n4fm" event={"ID":"77fff4ad-5068-4b2b-a2ad-b700d3dc06e1","Type":"ContainerStarted","Data":"1763927722c9e3fb0e57587cc74cf8dd3e915f9246d6d739c1fbba580b9a9406"} Dec 03 14:30:01 crc kubenswrapper[4751]: I1203 14:30:01.211830 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-7n4fm" 
podStartSLOduration=1.211807573 podStartE2EDuration="1.211807573s" podCreationTimestamp="2025-12-03 14:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:30:01.207041703 +0000 UTC m=+1008.195396920" watchObservedRunningTime="2025-12-03 14:30:01.211807573 +0000 UTC m=+1008.200162790" Dec 03 14:30:02 crc kubenswrapper[4751]: I1203 14:30:02.200148 4751 generic.go:334] "Generic (PLEG): container finished" podID="77fff4ad-5068-4b2b-a2ad-b700d3dc06e1" containerID="5b2d03adb8143132bb0e6465f53949ad884338332ac035d6d13370cbe72635d9" exitCode=0 Dec 03 14:30:02 crc kubenswrapper[4751]: I1203 14:30:02.200188 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-7n4fm" event={"ID":"77fff4ad-5068-4b2b-a2ad-b700d3dc06e1","Type":"ContainerDied","Data":"5b2d03adb8143132bb0e6465f53949ad884338332ac035d6d13370cbe72635d9"} Dec 03 14:30:02 crc kubenswrapper[4751]: I1203 14:30:02.208040 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-75cdb5998d-hbntt" Dec 03 14:30:02 crc kubenswrapper[4751]: I1203 14:30:02.917953 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-4fx8c"] Dec 03 14:30:02 crc kubenswrapper[4751]: I1203 14:30:02.921005 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-4fx8c" Dec 03 14:30:02 crc kubenswrapper[4751]: I1203 14:30:02.923276 4751 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 03 14:30:02 crc kubenswrapper[4751]: I1203 14:30:02.923824 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-9zqk5"] Dec 03 14:30:02 crc kubenswrapper[4751]: I1203 14:30:02.924097 4751 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-cf6pl" Dec 03 14:30:02 crc kubenswrapper[4751]: I1203 14:30:02.924770 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 03 14:30:02 crc kubenswrapper[4751]: I1203 14:30:02.924781 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9zqk5" Dec 03 14:30:02 crc kubenswrapper[4751]: I1203 14:30:02.926006 4751 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 03 14:30:02 crc kubenswrapper[4751]: I1203 14:30:02.950835 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-9zqk5"] Dec 03 14:30:02 crc kubenswrapper[4751]: I1203 14:30:02.993001 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-rgj2q"] Dec 03 14:30:02 crc kubenswrapper[4751]: I1203 14:30:02.998693 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-rgj2q" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.003622 4751 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.003838 4751 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.003849 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.004311 4751 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-hgsxg" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.030434 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-92ghf"] Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.031673 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-92ghf" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.036759 4751 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.048041 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwdxf\" (UniqueName: \"kubernetes.io/projected/0d711f1f-ab39-4d20-951b-398bd5c7226c-kube-api-access-hwdxf\") pod \"frr-k8s-webhook-server-7fcb986d4-9zqk5\" (UID: \"0d711f1f-ab39-4d20-951b-398bd5c7226c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9zqk5" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.048082 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/20dc27ab-4ebf-46d7-8f6e-8c703e66fa92-frr-startup\") pod \"frr-k8s-4fx8c\" (UID: \"20dc27ab-4ebf-46d7-8f6e-8c703e66fa92\") " pod="metallb-system/frr-k8s-4fx8c" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.048105 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/20dc27ab-4ebf-46d7-8f6e-8c703e66fa92-reloader\") pod \"frr-k8s-4fx8c\" (UID: \"20dc27ab-4ebf-46d7-8f6e-8c703e66fa92\") " pod="metallb-system/frr-k8s-4fx8c" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.048147 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qm7z\" (UniqueName: \"kubernetes.io/projected/3afabeea-33c4-4bed-a2ca-440c78ff75ad-kube-api-access-5qm7z\") pod \"speaker-rgj2q\" (UID: \"3afabeea-33c4-4bed-a2ca-440c78ff75ad\") " pod="metallb-system/speaker-rgj2q" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.048176 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/20dc27ab-4ebf-46d7-8f6e-8c703e66fa92-frr-sockets\") pod \"frr-k8s-4fx8c\" (UID: \"20dc27ab-4ebf-46d7-8f6e-8c703e66fa92\") " pod="metallb-system/frr-k8s-4fx8c" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.048202 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20dc27ab-4ebf-46d7-8f6e-8c703e66fa92-metrics-certs\") pod \"frr-k8s-4fx8c\" (UID: \"20dc27ab-4ebf-46d7-8f6e-8c703e66fa92\") " pod="metallb-system/frr-k8s-4fx8c" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.048301 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3afabeea-33c4-4bed-a2ca-440c78ff75ad-memberlist\") pod \"speaker-rgj2q\" (UID: \"3afabeea-33c4-4bed-a2ca-440c78ff75ad\") " pod="metallb-system/speaker-rgj2q" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.048345 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d711f1f-ab39-4d20-951b-398bd5c7226c-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-9zqk5\" (UID: \"0d711f1f-ab39-4d20-951b-398bd5c7226c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9zqk5" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.048450 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3afabeea-33c4-4bed-a2ca-440c78ff75ad-metrics-certs\") pod \"speaker-rgj2q\" (UID: \"3afabeea-33c4-4bed-a2ca-440c78ff75ad\") " pod="metallb-system/speaker-rgj2q" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.048507 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: 
\"kubernetes.io/empty-dir/20dc27ab-4ebf-46d7-8f6e-8c703e66fa92-frr-conf\") pod \"frr-k8s-4fx8c\" (UID: \"20dc27ab-4ebf-46d7-8f6e-8c703e66fa92\") " pod="metallb-system/frr-k8s-4fx8c" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.048563 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3afabeea-33c4-4bed-a2ca-440c78ff75ad-metallb-excludel2\") pod \"speaker-rgj2q\" (UID: \"3afabeea-33c4-4bed-a2ca-440c78ff75ad\") " pod="metallb-system/speaker-rgj2q" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.048595 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx2dd\" (UniqueName: \"kubernetes.io/projected/20dc27ab-4ebf-46d7-8f6e-8c703e66fa92-kube-api-access-kx2dd\") pod \"frr-k8s-4fx8c\" (UID: \"20dc27ab-4ebf-46d7-8f6e-8c703e66fa92\") " pod="metallb-system/frr-k8s-4fx8c" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.048629 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/20dc27ab-4ebf-46d7-8f6e-8c703e66fa92-metrics\") pod \"frr-k8s-4fx8c\" (UID: \"20dc27ab-4ebf-46d7-8f6e-8c703e66fa92\") " pod="metallb-system/frr-k8s-4fx8c" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.057431 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-92ghf"] Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.149592 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwdxf\" (UniqueName: \"kubernetes.io/projected/0d711f1f-ab39-4d20-951b-398bd5c7226c-kube-api-access-hwdxf\") pod \"frr-k8s-webhook-server-7fcb986d4-9zqk5\" (UID: \"0d711f1f-ab39-4d20-951b-398bd5c7226c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9zqk5" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.149636 4751 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/20dc27ab-4ebf-46d7-8f6e-8c703e66fa92-frr-startup\") pod \"frr-k8s-4fx8c\" (UID: \"20dc27ab-4ebf-46d7-8f6e-8c703e66fa92\") " pod="metallb-system/frr-k8s-4fx8c" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.149655 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/20dc27ab-4ebf-46d7-8f6e-8c703e66fa92-reloader\") pod \"frr-k8s-4fx8c\" (UID: \"20dc27ab-4ebf-46d7-8f6e-8c703e66fa92\") " pod="metallb-system/frr-k8s-4fx8c" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.149673 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qm7z\" (UniqueName: \"kubernetes.io/projected/3afabeea-33c4-4bed-a2ca-440c78ff75ad-kube-api-access-5qm7z\") pod \"speaker-rgj2q\" (UID: \"3afabeea-33c4-4bed-a2ca-440c78ff75ad\") " pod="metallb-system/speaker-rgj2q" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.149694 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4cd8b40c-f374-4b29-96a6-94137d11fe90-metrics-certs\") pod \"controller-f8648f98b-92ghf\" (UID: \"4cd8b40c-f374-4b29-96a6-94137d11fe90\") " pod="metallb-system/controller-f8648f98b-92ghf" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.149717 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/20dc27ab-4ebf-46d7-8f6e-8c703e66fa92-frr-sockets\") pod \"frr-k8s-4fx8c\" (UID: \"20dc27ab-4ebf-46d7-8f6e-8c703e66fa92\") " pod="metallb-system/frr-k8s-4fx8c" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.149739 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/20dc27ab-4ebf-46d7-8f6e-8c703e66fa92-metrics-certs\") pod \"frr-k8s-4fx8c\" (UID: \"20dc27ab-4ebf-46d7-8f6e-8c703e66fa92\") " pod="metallb-system/frr-k8s-4fx8c" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.149761 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3afabeea-33c4-4bed-a2ca-440c78ff75ad-memberlist\") pod \"speaker-rgj2q\" (UID: \"3afabeea-33c4-4bed-a2ca-440c78ff75ad\") " pod="metallb-system/speaker-rgj2q" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.149777 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d711f1f-ab39-4d20-951b-398bd5c7226c-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-9zqk5\" (UID: \"0d711f1f-ab39-4d20-951b-398bd5c7226c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9zqk5" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.149790 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4cd8b40c-f374-4b29-96a6-94137d11fe90-cert\") pod \"controller-f8648f98b-92ghf\" (UID: \"4cd8b40c-f374-4b29-96a6-94137d11fe90\") " pod="metallb-system/controller-f8648f98b-92ghf" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.149809 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3afabeea-33c4-4bed-a2ca-440c78ff75ad-metrics-certs\") pod \"speaker-rgj2q\" (UID: \"3afabeea-33c4-4bed-a2ca-440c78ff75ad\") " pod="metallb-system/speaker-rgj2q" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.149825 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j9hw\" (UniqueName: \"kubernetes.io/projected/4cd8b40c-f374-4b29-96a6-94137d11fe90-kube-api-access-4j9hw\") pod \"controller-f8648f98b-92ghf\" 
(UID: \"4cd8b40c-f374-4b29-96a6-94137d11fe90\") " pod="metallb-system/controller-f8648f98b-92ghf" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.149844 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/20dc27ab-4ebf-46d7-8f6e-8c703e66fa92-frr-conf\") pod \"frr-k8s-4fx8c\" (UID: \"20dc27ab-4ebf-46d7-8f6e-8c703e66fa92\") " pod="metallb-system/frr-k8s-4fx8c" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.149868 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3afabeea-33c4-4bed-a2ca-440c78ff75ad-metallb-excludel2\") pod \"speaker-rgj2q\" (UID: \"3afabeea-33c4-4bed-a2ca-440c78ff75ad\") " pod="metallb-system/speaker-rgj2q" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.149887 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx2dd\" (UniqueName: \"kubernetes.io/projected/20dc27ab-4ebf-46d7-8f6e-8c703e66fa92-kube-api-access-kx2dd\") pod \"frr-k8s-4fx8c\" (UID: \"20dc27ab-4ebf-46d7-8f6e-8c703e66fa92\") " pod="metallb-system/frr-k8s-4fx8c" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.149903 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/20dc27ab-4ebf-46d7-8f6e-8c703e66fa92-metrics\") pod \"frr-k8s-4fx8c\" (UID: \"20dc27ab-4ebf-46d7-8f6e-8c703e66fa92\") " pod="metallb-system/frr-k8s-4fx8c" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.151290 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/20dc27ab-4ebf-46d7-8f6e-8c703e66fa92-frr-startup\") pod \"frr-k8s-4fx8c\" (UID: \"20dc27ab-4ebf-46d7-8f6e-8c703e66fa92\") " pod="metallb-system/frr-k8s-4fx8c" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.151523 4751 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/20dc27ab-4ebf-46d7-8f6e-8c703e66fa92-reloader\") pod \"frr-k8s-4fx8c\" (UID: \"20dc27ab-4ebf-46d7-8f6e-8c703e66fa92\") " pod="metallb-system/frr-k8s-4fx8c" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.152062 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/20dc27ab-4ebf-46d7-8f6e-8c703e66fa92-frr-sockets\") pod \"frr-k8s-4fx8c\" (UID: \"20dc27ab-4ebf-46d7-8f6e-8c703e66fa92\") " pod="metallb-system/frr-k8s-4fx8c" Dec 03 14:30:03 crc kubenswrapper[4751]: E1203 14:30:03.152554 4751 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 03 14:30:03 crc kubenswrapper[4751]: E1203 14:30:03.152619 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3afabeea-33c4-4bed-a2ca-440c78ff75ad-memberlist podName:3afabeea-33c4-4bed-a2ca-440c78ff75ad nodeName:}" failed. No retries permitted until 2025-12-03 14:30:03.652601081 +0000 UTC m=+1010.640956298 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3afabeea-33c4-4bed-a2ca-440c78ff75ad-memberlist") pod "speaker-rgj2q" (UID: "3afabeea-33c4-4bed-a2ca-440c78ff75ad") : secret "metallb-memberlist" not found Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.152616 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/20dc27ab-4ebf-46d7-8f6e-8c703e66fa92-metrics\") pod \"frr-k8s-4fx8c\" (UID: \"20dc27ab-4ebf-46d7-8f6e-8c703e66fa92\") " pod="metallb-system/frr-k8s-4fx8c" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.152743 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/20dc27ab-4ebf-46d7-8f6e-8c703e66fa92-frr-conf\") pod \"frr-k8s-4fx8c\" (UID: \"20dc27ab-4ebf-46d7-8f6e-8c703e66fa92\") " pod="metallb-system/frr-k8s-4fx8c" Dec 03 14:30:03 crc kubenswrapper[4751]: E1203 14:30:03.152833 4751 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Dec 03 14:30:03 crc kubenswrapper[4751]: E1203 14:30:03.152886 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3afabeea-33c4-4bed-a2ca-440c78ff75ad-metrics-certs podName:3afabeea-33c4-4bed-a2ca-440c78ff75ad nodeName:}" failed. No retries permitted until 2025-12-03 14:30:03.652868839 +0000 UTC m=+1010.641224056 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3afabeea-33c4-4bed-a2ca-440c78ff75ad-metrics-certs") pod "speaker-rgj2q" (UID: "3afabeea-33c4-4bed-a2ca-440c78ff75ad") : secret "speaker-certs-secret" not found Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.153061 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3afabeea-33c4-4bed-a2ca-440c78ff75ad-metallb-excludel2\") pod \"speaker-rgj2q\" (UID: \"3afabeea-33c4-4bed-a2ca-440c78ff75ad\") " pod="metallb-system/speaker-rgj2q" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.159074 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d711f1f-ab39-4d20-951b-398bd5c7226c-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-9zqk5\" (UID: \"0d711f1f-ab39-4d20-951b-398bd5c7226c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9zqk5" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.159463 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20dc27ab-4ebf-46d7-8f6e-8c703e66fa92-metrics-certs\") pod \"frr-k8s-4fx8c\" (UID: \"20dc27ab-4ebf-46d7-8f6e-8c703e66fa92\") " pod="metallb-system/frr-k8s-4fx8c" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.168037 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qm7z\" (UniqueName: \"kubernetes.io/projected/3afabeea-33c4-4bed-a2ca-440c78ff75ad-kube-api-access-5qm7z\") pod \"speaker-rgj2q\" (UID: \"3afabeea-33c4-4bed-a2ca-440c78ff75ad\") " pod="metallb-system/speaker-rgj2q" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.168504 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx2dd\" (UniqueName: \"kubernetes.io/projected/20dc27ab-4ebf-46d7-8f6e-8c703e66fa92-kube-api-access-kx2dd\") pod 
\"frr-k8s-4fx8c\" (UID: \"20dc27ab-4ebf-46d7-8f6e-8c703e66fa92\") " pod="metallb-system/frr-k8s-4fx8c" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.170045 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwdxf\" (UniqueName: \"kubernetes.io/projected/0d711f1f-ab39-4d20-951b-398bd5c7226c-kube-api-access-hwdxf\") pod \"frr-k8s-webhook-server-7fcb986d4-9zqk5\" (UID: \"0d711f1f-ab39-4d20-951b-398bd5c7226c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9zqk5" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.242188 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-4fx8c" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.250979 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9zqk5" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.251627 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4cd8b40c-f374-4b29-96a6-94137d11fe90-metrics-certs\") pod \"controller-f8648f98b-92ghf\" (UID: \"4cd8b40c-f374-4b29-96a6-94137d11fe90\") " pod="metallb-system/controller-f8648f98b-92ghf" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.251692 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4cd8b40c-f374-4b29-96a6-94137d11fe90-cert\") pod \"controller-f8648f98b-92ghf\" (UID: \"4cd8b40c-f374-4b29-96a6-94137d11fe90\") " pod="metallb-system/controller-f8648f98b-92ghf" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.251741 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j9hw\" (UniqueName: \"kubernetes.io/projected/4cd8b40c-f374-4b29-96a6-94137d11fe90-kube-api-access-4j9hw\") pod \"controller-f8648f98b-92ghf\" (UID: 
\"4cd8b40c-f374-4b29-96a6-94137d11fe90\") " pod="metallb-system/controller-f8648f98b-92ghf" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.257312 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4cd8b40c-f374-4b29-96a6-94137d11fe90-cert\") pod \"controller-f8648f98b-92ghf\" (UID: \"4cd8b40c-f374-4b29-96a6-94137d11fe90\") " pod="metallb-system/controller-f8648f98b-92ghf" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.257511 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4cd8b40c-f374-4b29-96a6-94137d11fe90-metrics-certs\") pod \"controller-f8648f98b-92ghf\" (UID: \"4cd8b40c-f374-4b29-96a6-94137d11fe90\") " pod="metallb-system/controller-f8648f98b-92ghf" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.269422 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j9hw\" (UniqueName: \"kubernetes.io/projected/4cd8b40c-f374-4b29-96a6-94137d11fe90-kube-api-access-4j9hw\") pod \"controller-f8648f98b-92ghf\" (UID: \"4cd8b40c-f374-4b29-96a6-94137d11fe90\") " pod="metallb-system/controller-f8648f98b-92ghf" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.363613 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-92ghf" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.500298 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-7n4fm" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.554040 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77fff4ad-5068-4b2b-a2ad-b700d3dc06e1-config-volume\") pod \"77fff4ad-5068-4b2b-a2ad-b700d3dc06e1\" (UID: \"77fff4ad-5068-4b2b-a2ad-b700d3dc06e1\") " Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.554099 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58kdm\" (UniqueName: \"kubernetes.io/projected/77fff4ad-5068-4b2b-a2ad-b700d3dc06e1-kube-api-access-58kdm\") pod \"77fff4ad-5068-4b2b-a2ad-b700d3dc06e1\" (UID: \"77fff4ad-5068-4b2b-a2ad-b700d3dc06e1\") " Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.554163 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77fff4ad-5068-4b2b-a2ad-b700d3dc06e1-secret-volume\") pod \"77fff4ad-5068-4b2b-a2ad-b700d3dc06e1\" (UID: \"77fff4ad-5068-4b2b-a2ad-b700d3dc06e1\") " Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.556054 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77fff4ad-5068-4b2b-a2ad-b700d3dc06e1-config-volume" (OuterVolumeSpecName: "config-volume") pod "77fff4ad-5068-4b2b-a2ad-b700d3dc06e1" (UID: "77fff4ad-5068-4b2b-a2ad-b700d3dc06e1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.560539 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77fff4ad-5068-4b2b-a2ad-b700d3dc06e1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "77fff4ad-5068-4b2b-a2ad-b700d3dc06e1" (UID: "77fff4ad-5068-4b2b-a2ad-b700d3dc06e1"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.561611 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77fff4ad-5068-4b2b-a2ad-b700d3dc06e1-kube-api-access-58kdm" (OuterVolumeSpecName: "kube-api-access-58kdm") pod "77fff4ad-5068-4b2b-a2ad-b700d3dc06e1" (UID: "77fff4ad-5068-4b2b-a2ad-b700d3dc06e1"). InnerVolumeSpecName "kube-api-access-58kdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.655710 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3afabeea-33c4-4bed-a2ca-440c78ff75ad-memberlist\") pod \"speaker-rgj2q\" (UID: \"3afabeea-33c4-4bed-a2ca-440c78ff75ad\") " pod="metallb-system/speaker-rgj2q" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.655769 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3afabeea-33c4-4bed-a2ca-440c78ff75ad-metrics-certs\") pod \"speaker-rgj2q\" (UID: \"3afabeea-33c4-4bed-a2ca-440c78ff75ad\") " pod="metallb-system/speaker-rgj2q" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.655864 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58kdm\" (UniqueName: \"kubernetes.io/projected/77fff4ad-5068-4b2b-a2ad-b700d3dc06e1-kube-api-access-58kdm\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.655876 4751 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77fff4ad-5068-4b2b-a2ad-b700d3dc06e1-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.655886 4751 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/77fff4ad-5068-4b2b-a2ad-b700d3dc06e1-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:03 crc kubenswrapper[4751]: E1203 14:30:03.655947 4751 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 03 14:30:03 crc kubenswrapper[4751]: E1203 14:30:03.656041 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3afabeea-33c4-4bed-a2ca-440c78ff75ad-memberlist podName:3afabeea-33c4-4bed-a2ca-440c78ff75ad nodeName:}" failed. No retries permitted until 2025-12-03 14:30:04.656019294 +0000 UTC m=+1011.644374541 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3afabeea-33c4-4bed-a2ca-440c78ff75ad-memberlist") pod "speaker-rgj2q" (UID: "3afabeea-33c4-4bed-a2ca-440c78ff75ad") : secret "metallb-memberlist" not found Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.662902 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3afabeea-33c4-4bed-a2ca-440c78ff75ad-metrics-certs\") pod \"speaker-rgj2q\" (UID: \"3afabeea-33c4-4bed-a2ca-440c78ff75ad\") " pod="metallb-system/speaker-rgj2q" Dec 03 14:30:03 crc kubenswrapper[4751]: I1203 14:30:03.694221 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-9zqk5"] Dec 03 14:30:03 crc kubenswrapper[4751]: W1203 14:30:03.695046 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d711f1f_ab39_4d20_951b_398bd5c7226c.slice/crio-07f960e00c19ffbec51d8d0454686dd4e7a60f2c1797f8cfdf180b83d33e0a3f WatchSource:0}: Error finding container 07f960e00c19ffbec51d8d0454686dd4e7a60f2c1797f8cfdf180b83d33e0a3f: Status 404 returned error can't find the container with id 07f960e00c19ffbec51d8d0454686dd4e7a60f2c1797f8cfdf180b83d33e0a3f Dec 03 14:30:03 crc 
kubenswrapper[4751]: I1203 14:30:03.833986 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-92ghf"] Dec 03 14:30:03 crc kubenswrapper[4751]: W1203 14:30:03.839684 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cd8b40c_f374_4b29_96a6_94137d11fe90.slice/crio-81d1711d93fdf78919617595a4dd4109eb7d87ac239e7818b9bfc674a90dd1b1 WatchSource:0}: Error finding container 81d1711d93fdf78919617595a4dd4109eb7d87ac239e7818b9bfc674a90dd1b1: Status 404 returned error can't find the container with id 81d1711d93fdf78919617595a4dd4109eb7d87ac239e7818b9bfc674a90dd1b1 Dec 03 14:30:04 crc kubenswrapper[4751]: I1203 14:30:04.216014 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-7n4fm" Dec 03 14:30:04 crc kubenswrapper[4751]: I1203 14:30:04.216045 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-7n4fm" event={"ID":"77fff4ad-5068-4b2b-a2ad-b700d3dc06e1","Type":"ContainerDied","Data":"1763927722c9e3fb0e57587cc74cf8dd3e915f9246d6d739c1fbba580b9a9406"} Dec 03 14:30:04 crc kubenswrapper[4751]: I1203 14:30:04.216533 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1763927722c9e3fb0e57587cc74cf8dd3e915f9246d6d739c1fbba580b9a9406" Dec 03 14:30:04 crc kubenswrapper[4751]: I1203 14:30:04.221212 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9zqk5" event={"ID":"0d711f1f-ab39-4d20-951b-398bd5c7226c","Type":"ContainerStarted","Data":"07f960e00c19ffbec51d8d0454686dd4e7a60f2c1797f8cfdf180b83d33e0a3f"} Dec 03 14:30:04 crc kubenswrapper[4751]: I1203 14:30:04.222940 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4fx8c" 
event={"ID":"20dc27ab-4ebf-46d7-8f6e-8c703e66fa92","Type":"ContainerStarted","Data":"1cb519a376e4356d81eddf7e85234a4377a26defa4e32dbe21557542be466c3c"} Dec 03 14:30:04 crc kubenswrapper[4751]: I1203 14:30:04.225906 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-92ghf" event={"ID":"4cd8b40c-f374-4b29-96a6-94137d11fe90","Type":"ContainerStarted","Data":"1059ec79b425928b81772890458820100f8db25ca6fad9aa07692c41aa6d59b2"} Dec 03 14:30:04 crc kubenswrapper[4751]: I1203 14:30:04.225954 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-92ghf" event={"ID":"4cd8b40c-f374-4b29-96a6-94137d11fe90","Type":"ContainerStarted","Data":"ee52b839d98c7c08e350fd2831590a039040690653f72b03d39008dd5d1d96a5"} Dec 03 14:30:04 crc kubenswrapper[4751]: I1203 14:30:04.225970 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-92ghf" event={"ID":"4cd8b40c-f374-4b29-96a6-94137d11fe90","Type":"ContainerStarted","Data":"81d1711d93fdf78919617595a4dd4109eb7d87ac239e7818b9bfc674a90dd1b1"} Dec 03 14:30:04 crc kubenswrapper[4751]: I1203 14:30:04.226282 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-92ghf" Dec 03 14:30:04 crc kubenswrapper[4751]: I1203 14:30:04.242404 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-92ghf" podStartSLOduration=1.24231087 podStartE2EDuration="1.24231087s" podCreationTimestamp="2025-12-03 14:30:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:30:04.241478637 +0000 UTC m=+1011.229833894" watchObservedRunningTime="2025-12-03 14:30:04.24231087 +0000 UTC m=+1011.230666117" Dec 03 14:30:04 crc kubenswrapper[4751]: I1203 14:30:04.670672 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"memberlist\" (UniqueName: \"kubernetes.io/secret/3afabeea-33c4-4bed-a2ca-440c78ff75ad-memberlist\") pod \"speaker-rgj2q\" (UID: \"3afabeea-33c4-4bed-a2ca-440c78ff75ad\") " pod="metallb-system/speaker-rgj2q" Dec 03 14:30:04 crc kubenswrapper[4751]: I1203 14:30:04.690904 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3afabeea-33c4-4bed-a2ca-440c78ff75ad-memberlist\") pod \"speaker-rgj2q\" (UID: \"3afabeea-33c4-4bed-a2ca-440c78ff75ad\") " pod="metallb-system/speaker-rgj2q" Dec 03 14:30:04 crc kubenswrapper[4751]: I1203 14:30:04.813922 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-rgj2q" Dec 03 14:30:04 crc kubenswrapper[4751]: W1203 14:30:04.839587 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3afabeea_33c4_4bed_a2ca_440c78ff75ad.slice/crio-6dd351e7882ec3179cda2777589b908d5f1d1f6f75c9706e1e4b5b4bde0df583 WatchSource:0}: Error finding container 6dd351e7882ec3179cda2777589b908d5f1d1f6f75c9706e1e4b5b4bde0df583: Status 404 returned error can't find the container with id 6dd351e7882ec3179cda2777589b908d5f1d1f6f75c9706e1e4b5b4bde0df583 Dec 03 14:30:05 crc kubenswrapper[4751]: I1203 14:30:05.255429 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-rgj2q" event={"ID":"3afabeea-33c4-4bed-a2ca-440c78ff75ad","Type":"ContainerStarted","Data":"4dc087811a8186fde508d72ba6cfeae9adb5b5c7dbb0301d64de28398e5beb10"} Dec 03 14:30:05 crc kubenswrapper[4751]: I1203 14:30:05.255488 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-rgj2q" event={"ID":"3afabeea-33c4-4bed-a2ca-440c78ff75ad","Type":"ContainerStarted","Data":"6dd351e7882ec3179cda2777589b908d5f1d1f6f75c9706e1e4b5b4bde0df583"} Dec 03 14:30:05 crc kubenswrapper[4751]: I1203 14:30:05.820295 4751 patch_prober.go:28] interesting 
pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:30:05 crc kubenswrapper[4751]: I1203 14:30:05.820709 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:30:06 crc kubenswrapper[4751]: I1203 14:30:06.264205 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-rgj2q" event={"ID":"3afabeea-33c4-4bed-a2ca-440c78ff75ad","Type":"ContainerStarted","Data":"a0d1fe5d3e294d64fdae427faf9566d7cacab1e43bdcf095eec1142e49447d3a"} Dec 03 14:30:06 crc kubenswrapper[4751]: I1203 14:30:06.264682 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-rgj2q" Dec 03 14:30:06 crc kubenswrapper[4751]: I1203 14:30:06.281889 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-rgj2q" podStartSLOduration=4.281872935 podStartE2EDuration="4.281872935s" podCreationTimestamp="2025-12-03 14:30:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:30:06.27876232 +0000 UTC m=+1013.267117547" watchObservedRunningTime="2025-12-03 14:30:06.281872935 +0000 UTC m=+1013.270228152" Dec 03 14:30:11 crc kubenswrapper[4751]: I1203 14:30:11.307836 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9zqk5" 
event={"ID":"0d711f1f-ab39-4d20-951b-398bd5c7226c","Type":"ContainerStarted","Data":"6600e1b2eeb786f5a62f6fc7929ec7b9fd303303f307c78825fc31d5b1dbd3dc"} Dec 03 14:30:11 crc kubenswrapper[4751]: I1203 14:30:11.308278 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9zqk5" Dec 03 14:30:11 crc kubenswrapper[4751]: I1203 14:30:11.309802 4751 generic.go:334] "Generic (PLEG): container finished" podID="20dc27ab-4ebf-46d7-8f6e-8c703e66fa92" containerID="02ddbd3e032c403ecd5e08b8701d0e7809b27861aa118cd3da6e9cbdbf4ace19" exitCode=0 Dec 03 14:30:11 crc kubenswrapper[4751]: I1203 14:30:11.309848 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4fx8c" event={"ID":"20dc27ab-4ebf-46d7-8f6e-8c703e66fa92","Type":"ContainerDied","Data":"02ddbd3e032c403ecd5e08b8701d0e7809b27861aa118cd3da6e9cbdbf4ace19"} Dec 03 14:30:11 crc kubenswrapper[4751]: I1203 14:30:11.328991 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9zqk5" podStartSLOduration=2.166138359 podStartE2EDuration="9.328912839s" podCreationTimestamp="2025-12-03 14:30:02 +0000 UTC" firstStartedPulling="2025-12-03 14:30:03.697185868 +0000 UTC m=+1010.685541085" lastFinishedPulling="2025-12-03 14:30:10.859960348 +0000 UTC m=+1017.848315565" observedRunningTime="2025-12-03 14:30:11.323963354 +0000 UTC m=+1018.312318571" watchObservedRunningTime="2025-12-03 14:30:11.328912839 +0000 UTC m=+1018.317268056" Dec 03 14:30:12 crc kubenswrapper[4751]: I1203 14:30:12.317151 4751 generic.go:334] "Generic (PLEG): container finished" podID="20dc27ab-4ebf-46d7-8f6e-8c703e66fa92" containerID="3de08618bd8e37e4d33e130651b83a92c9028e64d3a521b88b9fa5ada97b515d" exitCode=0 Dec 03 14:30:12 crc kubenswrapper[4751]: I1203 14:30:12.317648 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4fx8c" 
event={"ID":"20dc27ab-4ebf-46d7-8f6e-8c703e66fa92","Type":"ContainerDied","Data":"3de08618bd8e37e4d33e130651b83a92c9028e64d3a521b88b9fa5ada97b515d"} Dec 03 14:30:13 crc kubenswrapper[4751]: I1203 14:30:13.325739 4751 generic.go:334] "Generic (PLEG): container finished" podID="20dc27ab-4ebf-46d7-8f6e-8c703e66fa92" containerID="524de3eb78c6cedccf24cec07087337f1753a00232b86988b581bd97671c025e" exitCode=0 Dec 03 14:30:13 crc kubenswrapper[4751]: I1203 14:30:13.325974 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4fx8c" event={"ID":"20dc27ab-4ebf-46d7-8f6e-8c703e66fa92","Type":"ContainerDied","Data":"524de3eb78c6cedccf24cec07087337f1753a00232b86988b581bd97671c025e"} Dec 03 14:30:13 crc kubenswrapper[4751]: I1203 14:30:13.370252 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-92ghf" Dec 03 14:30:14 crc kubenswrapper[4751]: I1203 14:30:14.335969 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4fx8c" event={"ID":"20dc27ab-4ebf-46d7-8f6e-8c703e66fa92","Type":"ContainerStarted","Data":"2bcc131464f89ef5994583681a868d8700d27cbfd0109e1c153f28baf1ca5a3e"} Dec 03 14:30:14 crc kubenswrapper[4751]: I1203 14:30:14.336319 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4fx8c" event={"ID":"20dc27ab-4ebf-46d7-8f6e-8c703e66fa92","Type":"ContainerStarted","Data":"4dea52feef152518ac39aacbf9eeb36cca4fdfa8a9d3c74bafddb52c47d32fcd"} Dec 03 14:30:14 crc kubenswrapper[4751]: I1203 14:30:14.336370 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4fx8c" event={"ID":"20dc27ab-4ebf-46d7-8f6e-8c703e66fa92","Type":"ContainerStarted","Data":"609efded8ca7af0f3a6c28a32ecbb5ac479ce21118db4ef8e5b258c0aaaf396f"} Dec 03 14:30:14 crc kubenswrapper[4751]: I1203 14:30:14.336381 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4fx8c" 
event={"ID":"20dc27ab-4ebf-46d7-8f6e-8c703e66fa92","Type":"ContainerStarted","Data":"7d5445cbd298d66da3e6f12554574fa1d2dab2e4fce26276642dc98ecd3183e6"} Dec 03 14:30:14 crc kubenswrapper[4751]: I1203 14:30:14.336390 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4fx8c" event={"ID":"20dc27ab-4ebf-46d7-8f6e-8c703e66fa92","Type":"ContainerStarted","Data":"db0285645552c2c934a78396a50c70da41d0e392ceb78f0a24fcfe094dfed5a5"} Dec 03 14:30:15 crc kubenswrapper[4751]: I1203 14:30:15.346610 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4fx8c" event={"ID":"20dc27ab-4ebf-46d7-8f6e-8c703e66fa92","Type":"ContainerStarted","Data":"f3e96036e6e98e5f9e8f264f2946495436f2914a836fb87e4e1b419e008a50c9"} Dec 03 14:30:15 crc kubenswrapper[4751]: I1203 14:30:15.346772 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-4fx8c" Dec 03 14:30:15 crc kubenswrapper[4751]: I1203 14:30:15.376102 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-4fx8c" podStartSLOduration=5.951888885 podStartE2EDuration="13.376085147s" podCreationTimestamp="2025-12-03 14:30:02 +0000 UTC" firstStartedPulling="2025-12-03 14:30:03.435503139 +0000 UTC m=+1010.423858356" lastFinishedPulling="2025-12-03 14:30:10.859699401 +0000 UTC m=+1017.848054618" observedRunningTime="2025-12-03 14:30:15.37508592 +0000 UTC m=+1022.363441137" watchObservedRunningTime="2025-12-03 14:30:15.376085147 +0000 UTC m=+1022.364440374" Dec 03 14:30:18 crc kubenswrapper[4751]: I1203 14:30:18.320961 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-4fx8c" Dec 03 14:30:18 crc kubenswrapper[4751]: I1203 14:30:18.396100 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-4fx8c" Dec 03 14:30:23 crc kubenswrapper[4751]: I1203 14:30:23.243892 4751 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="metallb-system/frr-k8s-4fx8c" Dec 03 14:30:23 crc kubenswrapper[4751]: I1203 14:30:23.261017 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9zqk5" Dec 03 14:30:24 crc kubenswrapper[4751]: I1203 14:30:24.818490 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-rgj2q" Dec 03 14:30:27 crc kubenswrapper[4751]: I1203 14:30:27.675189 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-hvw5d"] Dec 03 14:30:27 crc kubenswrapper[4751]: E1203 14:30:27.675714 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77fff4ad-5068-4b2b-a2ad-b700d3dc06e1" containerName="collect-profiles" Dec 03 14:30:27 crc kubenswrapper[4751]: I1203 14:30:27.675728 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="77fff4ad-5068-4b2b-a2ad-b700d3dc06e1" containerName="collect-profiles" Dec 03 14:30:27 crc kubenswrapper[4751]: I1203 14:30:27.675854 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="77fff4ad-5068-4b2b-a2ad-b700d3dc06e1" containerName="collect-profiles" Dec 03 14:30:27 crc kubenswrapper[4751]: I1203 14:30:27.676351 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-hvw5d" Dec 03 14:30:27 crc kubenswrapper[4751]: I1203 14:30:27.678130 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-j48j5" Dec 03 14:30:27 crc kubenswrapper[4751]: I1203 14:30:27.678579 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 03 14:30:27 crc kubenswrapper[4751]: I1203 14:30:27.690222 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hvw5d"] Dec 03 14:30:27 crc kubenswrapper[4751]: I1203 14:30:27.690404 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 03 14:30:27 crc kubenswrapper[4751]: I1203 14:30:27.757906 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sft7k\" (UniqueName: \"kubernetes.io/projected/d65c2e9b-7620-41fb-af12-60e0eb8caa39-kube-api-access-sft7k\") pod \"openstack-operator-index-hvw5d\" (UID: \"d65c2e9b-7620-41fb-af12-60e0eb8caa39\") " pod="openstack-operators/openstack-operator-index-hvw5d" Dec 03 14:30:27 crc kubenswrapper[4751]: I1203 14:30:27.859651 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sft7k\" (UniqueName: \"kubernetes.io/projected/d65c2e9b-7620-41fb-af12-60e0eb8caa39-kube-api-access-sft7k\") pod \"openstack-operator-index-hvw5d\" (UID: \"d65c2e9b-7620-41fb-af12-60e0eb8caa39\") " pod="openstack-operators/openstack-operator-index-hvw5d" Dec 03 14:30:27 crc kubenswrapper[4751]: I1203 14:30:27.882522 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sft7k\" (UniqueName: \"kubernetes.io/projected/d65c2e9b-7620-41fb-af12-60e0eb8caa39-kube-api-access-sft7k\") pod \"openstack-operator-index-hvw5d\" (UID: 
\"d65c2e9b-7620-41fb-af12-60e0eb8caa39\") " pod="openstack-operators/openstack-operator-index-hvw5d" Dec 03 14:30:28 crc kubenswrapper[4751]: I1203 14:30:28.028616 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hvw5d" Dec 03 14:30:28 crc kubenswrapper[4751]: I1203 14:30:28.435508 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hvw5d"] Dec 03 14:30:29 crc kubenswrapper[4751]: I1203 14:30:29.444299 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hvw5d" event={"ID":"d65c2e9b-7620-41fb-af12-60e0eb8caa39","Type":"ContainerStarted","Data":"0011d7083230a0a5e9d99456ad2c0313ebc1c61efae58de4b2947cd18cfe5faf"} Dec 03 14:30:30 crc kubenswrapper[4751]: I1203 14:30:30.853600 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-hvw5d"] Dec 03 14:30:31 crc kubenswrapper[4751]: I1203 14:30:31.472248 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-jqt9t"] Dec 03 14:30:31 crc kubenswrapper[4751]: I1203 14:30:31.474051 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-jqt9t" Dec 03 14:30:31 crc kubenswrapper[4751]: I1203 14:30:31.479662 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jqt9t"] Dec 03 14:30:31 crc kubenswrapper[4751]: I1203 14:30:31.510712 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh65p\" (UniqueName: \"kubernetes.io/projected/8a1c208b-28d2-4d51-a98e-ffece8c3d11e-kube-api-access-kh65p\") pod \"openstack-operator-index-jqt9t\" (UID: \"8a1c208b-28d2-4d51-a98e-ffece8c3d11e\") " pod="openstack-operators/openstack-operator-index-jqt9t" Dec 03 14:30:31 crc kubenswrapper[4751]: I1203 14:30:31.611666 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh65p\" (UniqueName: \"kubernetes.io/projected/8a1c208b-28d2-4d51-a98e-ffece8c3d11e-kube-api-access-kh65p\") pod \"openstack-operator-index-jqt9t\" (UID: \"8a1c208b-28d2-4d51-a98e-ffece8c3d11e\") " pod="openstack-operators/openstack-operator-index-jqt9t" Dec 03 14:30:31 crc kubenswrapper[4751]: I1203 14:30:31.630444 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh65p\" (UniqueName: \"kubernetes.io/projected/8a1c208b-28d2-4d51-a98e-ffece8c3d11e-kube-api-access-kh65p\") pod \"openstack-operator-index-jqt9t\" (UID: \"8a1c208b-28d2-4d51-a98e-ffece8c3d11e\") " pod="openstack-operators/openstack-operator-index-jqt9t" Dec 03 14:30:31 crc kubenswrapper[4751]: I1203 14:30:31.795902 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-jqt9t" Dec 03 14:30:32 crc kubenswrapper[4751]: I1203 14:30:32.641005 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jqt9t"] Dec 03 14:30:32 crc kubenswrapper[4751]: W1203 14:30:32.765518 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a1c208b_28d2_4d51_a98e_ffece8c3d11e.slice/crio-4469da45e65e2b2b630759673bccafd2933e4bd4075c133eea57dfe569bb7b56 WatchSource:0}: Error finding container 4469da45e65e2b2b630759673bccafd2933e4bd4075c133eea57dfe569bb7b56: Status 404 returned error can't find the container with id 4469da45e65e2b2b630759673bccafd2933e4bd4075c133eea57dfe569bb7b56 Dec 03 14:30:33 crc kubenswrapper[4751]: I1203 14:30:33.501255 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jqt9t" event={"ID":"8a1c208b-28d2-4d51-a98e-ffece8c3d11e","Type":"ContainerStarted","Data":"4469da45e65e2b2b630759673bccafd2933e4bd4075c133eea57dfe569bb7b56"} Dec 03 14:30:34 crc kubenswrapper[4751]: I1203 14:30:34.511175 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jqt9t" event={"ID":"8a1c208b-28d2-4d51-a98e-ffece8c3d11e","Type":"ContainerStarted","Data":"6d78347e5f77ce990e934371543777852ae086d94be70cc06c9f1cff60d92fbc"} Dec 03 14:30:34 crc kubenswrapper[4751]: I1203 14:30:34.513102 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hvw5d" event={"ID":"d65c2e9b-7620-41fb-af12-60e0eb8caa39","Type":"ContainerStarted","Data":"4e3e02342b9952b0bda57e8e1ba62cfe23d540405ba0d885de2fd4a5ec95ff76"} Dec 03 14:30:34 crc kubenswrapper[4751]: I1203 14:30:34.513289 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-hvw5d" 
podUID="d65c2e9b-7620-41fb-af12-60e0eb8caa39" containerName="registry-server" containerID="cri-o://4e3e02342b9952b0bda57e8e1ba62cfe23d540405ba0d885de2fd4a5ec95ff76" gracePeriod=2 Dec 03 14:30:34 crc kubenswrapper[4751]: I1203 14:30:34.529544 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-jqt9t" podStartSLOduration=3.00487028 podStartE2EDuration="3.529522206s" podCreationTimestamp="2025-12-03 14:30:31 +0000 UTC" firstStartedPulling="2025-12-03 14:30:32.766965227 +0000 UTC m=+1039.755320444" lastFinishedPulling="2025-12-03 14:30:33.291617153 +0000 UTC m=+1040.279972370" observedRunningTime="2025-12-03 14:30:34.5271354 +0000 UTC m=+1041.515490627" watchObservedRunningTime="2025-12-03 14:30:34.529522206 +0000 UTC m=+1041.517877433" Dec 03 14:30:34 crc kubenswrapper[4751]: I1203 14:30:34.549289 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-hvw5d" podStartSLOduration=2.702802271 podStartE2EDuration="7.54926806s" podCreationTimestamp="2025-12-03 14:30:27 +0000 UTC" firstStartedPulling="2025-12-03 14:30:28.443359614 +0000 UTC m=+1035.431714831" lastFinishedPulling="2025-12-03 14:30:33.289825403 +0000 UTC m=+1040.278180620" observedRunningTime="2025-12-03 14:30:34.54891024 +0000 UTC m=+1041.537265457" watchObservedRunningTime="2025-12-03 14:30:34.54926806 +0000 UTC m=+1041.537623277" Dec 03 14:30:34 crc kubenswrapper[4751]: I1203 14:30:34.914444 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-hvw5d" Dec 03 14:30:35 crc kubenswrapper[4751]: I1203 14:30:35.055858 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sft7k\" (UniqueName: \"kubernetes.io/projected/d65c2e9b-7620-41fb-af12-60e0eb8caa39-kube-api-access-sft7k\") pod \"d65c2e9b-7620-41fb-af12-60e0eb8caa39\" (UID: \"d65c2e9b-7620-41fb-af12-60e0eb8caa39\") " Dec 03 14:30:35 crc kubenswrapper[4751]: I1203 14:30:35.063988 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d65c2e9b-7620-41fb-af12-60e0eb8caa39-kube-api-access-sft7k" (OuterVolumeSpecName: "kube-api-access-sft7k") pod "d65c2e9b-7620-41fb-af12-60e0eb8caa39" (UID: "d65c2e9b-7620-41fb-af12-60e0eb8caa39"). InnerVolumeSpecName "kube-api-access-sft7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:30:35 crc kubenswrapper[4751]: I1203 14:30:35.157429 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sft7k\" (UniqueName: \"kubernetes.io/projected/d65c2e9b-7620-41fb-af12-60e0eb8caa39-kube-api-access-sft7k\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:35 crc kubenswrapper[4751]: I1203 14:30:35.520382 4751 generic.go:334] "Generic (PLEG): container finished" podID="d65c2e9b-7620-41fb-af12-60e0eb8caa39" containerID="4e3e02342b9952b0bda57e8e1ba62cfe23d540405ba0d885de2fd4a5ec95ff76" exitCode=0 Dec 03 14:30:35 crc kubenswrapper[4751]: I1203 14:30:35.520437 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-hvw5d" Dec 03 14:30:35 crc kubenswrapper[4751]: I1203 14:30:35.520454 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hvw5d" event={"ID":"d65c2e9b-7620-41fb-af12-60e0eb8caa39","Type":"ContainerDied","Data":"4e3e02342b9952b0bda57e8e1ba62cfe23d540405ba0d885de2fd4a5ec95ff76"} Dec 03 14:30:35 crc kubenswrapper[4751]: I1203 14:30:35.521099 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hvw5d" event={"ID":"d65c2e9b-7620-41fb-af12-60e0eb8caa39","Type":"ContainerDied","Data":"0011d7083230a0a5e9d99456ad2c0313ebc1c61efae58de4b2947cd18cfe5faf"} Dec 03 14:30:35 crc kubenswrapper[4751]: I1203 14:30:35.521121 4751 scope.go:117] "RemoveContainer" containerID="4e3e02342b9952b0bda57e8e1ba62cfe23d540405ba0d885de2fd4a5ec95ff76" Dec 03 14:30:35 crc kubenswrapper[4751]: I1203 14:30:35.543048 4751 scope.go:117] "RemoveContainer" containerID="4e3e02342b9952b0bda57e8e1ba62cfe23d540405ba0d885de2fd4a5ec95ff76" Dec 03 14:30:35 crc kubenswrapper[4751]: E1203 14:30:35.543474 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e3e02342b9952b0bda57e8e1ba62cfe23d540405ba0d885de2fd4a5ec95ff76\": container with ID starting with 4e3e02342b9952b0bda57e8e1ba62cfe23d540405ba0d885de2fd4a5ec95ff76 not found: ID does not exist" containerID="4e3e02342b9952b0bda57e8e1ba62cfe23d540405ba0d885de2fd4a5ec95ff76" Dec 03 14:30:35 crc kubenswrapper[4751]: I1203 14:30:35.543557 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e3e02342b9952b0bda57e8e1ba62cfe23d540405ba0d885de2fd4a5ec95ff76"} err="failed to get container status \"4e3e02342b9952b0bda57e8e1ba62cfe23d540405ba0d885de2fd4a5ec95ff76\": rpc error: code = NotFound desc = could not find container 
\"4e3e02342b9952b0bda57e8e1ba62cfe23d540405ba0d885de2fd4a5ec95ff76\": container with ID starting with 4e3e02342b9952b0bda57e8e1ba62cfe23d540405ba0d885de2fd4a5ec95ff76 not found: ID does not exist" Dec 03 14:30:35 crc kubenswrapper[4751]: I1203 14:30:35.548793 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-hvw5d"] Dec 03 14:30:35 crc kubenswrapper[4751]: I1203 14:30:35.554748 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-hvw5d"] Dec 03 14:30:35 crc kubenswrapper[4751]: I1203 14:30:35.822257 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:30:35 crc kubenswrapper[4751]: I1203 14:30:35.822754 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:30:37 crc kubenswrapper[4751]: I1203 14:30:37.323792 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d65c2e9b-7620-41fb-af12-60e0eb8caa39" path="/var/lib/kubelet/pods/d65c2e9b-7620-41fb-af12-60e0eb8caa39/volumes" Dec 03 14:30:41 crc kubenswrapper[4751]: I1203 14:30:41.796418 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-jqt9t" Dec 03 14:30:41 crc kubenswrapper[4751]: I1203 14:30:41.797084 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-jqt9t" Dec 03 14:30:41 crc kubenswrapper[4751]: I1203 14:30:41.830042 4751 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-jqt9t" Dec 03 14:30:42 crc kubenswrapper[4751]: I1203 14:30:42.591658 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-jqt9t" Dec 03 14:30:50 crc kubenswrapper[4751]: I1203 14:30:50.671905 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg"] Dec 03 14:30:50 crc kubenswrapper[4751]: E1203 14:30:50.672646 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d65c2e9b-7620-41fb-af12-60e0eb8caa39" containerName="registry-server" Dec 03 14:30:50 crc kubenswrapper[4751]: I1203 14:30:50.672658 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="d65c2e9b-7620-41fb-af12-60e0eb8caa39" containerName="registry-server" Dec 03 14:30:50 crc kubenswrapper[4751]: I1203 14:30:50.672793 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="d65c2e9b-7620-41fb-af12-60e0eb8caa39" containerName="registry-server" Dec 03 14:30:50 crc kubenswrapper[4751]: I1203 14:30:50.673593 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg" Dec 03 14:30:50 crc kubenswrapper[4751]: I1203 14:30:50.675666 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-lprjz" Dec 03 14:30:50 crc kubenswrapper[4751]: I1203 14:30:50.686291 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg"] Dec 03 14:30:50 crc kubenswrapper[4751]: I1203 14:30:50.864925 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppkkn\" (UniqueName: \"kubernetes.io/projected/86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0-kube-api-access-ppkkn\") pod \"a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg\" (UID: \"86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0\") " pod="openstack-operators/a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg" Dec 03 14:30:50 crc kubenswrapper[4751]: I1203 14:30:50.864997 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0-bundle\") pod \"a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg\" (UID: \"86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0\") " pod="openstack-operators/a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg" Dec 03 14:30:50 crc kubenswrapper[4751]: I1203 14:30:50.865273 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0-util\") pod \"a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg\" (UID: \"86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0\") " pod="openstack-operators/a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg" Dec 03 14:30:50 crc kubenswrapper[4751]: I1203 
14:30:50.966984 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0-bundle\") pod \"a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg\" (UID: \"86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0\") " pod="openstack-operators/a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg" Dec 03 14:30:50 crc kubenswrapper[4751]: I1203 14:30:50.967423 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0-util\") pod \"a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg\" (UID: \"86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0\") " pod="openstack-operators/a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg" Dec 03 14:30:50 crc kubenswrapper[4751]: I1203 14:30:50.967591 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppkkn\" (UniqueName: \"kubernetes.io/projected/86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0-kube-api-access-ppkkn\") pod \"a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg\" (UID: \"86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0\") " pod="openstack-operators/a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg" Dec 03 14:30:50 crc kubenswrapper[4751]: I1203 14:30:50.967621 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0-bundle\") pod \"a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg\" (UID: \"86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0\") " pod="openstack-operators/a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg" Dec 03 14:30:50 crc kubenswrapper[4751]: I1203 14:30:50.967973 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0-util\") pod \"a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg\" (UID: \"86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0\") " pod="openstack-operators/a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg" Dec 03 14:30:50 crc kubenswrapper[4751]: I1203 14:30:50.991364 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppkkn\" (UniqueName: \"kubernetes.io/projected/86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0-kube-api-access-ppkkn\") pod \"a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg\" (UID: \"86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0\") " pod="openstack-operators/a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg" Dec 03 14:30:50 crc kubenswrapper[4751]: I1203 14:30:50.991662 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg" Dec 03 14:30:51 crc kubenswrapper[4751]: I1203 14:30:51.393618 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg"] Dec 03 14:30:51 crc kubenswrapper[4751]: I1203 14:30:51.623873 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg" event={"ID":"86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0","Type":"ContainerStarted","Data":"0fb0832d0b9b5aa5b08cb531ec754830f544c76990cc23c141b06c60dfd948d7"} Dec 03 14:30:53 crc kubenswrapper[4751]: I1203 14:30:53.658525 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg" event={"ID":"86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0","Type":"ContainerDied","Data":"f54574929db1ca4d7767f325c710fd87d1b0d0650b6d883a84c61dee0a2a8187"} Dec 03 14:30:53 crc kubenswrapper[4751]: I1203 14:30:53.658213 4751 generic.go:334] 
"Generic (PLEG): container finished" podID="86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0" containerID="f54574929db1ca4d7767f325c710fd87d1b0d0650b6d883a84c61dee0a2a8187" exitCode=0 Dec 03 14:30:53 crc kubenswrapper[4751]: I1203 14:30:53.662754 4751 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 14:30:56 crc kubenswrapper[4751]: I1203 14:30:56.692823 4751 generic.go:334] "Generic (PLEG): container finished" podID="86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0" containerID="ebe01598f8a95b99d6c7ca3e01ace7f1b5da7561a7b605b553b3a9f7cb97acf5" exitCode=0 Dec 03 14:30:56 crc kubenswrapper[4751]: I1203 14:30:56.693043 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg" event={"ID":"86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0","Type":"ContainerDied","Data":"ebe01598f8a95b99d6c7ca3e01ace7f1b5da7561a7b605b553b3a9f7cb97acf5"} Dec 03 14:30:57 crc kubenswrapper[4751]: I1203 14:30:57.702780 4751 generic.go:334] "Generic (PLEG): container finished" podID="86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0" containerID="27a140e0d07c2f5ec04651a3f28e77bf2f057df9686fa8f6c34aaddfcca7f0e0" exitCode=0 Dec 03 14:30:57 crc kubenswrapper[4751]: I1203 14:30:57.702952 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg" event={"ID":"86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0","Type":"ContainerDied","Data":"27a140e0d07c2f5ec04651a3f28e77bf2f057df9686fa8f6c34aaddfcca7f0e0"} Dec 03 14:30:58 crc kubenswrapper[4751]: I1203 14:30:58.975642 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg" Dec 03 14:30:58 crc kubenswrapper[4751]: I1203 14:30:58.981309 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppkkn\" (UniqueName: \"kubernetes.io/projected/86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0-kube-api-access-ppkkn\") pod \"86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0\" (UID: \"86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0\") " Dec 03 14:30:58 crc kubenswrapper[4751]: I1203 14:30:58.981486 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0-bundle\") pod \"86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0\" (UID: \"86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0\") " Dec 03 14:30:58 crc kubenswrapper[4751]: I1203 14:30:58.981602 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0-util\") pod \"86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0\" (UID: \"86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0\") " Dec 03 14:30:58 crc kubenswrapper[4751]: I1203 14:30:58.983651 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0-bundle" (OuterVolumeSpecName: "bundle") pod "86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0" (UID: "86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:30:58 crc kubenswrapper[4751]: I1203 14:30:58.986605 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0-kube-api-access-ppkkn" (OuterVolumeSpecName: "kube-api-access-ppkkn") pod "86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0" (UID: "86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0"). InnerVolumeSpecName "kube-api-access-ppkkn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:30:59 crc kubenswrapper[4751]: I1203 14:30:59.003991 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0-util" (OuterVolumeSpecName: "util") pod "86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0" (UID: "86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:30:59 crc kubenswrapper[4751]: I1203 14:30:59.083111 4751 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:59 crc kubenswrapper[4751]: I1203 14:30:59.083378 4751 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0-util\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:59 crc kubenswrapper[4751]: I1203 14:30:59.083390 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppkkn\" (UniqueName: \"kubernetes.io/projected/86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0-kube-api-access-ppkkn\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:59 crc kubenswrapper[4751]: I1203 14:30:59.720833 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg" event={"ID":"86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0","Type":"ContainerDied","Data":"0fb0832d0b9b5aa5b08cb531ec754830f544c76990cc23c141b06c60dfd948d7"} Dec 03 14:30:59 crc kubenswrapper[4751]: I1203 14:30:59.720891 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fb0832d0b9b5aa5b08cb531ec754830f544c76990cc23c141b06c60dfd948d7" Dec 03 14:30:59 crc kubenswrapper[4751]: I1203 14:30:59.720911 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg" Dec 03 14:31:03 crc kubenswrapper[4751]: I1203 14:31:03.350463 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-698cb7586c-qft9p"] Dec 03 14:31:03 crc kubenswrapper[4751]: E1203 14:31:03.351064 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0" containerName="util" Dec 03 14:31:03 crc kubenswrapper[4751]: I1203 14:31:03.351079 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0" containerName="util" Dec 03 14:31:03 crc kubenswrapper[4751]: E1203 14:31:03.351099 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0" containerName="pull" Dec 03 14:31:03 crc kubenswrapper[4751]: I1203 14:31:03.351107 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0" containerName="pull" Dec 03 14:31:03 crc kubenswrapper[4751]: E1203 14:31:03.351125 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0" containerName="extract" Dec 03 14:31:03 crc kubenswrapper[4751]: I1203 14:31:03.351133 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0" containerName="extract" Dec 03 14:31:03 crc kubenswrapper[4751]: I1203 14:31:03.351271 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0" containerName="extract" Dec 03 14:31:03 crc kubenswrapper[4751]: I1203 14:31:03.351832 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-698cb7586c-qft9p" Dec 03 14:31:03 crc kubenswrapper[4751]: I1203 14:31:03.356222 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-m9r8z" Dec 03 14:31:03 crc kubenswrapper[4751]: I1203 14:31:03.384773 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-698cb7586c-qft9p"] Dec 03 14:31:03 crc kubenswrapper[4751]: I1203 14:31:03.445821 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhg9f\" (UniqueName: \"kubernetes.io/projected/6aeb43b5-b817-4d39-81de-bc6f27afb55b-kube-api-access-mhg9f\") pod \"openstack-operator-controller-operator-698cb7586c-qft9p\" (UID: \"6aeb43b5-b817-4d39-81de-bc6f27afb55b\") " pod="openstack-operators/openstack-operator-controller-operator-698cb7586c-qft9p" Dec 03 14:31:03 crc kubenswrapper[4751]: I1203 14:31:03.546914 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhg9f\" (UniqueName: \"kubernetes.io/projected/6aeb43b5-b817-4d39-81de-bc6f27afb55b-kube-api-access-mhg9f\") pod \"openstack-operator-controller-operator-698cb7586c-qft9p\" (UID: \"6aeb43b5-b817-4d39-81de-bc6f27afb55b\") " pod="openstack-operators/openstack-operator-controller-operator-698cb7586c-qft9p" Dec 03 14:31:03 crc kubenswrapper[4751]: I1203 14:31:03.573545 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhg9f\" (UniqueName: \"kubernetes.io/projected/6aeb43b5-b817-4d39-81de-bc6f27afb55b-kube-api-access-mhg9f\") pod \"openstack-operator-controller-operator-698cb7586c-qft9p\" (UID: \"6aeb43b5-b817-4d39-81de-bc6f27afb55b\") " pod="openstack-operators/openstack-operator-controller-operator-698cb7586c-qft9p" Dec 03 14:31:03 crc kubenswrapper[4751]: I1203 14:31:03.671590 4751 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-698cb7586c-qft9p" Dec 03 14:31:04 crc kubenswrapper[4751]: I1203 14:31:04.136293 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-698cb7586c-qft9p"] Dec 03 14:31:04 crc kubenswrapper[4751]: I1203 14:31:04.764114 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-698cb7586c-qft9p" event={"ID":"6aeb43b5-b817-4d39-81de-bc6f27afb55b","Type":"ContainerStarted","Data":"e234f8d2a5262ce6001642c821cdf2cd76234dd8f1500617cb7a5ecc68fbd1b3"} Dec 03 14:31:05 crc kubenswrapper[4751]: I1203 14:31:05.820125 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:31:05 crc kubenswrapper[4751]: I1203 14:31:05.820477 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:31:05 crc kubenswrapper[4751]: I1203 14:31:05.820529 4751 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-djf67" Dec 03 14:31:05 crc kubenswrapper[4751]: I1203 14:31:05.821141 4751 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6554aa9d5e7898bf5e07fe04c4800b61a41046f1b56d94f87ff1d09b45063fa3"} 
pod="openshift-machine-config-operator/machine-config-daemon-djf67" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 14:31:05 crc kubenswrapper[4751]: I1203 14:31:05.821206 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" containerID="cri-o://6554aa9d5e7898bf5e07fe04c4800b61a41046f1b56d94f87ff1d09b45063fa3" gracePeriod=600 Dec 03 14:31:06 crc kubenswrapper[4751]: I1203 14:31:06.777541 4751 generic.go:334] "Generic (PLEG): container finished" podID="385620eb-744d-423e-b02b-1274f3075689" containerID="6554aa9d5e7898bf5e07fe04c4800b61a41046f1b56d94f87ff1d09b45063fa3" exitCode=0 Dec 03 14:31:06 crc kubenswrapper[4751]: I1203 14:31:06.777599 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerDied","Data":"6554aa9d5e7898bf5e07fe04c4800b61a41046f1b56d94f87ff1d09b45063fa3"} Dec 03 14:31:06 crc kubenswrapper[4751]: I1203 14:31:06.777646 4751 scope.go:117] "RemoveContainer" containerID="022494d9f3dab8e8955cedfdb1fecb645926d841488965605deafc394884f056" Dec 03 14:31:10 crc kubenswrapper[4751]: I1203 14:31:10.806049 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerStarted","Data":"013b499465da11b00f7b510304fcaff215703026384eae17787f3651933e4e4f"} Dec 03 14:31:12 crc kubenswrapper[4751]: I1203 14:31:12.820023 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-698cb7586c-qft9p" 
event={"ID":"6aeb43b5-b817-4d39-81de-bc6f27afb55b","Type":"ContainerStarted","Data":"ad59c7ac46a9df0e693ee37f75dd2efdc2ee654648d2de3a74edbde97d611bc5"} Dec 03 14:31:12 crc kubenswrapper[4751]: I1203 14:31:12.820751 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-698cb7586c-qft9p" Dec 03 14:31:12 crc kubenswrapper[4751]: I1203 14:31:12.849744 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-698cb7586c-qft9p" podStartSLOduration=3.367999409 podStartE2EDuration="9.849720692s" podCreationTimestamp="2025-12-03 14:31:03 +0000 UTC" firstStartedPulling="2025-12-03 14:31:04.145622879 +0000 UTC m=+1071.133978096" lastFinishedPulling="2025-12-03 14:31:10.627344162 +0000 UTC m=+1077.615699379" observedRunningTime="2025-12-03 14:31:12.84710104 +0000 UTC m=+1079.835456277" watchObservedRunningTime="2025-12-03 14:31:12.849720692 +0000 UTC m=+1079.838075929" Dec 03 14:31:23 crc kubenswrapper[4751]: I1203 14:31:23.676244 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-698cb7586c-qft9p" Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.664848 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-7422h"] Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.668694 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-7422h" Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.671309 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-d68bc" Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.713429 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-qswcx"] Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.714772 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qswcx" Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.720827 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-nxxmx" Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.723977 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-7422h"] Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.735405 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-n8td8"] Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.736808 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-n8td8" Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.741415 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-qswcx"] Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.746689 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-v2cwm" Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.758392 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-n8td8"] Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.779311 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll5m5\" (UniqueName: \"kubernetes.io/projected/b112bf8e-175b-4bc3-9840-6d134b4a1bce-kube-api-access-ll5m5\") pod \"cinder-operator-controller-manager-859b6ccc6-7422h\" (UID: \"b112bf8e-175b-4bc3-9840-6d134b4a1bce\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-7422h" Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.785159 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-6shnx"] Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.786684 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-6shnx" Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.789683 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-mbmqm" Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.799389 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-m4q9z"] Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.800448 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-m4q9z" Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.809411 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-xgwf9" Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.825298 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-m4q9z"] Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.862962 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-6shnx"] Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.881563 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkr5q\" (UniqueName: \"kubernetes.io/projected/ff012f7f-3431-472a-8b44-1fa7a47e74e1-kube-api-access-vkr5q\") pod \"heat-operator-controller-manager-5f64f6f8bb-m4q9z\" (UID: \"ff012f7f-3431-472a-8b44-1fa7a47e74e1\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-m4q9z" Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.881608 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c6qf\" (UniqueName: 
\"kubernetes.io/projected/26689286-a791-485e-b442-9e399ae7a79b-kube-api-access-5c6qf\") pod \"barbican-operator-controller-manager-7d9dfd778-qswcx\" (UID: \"26689286-a791-485e-b442-9e399ae7a79b\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qswcx" Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.881656 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mndwk\" (UniqueName: \"kubernetes.io/projected/a975003d-b7d2-4a95-8571-571bc082021d-kube-api-access-mndwk\") pod \"glance-operator-controller-manager-77987cd8cd-6shnx\" (UID: \"a975003d-b7d2-4a95-8571-571bc082021d\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-6shnx" Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.881678 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbgkj\" (UniqueName: \"kubernetes.io/projected/618af04c-a37d-4d21-bdba-345c9a63be07-kube-api-access-bbgkj\") pod \"designate-operator-controller-manager-78b4bc895b-n8td8\" (UID: \"618af04c-a37d-4d21-bdba-345c9a63be07\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-n8td8" Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.881714 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll5m5\" (UniqueName: \"kubernetes.io/projected/b112bf8e-175b-4bc3-9840-6d134b4a1bce-kube-api-access-ll5m5\") pod \"cinder-operator-controller-manager-859b6ccc6-7422h\" (UID: \"b112bf8e-175b-4bc3-9840-6d134b4a1bce\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-7422h" Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.889390 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8hhdk"] Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.890458 4751 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8hhdk" Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.893463 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-q7rjk" Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.894483 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8hhdk"] Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.913390 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-ppb75"] Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.914386 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-ppb75" Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.917926 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-h4wj7" Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.918096 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.918718 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-pxjk5"] Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.919623 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-pxjk5" Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.923079 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll5m5\" (UniqueName: \"kubernetes.io/projected/b112bf8e-175b-4bc3-9840-6d134b4a1bce-kube-api-access-ll5m5\") pod \"cinder-operator-controller-manager-859b6ccc6-7422h\" (UID: \"b112bf8e-175b-4bc3-9840-6d134b4a1bce\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-7422h" Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.926756 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-ppb75"] Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.926888 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-6nq6z" Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.938477 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-pxjk5"] Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.956272 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-wgjr8"] Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.965718 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-wgjr8"] Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.965817 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-wgjr8" Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.980710 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-7xcb6" Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.983358 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mndwk\" (UniqueName: \"kubernetes.io/projected/a975003d-b7d2-4a95-8571-571bc082021d-kube-api-access-mndwk\") pod \"glance-operator-controller-manager-77987cd8cd-6shnx\" (UID: \"a975003d-b7d2-4a95-8571-571bc082021d\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-6shnx" Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.983396 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbgkj\" (UniqueName: \"kubernetes.io/projected/618af04c-a37d-4d21-bdba-345c9a63be07-kube-api-access-bbgkj\") pod \"designate-operator-controller-manager-78b4bc895b-n8td8\" (UID: \"618af04c-a37d-4d21-bdba-345c9a63be07\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-n8td8" Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.983471 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkr5q\" (UniqueName: \"kubernetes.io/projected/ff012f7f-3431-472a-8b44-1fa7a47e74e1-kube-api-access-vkr5q\") pod \"heat-operator-controller-manager-5f64f6f8bb-m4q9z\" (UID: \"ff012f7f-3431-472a-8b44-1fa7a47e74e1\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-m4q9z" Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.983493 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c6qf\" (UniqueName: \"kubernetes.io/projected/26689286-a791-485e-b442-9e399ae7a79b-kube-api-access-5c6qf\") pod 
\"barbican-operator-controller-manager-7d9dfd778-qswcx\" (UID: \"26689286-a791-485e-b442-9e399ae7a79b\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qswcx" Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.983531 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rbm4\" (UniqueName: \"kubernetes.io/projected/b5d6b394-fe97-4e70-9916-9c6791379931-kube-api-access-2rbm4\") pod \"horizon-operator-controller-manager-68c6d99b8f-8hhdk\" (UID: \"b5d6b394-fe97-4e70-9916-9c6791379931\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8hhdk" Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.993380 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-zjnwr"] Dec 03 14:31:49 crc kubenswrapper[4751]: I1203 14:31:49.994472 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-zjnwr" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.003687 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-6pz6l" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.011399 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-7422h" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.045238 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mndwk\" (UniqueName: \"kubernetes.io/projected/a975003d-b7d2-4a95-8571-571bc082021d-kube-api-access-mndwk\") pod \"glance-operator-controller-manager-77987cd8cd-6shnx\" (UID: \"a975003d-b7d2-4a95-8571-571bc082021d\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-6shnx" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.045702 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c6qf\" (UniqueName: \"kubernetes.io/projected/26689286-a791-485e-b442-9e399ae7a79b-kube-api-access-5c6qf\") pod \"barbican-operator-controller-manager-7d9dfd778-qswcx\" (UID: \"26689286-a791-485e-b442-9e399ae7a79b\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qswcx" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.045757 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkr5q\" (UniqueName: \"kubernetes.io/projected/ff012f7f-3431-472a-8b44-1fa7a47e74e1-kube-api-access-vkr5q\") pod \"heat-operator-controller-manager-5f64f6f8bb-m4q9z\" (UID: \"ff012f7f-3431-472a-8b44-1fa7a47e74e1\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-m4q9z" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.047307 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4nvpk"] Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.048763 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4nvpk" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.052713 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-hpdpz" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.065616 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-zjnwr"] Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.067906 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbgkj\" (UniqueName: \"kubernetes.io/projected/618af04c-a37d-4d21-bdba-345c9a63be07-kube-api-access-bbgkj\") pod \"designate-operator-controller-manager-78b4bc895b-n8td8\" (UID: \"618af04c-a37d-4d21-bdba-345c9a63be07\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-n8td8" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.088382 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4nvpk"] Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.093375 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzbnn\" (UniqueName: \"kubernetes.io/projected/2f31e262-8f03-4689-bc29-5d9d8b33a2cc-kube-api-access-xzbnn\") pod \"ironic-operator-controller-manager-6c548fd776-pxjk5\" (UID: \"2f31e262-8f03-4689-bc29-5d9d8b33a2cc\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-pxjk5" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.093419 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt594\" (UniqueName: \"kubernetes.io/projected/7f29786e-1f3c-4c92-81ac-4b6110cf03a3-kube-api-access-gt594\") pod \"keystone-operator-controller-manager-7765d96ddf-wgjr8\" (UID: 
\"7f29786e-1f3c-4c92-81ac-4b6110cf03a3\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-wgjr8" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.093439 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvrdf\" (UniqueName: \"kubernetes.io/projected/a54985ea-4d23-4a65-bd1a-1c9d059ea206-kube-api-access-rvrdf\") pod \"infra-operator-controller-manager-57548d458d-ppb75\" (UID: \"a54985ea-4d23-4a65-bd1a-1c9d059ea206\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-ppb75" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.093462 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rbm4\" (UniqueName: \"kubernetes.io/projected/b5d6b394-fe97-4e70-9916-9c6791379931-kube-api-access-2rbm4\") pod \"horizon-operator-controller-manager-68c6d99b8f-8hhdk\" (UID: \"b5d6b394-fe97-4e70-9916-9c6791379931\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8hhdk" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.093524 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a54985ea-4d23-4a65-bd1a-1c9d059ea206-cert\") pod \"infra-operator-controller-manager-57548d458d-ppb75\" (UID: \"a54985ea-4d23-4a65-bd1a-1c9d059ea206\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-ppb75" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.093546 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgxzp\" (UniqueName: \"kubernetes.io/projected/8adbadf1-f21d-4a09-acf7-d44a87bee356-kube-api-access-dgxzp\") pod \"manila-operator-controller-manager-7c79b5df47-zjnwr\" (UID: \"8adbadf1-f21d-4a09-acf7-d44a87bee356\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-zjnwr" Dec 03 14:31:50 crc 
kubenswrapper[4751]: I1203 14:31:50.117140 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-6shnx" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.123443 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-sxl98"] Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.124699 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-sxl98" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.135537 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-z2vqh" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.138181 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rbm4\" (UniqueName: \"kubernetes.io/projected/b5d6b394-fe97-4e70-9916-9c6791379931-kube-api-access-2rbm4\") pod \"horizon-operator-controller-manager-68c6d99b8f-8hhdk\" (UID: \"b5d6b394-fe97-4e70-9916-9c6791379931\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8hhdk" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.140859 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-sxl98"] Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.148544 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-m4q9z" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.157390 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-7tsfx"] Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.158483 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7tsfx" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.177979 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-t26wm" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.197343 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phn2h\" (UniqueName: \"kubernetes.io/projected/b4cb50e3-a93e-49b0-ac9c-6551046dc0be-kube-api-access-phn2h\") pod \"mariadb-operator-controller-manager-56bbcc9d85-4nvpk\" (UID: \"b4cb50e3-a93e-49b0-ac9c-6551046dc0be\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4nvpk" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.197400 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a54985ea-4d23-4a65-bd1a-1c9d059ea206-cert\") pod \"infra-operator-controller-manager-57548d458d-ppb75\" (UID: \"a54985ea-4d23-4a65-bd1a-1c9d059ea206\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-ppb75" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.197427 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgxzp\" (UniqueName: \"kubernetes.io/projected/8adbadf1-f21d-4a09-acf7-d44a87bee356-kube-api-access-dgxzp\") pod \"manila-operator-controller-manager-7c79b5df47-zjnwr\" (UID: \"8adbadf1-f21d-4a09-acf7-d44a87bee356\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-zjnwr" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.197460 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzbnn\" (UniqueName: \"kubernetes.io/projected/2f31e262-8f03-4689-bc29-5d9d8b33a2cc-kube-api-access-xzbnn\") pod \"ironic-operator-controller-manager-6c548fd776-pxjk5\" (UID: 
\"2f31e262-8f03-4689-bc29-5d9d8b33a2cc\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-pxjk5" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.197478 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xhrx\" (UniqueName: \"kubernetes.io/projected/4cd34243-8404-4cf7-9185-c012700b5814-kube-api-access-4xhrx\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-sxl98\" (UID: \"4cd34243-8404-4cf7-9185-c012700b5814\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-sxl98" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.197510 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt594\" (UniqueName: \"kubernetes.io/projected/7f29786e-1f3c-4c92-81ac-4b6110cf03a3-kube-api-access-gt594\") pod \"keystone-operator-controller-manager-7765d96ddf-wgjr8\" (UID: \"7f29786e-1f3c-4c92-81ac-4b6110cf03a3\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-wgjr8" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.197534 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvrdf\" (UniqueName: \"kubernetes.io/projected/a54985ea-4d23-4a65-bd1a-1c9d059ea206-kube-api-access-rvrdf\") pod \"infra-operator-controller-manager-57548d458d-ppb75\" (UID: \"a54985ea-4d23-4a65-bd1a-1c9d059ea206\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-ppb75" Dec 03 14:31:50 crc kubenswrapper[4751]: E1203 14:31:50.197858 4751 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 14:31:50 crc kubenswrapper[4751]: E1203 14:31:50.197896 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a54985ea-4d23-4a65-bd1a-1c9d059ea206-cert podName:a54985ea-4d23-4a65-bd1a-1c9d059ea206 nodeName:}" failed. 
No retries permitted until 2025-12-03 14:31:50.697883582 +0000 UTC m=+1117.686238799 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a54985ea-4d23-4a65-bd1a-1c9d059ea206-cert") pod "infra-operator-controller-manager-57548d458d-ppb75" (UID: "a54985ea-4d23-4a65-bd1a-1c9d059ea206") : secret "infra-operator-webhook-server-cert" not found Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.226375 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8hhdk" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.240264 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-8h62v"] Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.280357 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzbnn\" (UniqueName: \"kubernetes.io/projected/2f31e262-8f03-4689-bc29-5d9d8b33a2cc-kube-api-access-xzbnn\") pod \"ironic-operator-controller-manager-6c548fd776-pxjk5\" (UID: \"2f31e262-8f03-4689-bc29-5d9d8b33a2cc\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-pxjk5" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.289557 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgxzp\" (UniqueName: \"kubernetes.io/projected/8adbadf1-f21d-4a09-acf7-d44a87bee356-kube-api-access-dgxzp\") pod \"manila-operator-controller-manager-7c79b5df47-zjnwr\" (UID: \"8adbadf1-f21d-4a09-acf7-d44a87bee356\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-zjnwr" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.291364 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-pxjk5" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.291312 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-8h62v" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.302029 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsf25\" (UniqueName: \"kubernetes.io/projected/cbafd52a-d603-4b8c-a056-9a2a749bee21-kube-api-access-hsf25\") pod \"nova-operator-controller-manager-697bc559fc-7tsfx\" (UID: \"cbafd52a-d603-4b8c-a056-9a2a749bee21\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7tsfx" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.302081 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phn2h\" (UniqueName: \"kubernetes.io/projected/b4cb50e3-a93e-49b0-ac9c-6551046dc0be-kube-api-access-phn2h\") pod \"mariadb-operator-controller-manager-56bbcc9d85-4nvpk\" (UID: \"b4cb50e3-a93e-49b0-ac9c-6551046dc0be\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4nvpk" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.302155 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xhrx\" (UniqueName: \"kubernetes.io/projected/4cd34243-8404-4cf7-9185-c012700b5814-kube-api-access-4xhrx\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-sxl98\" (UID: \"4cd34243-8404-4cf7-9185-c012700b5814\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-sxl98" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.306982 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-lp8gn" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.310460 4751 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt594\" (UniqueName: \"kubernetes.io/projected/7f29786e-1f3c-4c92-81ac-4b6110cf03a3-kube-api-access-gt594\") pod \"keystone-operator-controller-manager-7765d96ddf-wgjr8\" (UID: \"7f29786e-1f3c-4c92-81ac-4b6110cf03a3\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-wgjr8" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.319609 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvrdf\" (UniqueName: \"kubernetes.io/projected/a54985ea-4d23-4a65-bd1a-1c9d059ea206-kube-api-access-rvrdf\") pod \"infra-operator-controller-manager-57548d458d-ppb75\" (UID: \"a54985ea-4d23-4a65-bd1a-1c9d059ea206\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-ppb75" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.334997 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qswcx" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.371875 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-n8td8" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.377836 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phn2h\" (UniqueName: \"kubernetes.io/projected/b4cb50e3-a93e-49b0-ac9c-6551046dc0be-kube-api-access-phn2h\") pod \"mariadb-operator-controller-manager-56bbcc9d85-4nvpk\" (UID: \"b4cb50e3-a93e-49b0-ac9c-6551046dc0be\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4nvpk" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.395408 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xhrx\" (UniqueName: \"kubernetes.io/projected/4cd34243-8404-4cf7-9185-c012700b5814-kube-api-access-4xhrx\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-sxl98\" (UID: \"4cd34243-8404-4cf7-9185-c012700b5814\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-sxl98" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.403179 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsf25\" (UniqueName: \"kubernetes.io/projected/cbafd52a-d603-4b8c-a056-9a2a749bee21-kube-api-access-hsf25\") pod \"nova-operator-controller-manager-697bc559fc-7tsfx\" (UID: \"cbafd52a-d603-4b8c-a056-9a2a749bee21\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7tsfx" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.403957 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9597f\" (UniqueName: \"kubernetes.io/projected/5c3add92-6cee-4980-903f-692cfd4cf87c-kube-api-access-9597f\") pod \"octavia-operator-controller-manager-998648c74-8h62v\" (UID: \"5c3add92-6cee-4980-903f-692cfd4cf87c\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-8h62v" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 
14:31:50.428317 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-wgjr8" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.435454 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b"] Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.437250 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsf25\" (UniqueName: \"kubernetes.io/projected/cbafd52a-d603-4b8c-a056-9a2a749bee21-kube-api-access-hsf25\") pod \"nova-operator-controller-manager-697bc559fc-7tsfx\" (UID: \"cbafd52a-d603-4b8c-a056-9a2a749bee21\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7tsfx" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.438209 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.442122 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-zqtt2" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.443610 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-k5j8b"] Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.445661 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-k5j8b" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.446434 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.450911 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-jb8vn" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.460142 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-8h62v"] Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.469407 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-7tsfx"] Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.473144 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-k5j8b"] Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.496906 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-zjnwr" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.500697 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-gsp46"] Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.501951 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-gsp46" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.519236 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17b09c23-21ca-4060-840d-acbf71e22d55-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b\" (UID: \"17b09c23-21ca-4060-840d-acbf71e22d55\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.520974 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhjvb\" (UniqueName: \"kubernetes.io/projected/75a825ba-e08d-440f-866d-d32d2ae812f1-kube-api-access-fhjvb\") pod \"ovn-operator-controller-manager-b6456fdb6-k5j8b\" (UID: \"75a825ba-e08d-440f-866d-d32d2ae812f1\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-k5j8b" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.521082 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9597f\" (UniqueName: \"kubernetes.io/projected/5c3add92-6cee-4980-903f-692cfd4cf87c-kube-api-access-9597f\") pod \"octavia-operator-controller-manager-998648c74-8h62v\" (UID: \"5c3add92-6cee-4980-903f-692cfd4cf87c\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-8h62v" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.521316 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7rh9\" (UniqueName: \"kubernetes.io/projected/17b09c23-21ca-4060-840d-acbf71e22d55-kube-api-access-s7rh9\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b\" (UID: \"17b09c23-21ca-4060-840d-acbf71e22d55\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.536568 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-lxwt2" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.536584 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-w9fcj"] Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.537730 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-w9fcj" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.542014 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4nvpk" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.542744 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b"] Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.545289 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-h9r8s" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.560433 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-gsp46"] Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.563382 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9597f\" (UniqueName: \"kubernetes.io/projected/5c3add92-6cee-4980-903f-692cfd4cf87c-kube-api-access-9597f\") pod \"octavia-operator-controller-manager-998648c74-8h62v\" (UID: \"5c3add92-6cee-4980-903f-692cfd4cf87c\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-8h62v" Dec 03 
14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.579530 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-9598fff97-l6gxb"] Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.580749 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-9598fff97-l6gxb" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.582714 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-q8w5h" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.589767 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-w9fcj"] Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.601633 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-lgjvh"] Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.602729 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lgjvh" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.606854 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-rcn8p" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.607727 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-sxl98" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.612421 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-9598fff97-l6gxb"] Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.622155 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjv5j\" (UniqueName: \"kubernetes.io/projected/5434f233-b204-4db9-a93d-93d4342e4514-kube-api-access-kjv5j\") pod \"placement-operator-controller-manager-78f8948974-gsp46\" (UID: \"5434f233-b204-4db9-a93d-93d4342e4514\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-gsp46" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.622215 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17b09c23-21ca-4060-840d-acbf71e22d55-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b\" (UID: \"17b09c23-21ca-4060-840d-acbf71e22d55\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.622245 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhjvb\" (UniqueName: \"kubernetes.io/projected/75a825ba-e08d-440f-866d-d32d2ae812f1-kube-api-access-fhjvb\") pod \"ovn-operator-controller-manager-b6456fdb6-k5j8b\" (UID: \"75a825ba-e08d-440f-866d-d32d2ae812f1\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-k5j8b" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.622292 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p468j\" (UniqueName: \"kubernetes.io/projected/2011fe70-e44a-4b63-8064-e3234a639fb8-kube-api-access-p468j\") pod 
\"telemetry-operator-controller-manager-9598fff97-l6gxb\" (UID: \"2011fe70-e44a-4b63-8064-e3234a639fb8\") " pod="openstack-operators/telemetry-operator-controller-manager-9598fff97-l6gxb" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.622322 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7rh9\" (UniqueName: \"kubernetes.io/projected/17b09c23-21ca-4060-840d-acbf71e22d55-kube-api-access-s7rh9\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b\" (UID: \"17b09c23-21ca-4060-840d-acbf71e22d55\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.622367 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94hbt\" (UniqueName: \"kubernetes.io/projected/81e287a7-6973-4561-a67a-a8783b0cedf5-kube-api-access-94hbt\") pod \"swift-operator-controller-manager-5f8c65bbfc-w9fcj\" (UID: \"81e287a7-6973-4561-a67a-a8783b0cedf5\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-w9fcj" Dec 03 14:31:50 crc kubenswrapper[4751]: E1203 14:31:50.622523 4751 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 14:31:50 crc kubenswrapper[4751]: E1203 14:31:50.622588 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17b09c23-21ca-4060-840d-acbf71e22d55-cert podName:17b09c23-21ca-4060-840d-acbf71e22d55 nodeName:}" failed. No retries permitted until 2025-12-03 14:31:51.122567565 +0000 UTC m=+1118.110922772 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/17b09c23-21ca-4060-840d-acbf71e22d55-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b" (UID: "17b09c23-21ca-4060-840d-acbf71e22d55") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.622832 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-lgjvh"] Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.628055 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-qgvqn"] Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.633084 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-qgvqn" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.640645 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-pqsd5" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.647613 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhjvb\" (UniqueName: \"kubernetes.io/projected/75a825ba-e08d-440f-866d-d32d2ae812f1-kube-api-access-fhjvb\") pod \"ovn-operator-controller-manager-b6456fdb6-k5j8b\" (UID: \"75a825ba-e08d-440f-866d-d32d2ae812f1\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-k5j8b" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.657721 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7rh9\" (UniqueName: \"kubernetes.io/projected/17b09c23-21ca-4060-840d-acbf71e22d55-kube-api-access-s7rh9\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b\" (UID: \"17b09c23-21ca-4060-840d-acbf71e22d55\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.662169 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7tsfx" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.670488 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-qgvqn"] Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.702168 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-8h62v" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.713468 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7f9b9ccb84-v5t4x"] Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.714403 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7f9b9ccb84-v5t4x" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.715819 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-jhc4l" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.719986 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.720785 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.723039 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7f9b9ccb84-v5t4x"] Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.724547 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p468j\" (UniqueName: \"kubernetes.io/projected/2011fe70-e44a-4b63-8064-e3234a639fb8-kube-api-access-p468j\") pod \"telemetry-operator-controller-manager-9598fff97-l6gxb\" (UID: \"2011fe70-e44a-4b63-8064-e3234a639fb8\") " pod="openstack-operators/telemetry-operator-controller-manager-9598fff97-l6gxb" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.724911 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94hbt\" (UniqueName: \"kubernetes.io/projected/81e287a7-6973-4561-a67a-a8783b0cedf5-kube-api-access-94hbt\") pod \"swift-operator-controller-manager-5f8c65bbfc-w9fcj\" (UID: \"81e287a7-6973-4561-a67a-a8783b0cedf5\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-w9fcj" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.725095 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gvwd\" (UniqueName: 
\"kubernetes.io/projected/adff5e75-192d-4a27-a477-aa74dab8dd95-kube-api-access-5gvwd\") pod \"test-operator-controller-manager-5854674fcc-lgjvh\" (UID: \"adff5e75-192d-4a27-a477-aa74dab8dd95\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-lgjvh" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.725149 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlmkv\" (UniqueName: \"kubernetes.io/projected/5950fcf6-2983-4341-ba48-12c27801a57e-kube-api-access-dlmkv\") pod \"watcher-operator-controller-manager-769dc69bc-qgvqn\" (UID: \"5950fcf6-2983-4341-ba48-12c27801a57e\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-qgvqn" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.725242 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjv5j\" (UniqueName: \"kubernetes.io/projected/5434f233-b204-4db9-a93d-93d4342e4514-kube-api-access-kjv5j\") pod \"placement-operator-controller-manager-78f8948974-gsp46\" (UID: \"5434f233-b204-4db9-a93d-93d4342e4514\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-gsp46" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.725270 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a54985ea-4d23-4a65-bd1a-1c9d059ea206-cert\") pod \"infra-operator-controller-manager-57548d458d-ppb75\" (UID: \"a54985ea-4d23-4a65-bd1a-1c9d059ea206\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-ppb75" Dec 03 14:31:50 crc kubenswrapper[4751]: E1203 14:31:50.725655 4751 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 14:31:50 crc kubenswrapper[4751]: E1203 14:31:50.725803 4751 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/a54985ea-4d23-4a65-bd1a-1c9d059ea206-cert podName:a54985ea-4d23-4a65-bd1a-1c9d059ea206 nodeName:}" failed. No retries permitted until 2025-12-03 14:31:51.72578428 +0000 UTC m=+1118.714139497 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a54985ea-4d23-4a65-bd1a-1c9d059ea206-cert") pod "infra-operator-controller-manager-57548d458d-ppb75" (UID: "a54985ea-4d23-4a65-bd1a-1c9d059ea206") : secret "infra-operator-webhook-server-cert" not found Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.744649 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94hbt\" (UniqueName: \"kubernetes.io/projected/81e287a7-6973-4561-a67a-a8783b0cedf5-kube-api-access-94hbt\") pod \"swift-operator-controller-manager-5f8c65bbfc-w9fcj\" (UID: \"81e287a7-6973-4561-a67a-a8783b0cedf5\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-w9fcj" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.748879 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjv5j\" (UniqueName: \"kubernetes.io/projected/5434f233-b204-4db9-a93d-93d4342e4514-kube-api-access-kjv5j\") pod \"placement-operator-controller-manager-78f8948974-gsp46\" (UID: \"5434f233-b204-4db9-a93d-93d4342e4514\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-gsp46" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.751395 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4thwz"] Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.752353 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4thwz" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.754713 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p468j\" (UniqueName: \"kubernetes.io/projected/2011fe70-e44a-4b63-8064-e3234a639fb8-kube-api-access-p468j\") pod \"telemetry-operator-controller-manager-9598fff97-l6gxb\" (UID: \"2011fe70-e44a-4b63-8064-e3234a639fb8\") " pod="openstack-operators/telemetry-operator-controller-manager-9598fff97-l6gxb" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.755049 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-9n7rv" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.778628 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4thwz"] Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.823525 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-k5j8b" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.829273 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn5sn\" (UniqueName: \"kubernetes.io/projected/7a4eb3e2-25fa-43e4-9e49-135b5c087014-kube-api-access-zn5sn\") pod \"openstack-operator-controller-manager-7f9b9ccb84-v5t4x\" (UID: \"7a4eb3e2-25fa-43e4-9e49-135b5c087014\") " pod="openstack-operators/openstack-operator-controller-manager-7f9b9ccb84-v5t4x" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.829590 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gvwd\" (UniqueName: \"kubernetes.io/projected/adff5e75-192d-4a27-a477-aa74dab8dd95-kube-api-access-5gvwd\") pod \"test-operator-controller-manager-5854674fcc-lgjvh\" (UID: \"adff5e75-192d-4a27-a477-aa74dab8dd95\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-lgjvh" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.829704 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlmkv\" (UniqueName: \"kubernetes.io/projected/5950fcf6-2983-4341-ba48-12c27801a57e-kube-api-access-dlmkv\") pod \"watcher-operator-controller-manager-769dc69bc-qgvqn\" (UID: \"5950fcf6-2983-4341-ba48-12c27801a57e\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-qgvqn" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.829816 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7qlr\" (UniqueName: \"kubernetes.io/projected/ac5fb8ca-3372-4c92-a4d2-9ff4b543f94d-kube-api-access-k7qlr\") pod \"rabbitmq-cluster-operator-manager-668c99d594-4thwz\" (UID: \"ac5fb8ca-3372-4c92-a4d2-9ff4b543f94d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4thwz" Dec 03 14:31:50 crc 
kubenswrapper[4751]: I1203 14:31:50.829950 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-metrics-certs\") pod \"openstack-operator-controller-manager-7f9b9ccb84-v5t4x\" (UID: \"7a4eb3e2-25fa-43e4-9e49-135b5c087014\") " pod="openstack-operators/openstack-operator-controller-manager-7f9b9ccb84-v5t4x" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.830071 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-webhook-certs\") pod \"openstack-operator-controller-manager-7f9b9ccb84-v5t4x\" (UID: \"7a4eb3e2-25fa-43e4-9e49-135b5c087014\") " pod="openstack-operators/openstack-operator-controller-manager-7f9b9ccb84-v5t4x" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.850424 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gvwd\" (UniqueName: \"kubernetes.io/projected/adff5e75-192d-4a27-a477-aa74dab8dd95-kube-api-access-5gvwd\") pod \"test-operator-controller-manager-5854674fcc-lgjvh\" (UID: \"adff5e75-192d-4a27-a477-aa74dab8dd95\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-lgjvh" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.857238 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlmkv\" (UniqueName: \"kubernetes.io/projected/5950fcf6-2983-4341-ba48-12c27801a57e-kube-api-access-dlmkv\") pod \"watcher-operator-controller-manager-769dc69bc-qgvqn\" (UID: \"5950fcf6-2983-4341-ba48-12c27801a57e\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-qgvqn" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.875240 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-7422h"] Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.890898 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-gsp46" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.908675 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-w9fcj" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.909002 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-m4q9z"] Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.946817 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-9598fff97-l6gxb" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.948261 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn5sn\" (UniqueName: \"kubernetes.io/projected/7a4eb3e2-25fa-43e4-9e49-135b5c087014-kube-api-access-zn5sn\") pod \"openstack-operator-controller-manager-7f9b9ccb84-v5t4x\" (UID: \"7a4eb3e2-25fa-43e4-9e49-135b5c087014\") " pod="openstack-operators/openstack-operator-controller-manager-7f9b9ccb84-v5t4x" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.948310 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7qlr\" (UniqueName: \"kubernetes.io/projected/ac5fb8ca-3372-4c92-a4d2-9ff4b543f94d-kube-api-access-k7qlr\") pod \"rabbitmq-cluster-operator-manager-668c99d594-4thwz\" (UID: \"ac5fb8ca-3372-4c92-a4d2-9ff4b543f94d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4thwz" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.948374 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-metrics-certs\") pod \"openstack-operator-controller-manager-7f9b9ccb84-v5t4x\" (UID: \"7a4eb3e2-25fa-43e4-9e49-135b5c087014\") " pod="openstack-operators/openstack-operator-controller-manager-7f9b9ccb84-v5t4x" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.948412 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-webhook-certs\") pod \"openstack-operator-controller-manager-7f9b9ccb84-v5t4x\" (UID: \"7a4eb3e2-25fa-43e4-9e49-135b5c087014\") " pod="openstack-operators/openstack-operator-controller-manager-7f9b9ccb84-v5t4x" Dec 03 14:31:50 crc kubenswrapper[4751]: E1203 14:31:50.948588 4751 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 14:31:50 crc kubenswrapper[4751]: E1203 14:31:50.948634 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-webhook-certs podName:7a4eb3e2-25fa-43e4-9e49-135b5c087014 nodeName:}" failed. No retries permitted until 2025-12-03 14:31:51.4486198 +0000 UTC m=+1118.436975017 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-webhook-certs") pod "openstack-operator-controller-manager-7f9b9ccb84-v5t4x" (UID: "7a4eb3e2-25fa-43e4-9e49-135b5c087014") : secret "webhook-server-cert" not found Dec 03 14:31:50 crc kubenswrapper[4751]: E1203 14:31:50.948898 4751 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 14:31:50 crc kubenswrapper[4751]: E1203 14:31:50.948932 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-metrics-certs podName:7a4eb3e2-25fa-43e4-9e49-135b5c087014 nodeName:}" failed. No retries permitted until 2025-12-03 14:31:51.448921939 +0000 UTC m=+1118.437277156 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-metrics-certs") pod "openstack-operator-controller-manager-7f9b9ccb84-v5t4x" (UID: "7a4eb3e2-25fa-43e4-9e49-135b5c087014") : secret "metrics-server-cert" not found Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.969266 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7qlr\" (UniqueName: \"kubernetes.io/projected/ac5fb8ca-3372-4c92-a4d2-9ff4b543f94d-kube-api-access-k7qlr\") pod \"rabbitmq-cluster-operator-manager-668c99d594-4thwz\" (UID: \"ac5fb8ca-3372-4c92-a4d2-9ff4b543f94d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4thwz" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.974007 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn5sn\" (UniqueName: \"kubernetes.io/projected/7a4eb3e2-25fa-43e4-9e49-135b5c087014-kube-api-access-zn5sn\") pod \"openstack-operator-controller-manager-7f9b9ccb84-v5t4x\" (UID: \"7a4eb3e2-25fa-43e4-9e49-135b5c087014\") " 
pod="openstack-operators/openstack-operator-controller-manager-7f9b9ccb84-v5t4x" Dec 03 14:31:50 crc kubenswrapper[4751]: I1203 14:31:50.979247 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lgjvh" Dec 03 14:31:51 crc kubenswrapper[4751]: I1203 14:31:51.018318 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-qgvqn" Dec 03 14:31:51 crc kubenswrapper[4751]: I1203 14:31:51.083304 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-6shnx"] Dec 03 14:31:51 crc kubenswrapper[4751]: I1203 14:31:51.126537 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4thwz" Dec 03 14:31:51 crc kubenswrapper[4751]: I1203 14:31:51.139905 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-m4q9z" event={"ID":"ff012f7f-3431-472a-8b44-1fa7a47e74e1","Type":"ContainerStarted","Data":"c672f4d92200a3cfb0e1c7695076a910fa6e1e8ba065e35fa708dad1ca9a0181"} Dec 03 14:31:51 crc kubenswrapper[4751]: I1203 14:31:51.149403 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-6shnx" event={"ID":"a975003d-b7d2-4a95-8571-571bc082021d","Type":"ContainerStarted","Data":"3412486aa5098cc5273c5d1edfe3f4bdd4e2738701c8919185bc7bccaec96aa2"} Dec 03 14:31:51 crc kubenswrapper[4751]: I1203 14:31:51.152120 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17b09c23-21ca-4060-840d-acbf71e22d55-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b\" (UID: \"17b09c23-21ca-4060-840d-acbf71e22d55\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b" Dec 03 14:31:51 crc kubenswrapper[4751]: E1203 14:31:51.152287 4751 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 14:31:51 crc kubenswrapper[4751]: E1203 14:31:51.152345 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17b09c23-21ca-4060-840d-acbf71e22d55-cert podName:17b09c23-21ca-4060-840d-acbf71e22d55 nodeName:}" failed. No retries permitted until 2025-12-03 14:31:52.152316723 +0000 UTC m=+1119.140671940 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/17b09c23-21ca-4060-840d-acbf71e22d55-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b" (UID: "17b09c23-21ca-4060-840d-acbf71e22d55") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 14:31:51 crc kubenswrapper[4751]: I1203 14:31:51.155992 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-7422h" event={"ID":"b112bf8e-175b-4bc3-9840-6d134b4a1bce","Type":"ContainerStarted","Data":"4caadc546f326388a043a92dc55a3092a3e1aa0c56d75b5eee8286ce06cf5e9a"} Dec 03 14:31:51 crc kubenswrapper[4751]: I1203 14:31:51.254077 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-qswcx"] Dec 03 14:31:51 crc kubenswrapper[4751]: I1203 14:31:51.261727 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8hhdk"] Dec 03 14:31:51 crc kubenswrapper[4751]: I1203 14:31:51.269297 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-pxjk5"] Dec 03 14:31:51 crc kubenswrapper[4751]: I1203 
14:31:51.288614 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-n8td8"] Dec 03 14:31:51 crc kubenswrapper[4751]: W1203 14:31:51.349194 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod618af04c_a37d_4d21_bdba_345c9a63be07.slice/crio-eb99e5e95b4b3bc861f1144b2635769bb15c71a10933c5ef91076c278efcf080 WatchSource:0}: Error finding container eb99e5e95b4b3bc861f1144b2635769bb15c71a10933c5ef91076c278efcf080: Status 404 returned error can't find the container with id eb99e5e95b4b3bc861f1144b2635769bb15c71a10933c5ef91076c278efcf080 Dec 03 14:31:51 crc kubenswrapper[4751]: I1203 14:31:51.402335 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-wgjr8"] Dec 03 14:31:51 crc kubenswrapper[4751]: W1203 14:31:51.407074 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4cb50e3_a93e_49b0_ac9c_6551046dc0be.slice/crio-0e876df02549b530c5ff9068e2cca0e06226a840c14d780f8c4b5aaa097bfce2 WatchSource:0}: Error finding container 0e876df02549b530c5ff9068e2cca0e06226a840c14d780f8c4b5aaa097bfce2: Status 404 returned error can't find the container with id 0e876df02549b530c5ff9068e2cca0e06226a840c14d780f8c4b5aaa097bfce2 Dec 03 14:31:51 crc kubenswrapper[4751]: W1203 14:31:51.407921 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f29786e_1f3c_4c92_81ac_4b6110cf03a3.slice/crio-52dae68c292cc8a2f988a23aa26aeede2f37a9ba9fda242bcefe1a9a7a6c15f9 WatchSource:0}: Error finding container 52dae68c292cc8a2f988a23aa26aeede2f37a9ba9fda242bcefe1a9a7a6c15f9: Status 404 returned error can't find the container with id 52dae68c292cc8a2f988a23aa26aeede2f37a9ba9fda242bcefe1a9a7a6c15f9 Dec 03 14:31:51 crc 
kubenswrapper[4751]: W1203 14:31:51.414802 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8adbadf1_f21d_4a09_acf7_d44a87bee356.slice/crio-8ad442cdd920a2bff47a5e56ad7b77ec1ed6d6d8826b00437366b89d596bd481 WatchSource:0}: Error finding container 8ad442cdd920a2bff47a5e56ad7b77ec1ed6d6d8826b00437366b89d596bd481: Status 404 returned error can't find the container with id 8ad442cdd920a2bff47a5e56ad7b77ec1ed6d6d8826b00437366b89d596bd481 Dec 03 14:31:51 crc kubenswrapper[4751]: I1203 14:31:51.419235 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4nvpk"] Dec 03 14:31:51 crc kubenswrapper[4751]: I1203 14:31:51.426767 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-zjnwr"] Dec 03 14:31:51 crc kubenswrapper[4751]: I1203 14:31:51.455409 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-metrics-certs\") pod \"openstack-operator-controller-manager-7f9b9ccb84-v5t4x\" (UID: \"7a4eb3e2-25fa-43e4-9e49-135b5c087014\") " pod="openstack-operators/openstack-operator-controller-manager-7f9b9ccb84-v5t4x" Dec 03 14:31:51 crc kubenswrapper[4751]: I1203 14:31:51.455468 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-webhook-certs\") pod \"openstack-operator-controller-manager-7f9b9ccb84-v5t4x\" (UID: \"7a4eb3e2-25fa-43e4-9e49-135b5c087014\") " pod="openstack-operators/openstack-operator-controller-manager-7f9b9ccb84-v5t4x" Dec 03 14:31:51 crc kubenswrapper[4751]: E1203 14:31:51.455607 4751 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 
14:31:51 crc kubenswrapper[4751]: E1203 14:31:51.455654 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-webhook-certs podName:7a4eb3e2-25fa-43e4-9e49-135b5c087014 nodeName:}" failed. No retries permitted until 2025-12-03 14:31:52.455640352 +0000 UTC m=+1119.443995569 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-webhook-certs") pod "openstack-operator-controller-manager-7f9b9ccb84-v5t4x" (UID: "7a4eb3e2-25fa-43e4-9e49-135b5c087014") : secret "webhook-server-cert" not found Dec 03 14:31:51 crc kubenswrapper[4751]: E1203 14:31:51.456027 4751 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 14:31:51 crc kubenswrapper[4751]: E1203 14:31:51.456079 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-metrics-certs podName:7a4eb3e2-25fa-43e4-9e49-135b5c087014 nodeName:}" failed. No retries permitted until 2025-12-03 14:31:52.456049513 +0000 UTC m=+1119.444404720 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-metrics-certs") pod "openstack-operator-controller-manager-7f9b9ccb84-v5t4x" (UID: "7a4eb3e2-25fa-43e4-9e49-135b5c087014") : secret "metrics-server-cert" not found Dec 03 14:31:51 crc kubenswrapper[4751]: I1203 14:31:51.583464 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-8h62v"] Dec 03 14:31:51 crc kubenswrapper[4751]: I1203 14:31:51.595910 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-sxl98"] Dec 03 14:31:51 crc kubenswrapper[4751]: I1203 14:31:51.629362 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-7tsfx"] Dec 03 14:31:51 crc kubenswrapper[4751]: I1203 14:31:51.732063 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-9598fff97-l6gxb"] Dec 03 14:31:51 crc kubenswrapper[4751]: I1203 14:31:51.740206 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-qgvqn"] Dec 03 14:31:51 crc kubenswrapper[4751]: I1203 14:31:51.747060 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-gsp46"] Dec 03 14:31:51 crc kubenswrapper[4751]: I1203 14:31:51.753419 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-k5j8b"] Dec 03 14:31:51 crc kubenswrapper[4751]: I1203 14:31:51.756674 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-w9fcj"] Dec 03 14:31:51 crc kubenswrapper[4751]: I1203 14:31:51.759528 4751 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a54985ea-4d23-4a65-bd1a-1c9d059ea206-cert\") pod \"infra-operator-controller-manager-57548d458d-ppb75\" (UID: \"a54985ea-4d23-4a65-bd1a-1c9d059ea206\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-ppb75" Dec 03 14:31:51 crc kubenswrapper[4751]: E1203 14:31:51.759727 4751 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 14:31:51 crc kubenswrapper[4751]: E1203 14:31:51.759769 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a54985ea-4d23-4a65-bd1a-1c9d059ea206-cert podName:a54985ea-4d23-4a65-bd1a-1c9d059ea206 nodeName:}" failed. No retries permitted until 2025-12-03 14:31:53.759756702 +0000 UTC m=+1120.748111919 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a54985ea-4d23-4a65-bd1a-1c9d059ea206-cert") pod "infra-operator-controller-manager-57548d458d-ppb75" (UID: "a54985ea-4d23-4a65-bd1a-1c9d059ea206") : secret "infra-operator-webhook-server-cert" not found Dec 03 14:31:51 crc kubenswrapper[4751]: W1203 14:31:51.762457 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2011fe70_e44a_4b63_8064_e3234a639fb8.slice/crio-1aa929b52c198a9e7b381afb995d3ab2e521385017b27163edf674211920f460 WatchSource:0}: Error finding container 1aa929b52c198a9e7b381afb995d3ab2e521385017b27163edf674211920f460: Status 404 returned error can't find the container with id 1aa929b52c198a9e7b381afb995d3ab2e521385017b27163edf674211920f460 Dec 03 14:31:51 crc kubenswrapper[4751]: W1203 14:31:51.764842 4751 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5950fcf6_2983_4341_ba48_12c27801a57e.slice/crio-e7fee51da1e8406c0b6947e05fc309b9c60a5e5d17647851e98c71d062cb92a0 WatchSource:0}: Error finding container e7fee51da1e8406c0b6947e05fc309b9c60a5e5d17647851e98c71d062cb92a0: Status 404 returned error can't find the container with id e7fee51da1e8406c0b6947e05fc309b9c60a5e5d17647851e98c71d062cb92a0 Dec 03 14:31:51 crc kubenswrapper[4751]: E1203 14:31:51.766580 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.47:5001/openstack-k8s-operators/telemetry-operator:e39d5f8652fd394c3fcc2c0989e45436c83851fe,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p468j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-9598fff97-l6gxb_openstack-operators(2011fe70-e44a-4b63-8064-e3234a639fb8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 14:31:51 crc kubenswrapper[4751]: E1203 14:31:51.767486 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dlmkv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-qgvqn_openstack-operators(5950fcf6-2983-4341-ba48-12c27801a57e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 14:31:51 crc kubenswrapper[4751]: E1203 14:31:51.768483 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p468j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-9598fff97-l6gxb_openstack-operators(2011fe70-e44a-4b63-8064-e3234a639fb8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 14:31:51 crc kubenswrapper[4751]: E1203 14:31:51.769534 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-9598fff97-l6gxb" podUID="2011fe70-e44a-4b63-8064-e3234a639fb8" Dec 03 14:31:51 crc kubenswrapper[4751]: E1203 14:31:51.769554 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dlmkv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-qgvqn_openstack-operators(5950fcf6-2983-4341-ba48-12c27801a57e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 14:31:51 crc kubenswrapper[4751]: E1203 14:31:51.771058 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-qgvqn" 
podUID="5950fcf6-2983-4341-ba48-12c27801a57e" Dec 03 14:31:51 crc kubenswrapper[4751]: W1203 14:31:51.771059 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81e287a7_6973_4561_a67a_a8783b0cedf5.slice/crio-9fd67c061c8a4f4dfbf6f7a0858938148032469c1c24f4164b337e529c15f816 WatchSource:0}: Error finding container 9fd67c061c8a4f4dfbf6f7a0858938148032469c1c24f4164b337e529c15f816: Status 404 returned error can't find the container with id 9fd67c061c8a4f4dfbf6f7a0858938148032469c1c24f4164b337e529c15f816 Dec 03 14:31:51 crc kubenswrapper[4751]: W1203 14:31:51.771507 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75a825ba_e08d_440f_866d_d32d2ae812f1.slice/crio-f87199fd1ace8b822545a1eb3312c696a5e60ef8cfe5b2c53b4a7ed183ed4708 WatchSource:0}: Error finding container f87199fd1ace8b822545a1eb3312c696a5e60ef8cfe5b2c53b4a7ed183ed4708: Status 404 returned error can't find the container with id f87199fd1ace8b822545a1eb3312c696a5e60ef8cfe5b2c53b4a7ed183ed4708 Dec 03 14:31:51 crc kubenswrapper[4751]: E1203 14:31:51.775988 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fhjvb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-k5j8b_openstack-operators(75a825ba-e08d-440f-866d-d32d2ae812f1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 14:31:51 crc kubenswrapper[4751]: E1203 14:31:51.776122 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect 
--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-94hbt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-w9fcj_openstack-operators(81e287a7-6973-4561-a67a-a8783b0cedf5): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 14:31:51 crc kubenswrapper[4751]: E1203 14:31:51.779042 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-94hbt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-w9fcj_openstack-operators(81e287a7-6973-4561-a67a-a8783b0cedf5): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 14:31:51 crc kubenswrapper[4751]: E1203 14:31:51.779058 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fhjvb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-k5j8b_openstack-operators(75a825ba-e08d-440f-866d-d32d2ae812f1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 14:31:51 crc kubenswrapper[4751]: E1203 14:31:51.780221 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-w9fcj" podUID="81e287a7-6973-4561-a67a-a8783b0cedf5" Dec 03 14:31:51 crc kubenswrapper[4751]: E1203 14:31:51.780231 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-k5j8b" podUID="75a825ba-e08d-440f-866d-d32d2ae812f1" Dec 03 14:31:51 crc kubenswrapper[4751]: I1203 14:31:51.873736 4751 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-lgjvh"] Dec 03 14:31:51 crc kubenswrapper[4751]: W1203 14:31:51.881141 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadff5e75_192d_4a27_a477_aa74dab8dd95.slice/crio-42d71148deb01662ceee46e836d6aeb3bb24c5d5459d5ba354cb334d943533e2 WatchSource:0}: Error finding container 42d71148deb01662ceee46e836d6aeb3bb24c5d5459d5ba354cb334d943533e2: Status 404 returned error can't find the container with id 42d71148deb01662ceee46e836d6aeb3bb24c5d5459d5ba354cb334d943533e2 Dec 03 14:31:51 crc kubenswrapper[4751]: E1203 14:31:51.883994 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5gvwd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-lgjvh_openstack-operators(adff5e75-192d-4a27-a477-aa74dab8dd95): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 14:31:51 crc kubenswrapper[4751]: I1203 14:31:51.884075 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4thwz"] Dec 03 14:31:51 crc kubenswrapper[4751]: E1203 14:31:51.886078 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5gvwd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-lgjvh_openstack-operators(adff5e75-192d-4a27-a477-aa74dab8dd95): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 14:31:51 crc kubenswrapper[4751]: E1203 14:31:51.887859 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lgjvh" podUID="adff5e75-192d-4a27-a477-aa74dab8dd95" Dec 03 14:31:51 crc kubenswrapper[4751]: W1203 14:31:51.894764 4751 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac5fb8ca_3372_4c92_a4d2_9ff4b543f94d.slice/crio-74d280fef9aef5ef71d309f4bc065d17eb151cdcfbf9a6c473f8a339fba3b370 WatchSource:0}: Error finding container 74d280fef9aef5ef71d309f4bc065d17eb151cdcfbf9a6c473f8a339fba3b370: Status 404 returned error can't find the container with id 74d280fef9aef5ef71d309f4bc065d17eb151cdcfbf9a6c473f8a339fba3b370 Dec 03 14:31:52 crc kubenswrapper[4751]: I1203 14:31:52.163915 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17b09c23-21ca-4060-840d-acbf71e22d55-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b\" (UID: \"17b09c23-21ca-4060-840d-acbf71e22d55\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b" Dec 03 14:31:52 crc kubenswrapper[4751]: E1203 14:31:52.164145 4751 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 14:31:52 crc kubenswrapper[4751]: E1203 14:31:52.164209 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17b09c23-21ca-4060-840d-acbf71e22d55-cert podName:17b09c23-21ca-4060-840d-acbf71e22d55 nodeName:}" failed. No retries permitted until 2025-12-03 14:31:54.164195226 +0000 UTC m=+1121.152550443 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/17b09c23-21ca-4060-840d-acbf71e22d55-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b" (UID: "17b09c23-21ca-4060-840d-acbf71e22d55") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 14:31:52 crc kubenswrapper[4751]: I1203 14:31:52.167793 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lgjvh" event={"ID":"adff5e75-192d-4a27-a477-aa74dab8dd95","Type":"ContainerStarted","Data":"42d71148deb01662ceee46e836d6aeb3bb24c5d5459d5ba354cb334d943533e2"} Dec 03 14:31:52 crc kubenswrapper[4751]: E1203 14:31:52.170856 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lgjvh" podUID="adff5e75-192d-4a27-a477-aa74dab8dd95" Dec 03 14:31:52 crc kubenswrapper[4751]: I1203 14:31:52.178770 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-qgvqn" event={"ID":"5950fcf6-2983-4341-ba48-12c27801a57e","Type":"ContainerStarted","Data":"e7fee51da1e8406c0b6947e05fc309b9c60a5e5d17647851e98c71d062cb92a0"} Dec 03 14:31:52 crc kubenswrapper[4751]: E1203 14:31:52.182183 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", 
failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-qgvqn" podUID="5950fcf6-2983-4341-ba48-12c27801a57e" Dec 03 14:31:52 crc kubenswrapper[4751]: I1203 14:31:52.190190 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4nvpk" event={"ID":"b4cb50e3-a93e-49b0-ac9c-6551046dc0be","Type":"ContainerStarted","Data":"0e876df02549b530c5ff9068e2cca0e06226a840c14d780f8c4b5aaa097bfce2"} Dec 03 14:31:52 crc kubenswrapper[4751]: I1203 14:31:52.193382 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-wgjr8" event={"ID":"7f29786e-1f3c-4c92-81ac-4b6110cf03a3","Type":"ContainerStarted","Data":"52dae68c292cc8a2f988a23aa26aeede2f37a9ba9fda242bcefe1a9a7a6c15f9"} Dec 03 14:31:52 crc kubenswrapper[4751]: I1203 14:31:52.194802 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8hhdk" event={"ID":"b5d6b394-fe97-4e70-9916-9c6791379931","Type":"ContainerStarted","Data":"9afdf4f54db69454075538f9f699e833aae5fe0b621c84e51669f33dea440d21"} Dec 03 14:31:52 crc kubenswrapper[4751]: I1203 14:31:52.199010 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-k5j8b" event={"ID":"75a825ba-e08d-440f-866d-d32d2ae812f1","Type":"ContainerStarted","Data":"f87199fd1ace8b822545a1eb3312c696a5e60ef8cfe5b2c53b4a7ed183ed4708"} Dec 03 14:31:52 crc kubenswrapper[4751]: E1203 14:31:52.205707 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-k5j8b" podUID="75a825ba-e08d-440f-866d-d32d2ae812f1" Dec 03 14:31:52 crc kubenswrapper[4751]: I1203 14:31:52.208349 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-gsp46" event={"ID":"5434f233-b204-4db9-a93d-93d4342e4514","Type":"ContainerStarted","Data":"97789c8d1b488dd0b1c48dbff5d2d2a3450f524b87b9fb210ea6025b23924fa3"} Dec 03 14:31:52 crc kubenswrapper[4751]: I1203 14:31:52.211026 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-sxl98" event={"ID":"4cd34243-8404-4cf7-9185-c012700b5814","Type":"ContainerStarted","Data":"7e83b27ec79c9d20aca3e75f6238e61215e738e53c1d42465eb151fa8476cb1a"} Dec 03 14:31:52 crc kubenswrapper[4751]: I1203 14:31:52.229524 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-pxjk5" event={"ID":"2f31e262-8f03-4689-bc29-5d9d8b33a2cc","Type":"ContainerStarted","Data":"ac58f86bc16960f716f1af42d53bd92ae32d84f87963ac4100a3121b44cfeb7d"} Dec 03 14:31:52 crc kubenswrapper[4751]: I1203 14:31:52.234342 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-8h62v" event={"ID":"5c3add92-6cee-4980-903f-692cfd4cf87c","Type":"ContainerStarted","Data":"87e3b6bb7bdaa9217f2d8073108bb950cbabd0089548f0bbad9756a0cede062c"} Dec 03 14:31:52 crc kubenswrapper[4751]: I1203 14:31:52.244248 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-9598fff97-l6gxb" event={"ID":"2011fe70-e44a-4b63-8064-e3234a639fb8","Type":"ContainerStarted","Data":"1aa929b52c198a9e7b381afb995d3ab2e521385017b27163edf674211920f460"} Dec 03 14:31:52 crc kubenswrapper[4751]: I1203 14:31:52.246570 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-w9fcj" event={"ID":"81e287a7-6973-4561-a67a-a8783b0cedf5","Type":"ContainerStarted","Data":"9fd67c061c8a4f4dfbf6f7a0858938148032469c1c24f4164b337e529c15f816"} Dec 03 14:31:52 crc kubenswrapper[4751]: E1203 14:31:52.248569 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.47:5001/openstack-k8s-operators/telemetry-operator:e39d5f8652fd394c3fcc2c0989e45436c83851fe\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-9598fff97-l6gxb" podUID="2011fe70-e44a-4b63-8064-e3234a639fb8" Dec 03 14:31:52 crc kubenswrapper[4751]: I1203 14:31:52.249467 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4thwz" event={"ID":"ac5fb8ca-3372-4c92-a4d2-9ff4b543f94d","Type":"ContainerStarted","Data":"74d280fef9aef5ef71d309f4bc065d17eb151cdcfbf9a6c473f8a339fba3b370"} Dec 03 14:31:52 crc kubenswrapper[4751]: I1203 14:31:52.254846 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7tsfx" event={"ID":"cbafd52a-d603-4b8c-a056-9a2a749bee21","Type":"ContainerStarted","Data":"ab42205d07c27dd03b460bddd0e3c1f330f06fdc8738974357ed590f75f79d22"} Dec 03 14:31:52 crc kubenswrapper[4751]: I1203 14:31:52.257679 4751 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-zjnwr" event={"ID":"8adbadf1-f21d-4a09-acf7-d44a87bee356","Type":"ContainerStarted","Data":"8ad442cdd920a2bff47a5e56ad7b77ec1ed6d6d8826b00437366b89d596bd481"} Dec 03 14:31:52 crc kubenswrapper[4751]: I1203 14:31:52.273950 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qswcx" event={"ID":"26689286-a791-485e-b442-9e399ae7a79b","Type":"ContainerStarted","Data":"9fceeb10d43c2d2862d884d641697454c67e2844ef8518371ce9a8b92c332e0b"} Dec 03 14:31:52 crc kubenswrapper[4751]: E1203 14:31:52.284543 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-w9fcj" podUID="81e287a7-6973-4561-a67a-a8783b0cedf5" Dec 03 14:31:52 crc kubenswrapper[4751]: I1203 14:31:52.285899 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-n8td8" event={"ID":"618af04c-a37d-4d21-bdba-345c9a63be07","Type":"ContainerStarted","Data":"eb99e5e95b4b3bc861f1144b2635769bb15c71a10933c5ef91076c278efcf080"} Dec 03 14:31:52 crc kubenswrapper[4751]: I1203 14:31:52.468548 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-metrics-certs\") pod \"openstack-operator-controller-manager-7f9b9ccb84-v5t4x\" (UID: \"7a4eb3e2-25fa-43e4-9e49-135b5c087014\") " 
pod="openstack-operators/openstack-operator-controller-manager-7f9b9ccb84-v5t4x" Dec 03 14:31:52 crc kubenswrapper[4751]: I1203 14:31:52.468630 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-webhook-certs\") pod \"openstack-operator-controller-manager-7f9b9ccb84-v5t4x\" (UID: \"7a4eb3e2-25fa-43e4-9e49-135b5c087014\") " pod="openstack-operators/openstack-operator-controller-manager-7f9b9ccb84-v5t4x" Dec 03 14:31:52 crc kubenswrapper[4751]: E1203 14:31:52.468799 4751 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 14:31:52 crc kubenswrapper[4751]: E1203 14:31:52.468802 4751 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 14:31:52 crc kubenswrapper[4751]: E1203 14:31:52.468869 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-webhook-certs podName:7a4eb3e2-25fa-43e4-9e49-135b5c087014 nodeName:}" failed. No retries permitted until 2025-12-03 14:31:54.468850482 +0000 UTC m=+1121.457205699 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-webhook-certs") pod "openstack-operator-controller-manager-7f9b9ccb84-v5t4x" (UID: "7a4eb3e2-25fa-43e4-9e49-135b5c087014") : secret "webhook-server-cert" not found Dec 03 14:31:52 crc kubenswrapper[4751]: E1203 14:31:52.468949 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-metrics-certs podName:7a4eb3e2-25fa-43e4-9e49-135b5c087014 nodeName:}" failed. No retries permitted until 2025-12-03 14:31:54.468935714 +0000 UTC m=+1121.457290941 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-metrics-certs") pod "openstack-operator-controller-manager-7f9b9ccb84-v5t4x" (UID: "7a4eb3e2-25fa-43e4-9e49-135b5c087014") : secret "metrics-server-cert" not found Dec 03 14:31:53 crc kubenswrapper[4751]: E1203 14:31:53.304776 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-w9fcj" podUID="81e287a7-6973-4561-a67a-a8783b0cedf5" Dec 03 14:31:53 crc kubenswrapper[4751]: E1203 14:31:53.304822 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-k5j8b" podUID="75a825ba-e08d-440f-866d-d32d2ae812f1" Dec 03 14:31:53 crc kubenswrapper[4751]: E1203 14:31:53.305594 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.47:5001/openstack-k8s-operators/telemetry-operator:e39d5f8652fd394c3fcc2c0989e45436c83851fe\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling 
image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-9598fff97-l6gxb" podUID="2011fe70-e44a-4b63-8064-e3234a639fb8" Dec 03 14:31:53 crc kubenswrapper[4751]: E1203 14:31:53.306056 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lgjvh" podUID="adff5e75-192d-4a27-a477-aa74dab8dd95" Dec 03 14:31:53 crc kubenswrapper[4751]: E1203 14:31:53.306225 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-qgvqn" podUID="5950fcf6-2983-4341-ba48-12c27801a57e" Dec 03 14:31:53 crc kubenswrapper[4751]: I1203 14:31:53.792517 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a54985ea-4d23-4a65-bd1a-1c9d059ea206-cert\") pod \"infra-operator-controller-manager-57548d458d-ppb75\" (UID: \"a54985ea-4d23-4a65-bd1a-1c9d059ea206\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-ppb75" Dec 03 14:31:53 crc kubenswrapper[4751]: E1203 14:31:53.793465 4751 secret.go:188] Couldn't get 
secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 14:31:53 crc kubenswrapper[4751]: E1203 14:31:53.793526 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a54985ea-4d23-4a65-bd1a-1c9d059ea206-cert podName:a54985ea-4d23-4a65-bd1a-1c9d059ea206 nodeName:}" failed. No retries permitted until 2025-12-03 14:31:57.793510685 +0000 UTC m=+1124.781865892 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a54985ea-4d23-4a65-bd1a-1c9d059ea206-cert") pod "infra-operator-controller-manager-57548d458d-ppb75" (UID: "a54985ea-4d23-4a65-bd1a-1c9d059ea206") : secret "infra-operator-webhook-server-cert" not found Dec 03 14:31:54 crc kubenswrapper[4751]: I1203 14:31:54.199560 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17b09c23-21ca-4060-840d-acbf71e22d55-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b\" (UID: \"17b09c23-21ca-4060-840d-acbf71e22d55\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b" Dec 03 14:31:54 crc kubenswrapper[4751]: E1203 14:31:54.199730 4751 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 14:31:54 crc kubenswrapper[4751]: E1203 14:31:54.199801 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17b09c23-21ca-4060-840d-acbf71e22d55-cert podName:17b09c23-21ca-4060-840d-acbf71e22d55 nodeName:}" failed. No retries permitted until 2025-12-03 14:31:58.19978222 +0000 UTC m=+1125.188137437 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/17b09c23-21ca-4060-840d-acbf71e22d55-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b" (UID: "17b09c23-21ca-4060-840d-acbf71e22d55") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 14:31:54 crc kubenswrapper[4751]: I1203 14:31:54.503633 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-metrics-certs\") pod \"openstack-operator-controller-manager-7f9b9ccb84-v5t4x\" (UID: \"7a4eb3e2-25fa-43e4-9e49-135b5c087014\") " pod="openstack-operators/openstack-operator-controller-manager-7f9b9ccb84-v5t4x" Dec 03 14:31:54 crc kubenswrapper[4751]: I1203 14:31:54.503730 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-webhook-certs\") pod \"openstack-operator-controller-manager-7f9b9ccb84-v5t4x\" (UID: \"7a4eb3e2-25fa-43e4-9e49-135b5c087014\") " pod="openstack-operators/openstack-operator-controller-manager-7f9b9ccb84-v5t4x" Dec 03 14:31:54 crc kubenswrapper[4751]: E1203 14:31:54.503765 4751 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 14:31:54 crc kubenswrapper[4751]: E1203 14:31:54.503822 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-metrics-certs podName:7a4eb3e2-25fa-43e4-9e49-135b5c087014 nodeName:}" failed. No retries permitted until 2025-12-03 14:31:58.503807078 +0000 UTC m=+1125.492162295 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-metrics-certs") pod "openstack-operator-controller-manager-7f9b9ccb84-v5t4x" (UID: "7a4eb3e2-25fa-43e4-9e49-135b5c087014") : secret "metrics-server-cert" not found Dec 03 14:31:54 crc kubenswrapper[4751]: E1203 14:31:54.503845 4751 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 14:31:54 crc kubenswrapper[4751]: E1203 14:31:54.503876 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-webhook-certs podName:7a4eb3e2-25fa-43e4-9e49-135b5c087014 nodeName:}" failed. No retries permitted until 2025-12-03 14:31:58.50386667 +0000 UTC m=+1125.492221887 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-webhook-certs") pod "openstack-operator-controller-manager-7f9b9ccb84-v5t4x" (UID: "7a4eb3e2-25fa-43e4-9e49-135b5c087014") : secret "webhook-server-cert" not found Dec 03 14:31:57 crc kubenswrapper[4751]: I1203 14:31:57.851865 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a54985ea-4d23-4a65-bd1a-1c9d059ea206-cert\") pod \"infra-operator-controller-manager-57548d458d-ppb75\" (UID: \"a54985ea-4d23-4a65-bd1a-1c9d059ea206\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-ppb75" Dec 03 14:31:57 crc kubenswrapper[4751]: E1203 14:31:57.852089 4751 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 14:31:57 crc kubenswrapper[4751]: E1203 14:31:57.853391 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a54985ea-4d23-4a65-bd1a-1c9d059ea206-cert 
podName:a54985ea-4d23-4a65-bd1a-1c9d059ea206 nodeName:}" failed. No retries permitted until 2025-12-03 14:32:05.8533667 +0000 UTC m=+1132.841721917 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a54985ea-4d23-4a65-bd1a-1c9d059ea206-cert") pod "infra-operator-controller-manager-57548d458d-ppb75" (UID: "a54985ea-4d23-4a65-bd1a-1c9d059ea206") : secret "infra-operator-webhook-server-cert" not found Dec 03 14:31:58 crc kubenswrapper[4751]: I1203 14:31:58.263100 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17b09c23-21ca-4060-840d-acbf71e22d55-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b\" (UID: \"17b09c23-21ca-4060-840d-acbf71e22d55\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b" Dec 03 14:31:58 crc kubenswrapper[4751]: E1203 14:31:58.263352 4751 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 14:31:58 crc kubenswrapper[4751]: E1203 14:31:58.263619 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17b09c23-21ca-4060-840d-acbf71e22d55-cert podName:17b09c23-21ca-4060-840d-acbf71e22d55 nodeName:}" failed. No retries permitted until 2025-12-03 14:32:06.263603175 +0000 UTC m=+1133.251958392 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/17b09c23-21ca-4060-840d-acbf71e22d55-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b" (UID: "17b09c23-21ca-4060-840d-acbf71e22d55") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 14:31:58 crc kubenswrapper[4751]: I1203 14:31:58.568447 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-metrics-certs\") pod \"openstack-operator-controller-manager-7f9b9ccb84-v5t4x\" (UID: \"7a4eb3e2-25fa-43e4-9e49-135b5c087014\") " pod="openstack-operators/openstack-operator-controller-manager-7f9b9ccb84-v5t4x" Dec 03 14:31:58 crc kubenswrapper[4751]: I1203 14:31:58.568545 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-webhook-certs\") pod \"openstack-operator-controller-manager-7f9b9ccb84-v5t4x\" (UID: \"7a4eb3e2-25fa-43e4-9e49-135b5c087014\") " pod="openstack-operators/openstack-operator-controller-manager-7f9b9ccb84-v5t4x" Dec 03 14:31:58 crc kubenswrapper[4751]: E1203 14:31:58.568832 4751 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 14:31:58 crc kubenswrapper[4751]: E1203 14:31:58.568954 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-metrics-certs podName:7a4eb3e2-25fa-43e4-9e49-135b5c087014 nodeName:}" failed. No retries permitted until 2025-12-03 14:32:06.568900488 +0000 UTC m=+1133.557255705 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-metrics-certs") pod "openstack-operator-controller-manager-7f9b9ccb84-v5t4x" (UID: "7a4eb3e2-25fa-43e4-9e49-135b5c087014") : secret "metrics-server-cert" not found Dec 03 14:31:58 crc kubenswrapper[4751]: E1203 14:31:58.568984 4751 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 14:31:58 crc kubenswrapper[4751]: E1203 14:31:58.569049 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-webhook-certs podName:7a4eb3e2-25fa-43e4-9e49-135b5c087014 nodeName:}" failed. No retries permitted until 2025-12-03 14:32:06.569030461 +0000 UTC m=+1133.557385778 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-webhook-certs") pod "openstack-operator-controller-manager-7f9b9ccb84-v5t4x" (UID: "7a4eb3e2-25fa-43e4-9e49-135b5c087014") : secret "webhook-server-cert" not found Dec 03 14:32:05 crc kubenswrapper[4751]: E1203 14:32:05.223648 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5" Dec 03 14:32:05 crc kubenswrapper[4751]: E1203 14:32:05.224545 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2rbm4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-8hhdk_openstack-operators(b5d6b394-fe97-4e70-9916-9c6791379931): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:32:05 crc kubenswrapper[4751]: I1203 14:32:05.880460 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a54985ea-4d23-4a65-bd1a-1c9d059ea206-cert\") pod \"infra-operator-controller-manager-57548d458d-ppb75\" (UID: \"a54985ea-4d23-4a65-bd1a-1c9d059ea206\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-ppb75" Dec 03 14:32:05 crc kubenswrapper[4751]: I1203 14:32:05.889817 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a54985ea-4d23-4a65-bd1a-1c9d059ea206-cert\") pod \"infra-operator-controller-manager-57548d458d-ppb75\" (UID: \"a54985ea-4d23-4a65-bd1a-1c9d059ea206\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-ppb75" Dec 03 14:32:06 crc kubenswrapper[4751]: I1203 14:32:06.174705 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-ppb75" Dec 03 14:32:06 crc kubenswrapper[4751]: I1203 14:32:06.285804 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17b09c23-21ca-4060-840d-acbf71e22d55-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b\" (UID: \"17b09c23-21ca-4060-840d-acbf71e22d55\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b" Dec 03 14:32:06 crc kubenswrapper[4751]: I1203 14:32:06.290966 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17b09c23-21ca-4060-840d-acbf71e22d55-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b\" (UID: \"17b09c23-21ca-4060-840d-acbf71e22d55\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b" Dec 03 14:32:06 crc kubenswrapper[4751]: I1203 14:32:06.392854 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b" Dec 03 14:32:06 crc kubenswrapper[4751]: I1203 14:32:06.591130 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-metrics-certs\") pod \"openstack-operator-controller-manager-7f9b9ccb84-v5t4x\" (UID: \"7a4eb3e2-25fa-43e4-9e49-135b5c087014\") " pod="openstack-operators/openstack-operator-controller-manager-7f9b9ccb84-v5t4x" Dec 03 14:32:06 crc kubenswrapper[4751]: E1203 14:32:06.591320 4751 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 14:32:06 crc kubenswrapper[4751]: I1203 14:32:06.591858 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-webhook-certs\") pod \"openstack-operator-controller-manager-7f9b9ccb84-v5t4x\" (UID: \"7a4eb3e2-25fa-43e4-9e49-135b5c087014\") " pod="openstack-operators/openstack-operator-controller-manager-7f9b9ccb84-v5t4x" Dec 03 14:32:06 crc kubenswrapper[4751]: E1203 14:32:06.591898 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-metrics-certs podName:7a4eb3e2-25fa-43e4-9e49-135b5c087014 nodeName:}" failed. No retries permitted until 2025-12-03 14:32:22.591879992 +0000 UTC m=+1149.580235209 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-metrics-certs") pod "openstack-operator-controller-manager-7f9b9ccb84-v5t4x" (UID: "7a4eb3e2-25fa-43e4-9e49-135b5c087014") : secret "metrics-server-cert" not found Dec 03 14:32:06 crc kubenswrapper[4751]: E1203 14:32:06.591982 4751 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 14:32:06 crc kubenswrapper[4751]: E1203 14:32:06.592025 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-webhook-certs podName:7a4eb3e2-25fa-43e4-9e49-135b5c087014 nodeName:}" failed. No retries permitted until 2025-12-03 14:32:22.592016825 +0000 UTC m=+1149.580372042 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-webhook-certs") pod "openstack-operator-controller-manager-7f9b9ccb84-v5t4x" (UID: "7a4eb3e2-25fa-43e4-9e49-135b5c087014") : secret "webhook-server-cert" not found Dec 03 14:32:14 crc kubenswrapper[4751]: E1203 14:32:14.072026 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801" Dec 03 14:32:14 crc kubenswrapper[4751]: E1203 14:32:14.073062 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ll5m5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-859b6ccc6-7422h_openstack-operators(b112bf8e-175b-4bc3-9840-6d134b4a1bce): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:32:14 crc kubenswrapper[4751]: E1203 14:32:14.566583 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:f6059a0fbf031d34dcf086d14ce8c0546caeaee23c5780e90b5037c5feee9fea" Dec 03 14:32:14 crc kubenswrapper[4751]: E1203 14:32:14.566814 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:f6059a0fbf031d34dcf086d14ce8c0546caeaee23c5780e90b5037c5feee9fea,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5c6qf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-7d9dfd778-qswcx_openstack-operators(26689286-a791-485e-b442-9e399ae7a79b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:32:15 crc kubenswrapper[4751]: E1203 14:32:15.150858 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f" Dec 03 14:32:15 crc kubenswrapper[4751]: E1203 14:32:15.151099 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kjv5j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-gsp46_openstack-operators(5434f233-b204-4db9-a93d-93d4342e4514): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:32:15 crc kubenswrapper[4751]: E1203 14:32:15.910245 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557" Dec 03 14:32:15 crc kubenswrapper[4751]: E1203 14:32:15.910819 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4xhrx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-sxl98_openstack-operators(4cd34243-8404-4cf7-9185-c012700b5814): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:32:16 crc kubenswrapper[4751]: E1203 14:32:16.477023 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7" Dec 03 14:32:16 crc kubenswrapper[4751]: E1203 14:32:16.477219 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-phn2h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-4nvpk_openstack-operators(b4cb50e3-a93e-49b0-ac9c-6551046dc0be): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:32:17 crc kubenswrapper[4751]: E1203 14:32:17.167580 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:abdb733b01e92ac17f565762f30f1d075b44c16421bd06e557f6bb3c319e1809" Dec 03 14:32:17 crc kubenswrapper[4751]: E1203 14:32:17.167777 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:abdb733b01e92ac17f565762f30f1d075b44c16421bd06e557f6bb3c319e1809,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mndwk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-77987cd8cd-6shnx_openstack-operators(a975003d-b7d2-4a95-8571-571bc082021d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:32:17 crc kubenswrapper[4751]: E1203 14:32:17.800653 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 03 14:32:17 crc kubenswrapper[4751]: E1203 14:32:17.800907 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gt594,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-wgjr8_openstack-operators(7f29786e-1f3c-4c92-81ac-4b6110cf03a3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:32:19 crc kubenswrapper[4751]: E1203 14:32:19.123892 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168" Dec 03 14:32:19 crc kubenswrapper[4751]: E1203 14:32:19.124430 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9597f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-8h62v_openstack-operators(5c3add92-6cee-4980-903f-692cfd4cf87c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:32:19 crc kubenswrapper[4751]: E1203 14:32:19.682635 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 03 14:32:19 crc kubenswrapper[4751]: E1203 14:32:19.683150 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hsf25,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-7tsfx_openstack-operators(cbafd52a-d603-4b8c-a056-9a2a749bee21): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:32:20 crc kubenswrapper[4751]: E1203 14:32:20.078313 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 03 14:32:20 crc kubenswrapper[4751]: E1203 14:32:20.078497 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-k7qlr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-4thwz_openstack-operators(ac5fb8ca-3372-4c92-a4d2-9ff4b543f94d): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:32:20 crc kubenswrapper[4751]: E1203 14:32:20.079593 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4thwz" podUID="ac5fb8ca-3372-4c92-a4d2-9ff4b543f94d" Dec 03 14:32:20 crc kubenswrapper[4751]: E1203 14:32:20.510447 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4thwz" podUID="ac5fb8ca-3372-4c92-a4d2-9ff4b543f94d" Dec 03 14:32:22 crc kubenswrapper[4751]: I1203 14:32:22.627245 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-metrics-certs\") pod \"openstack-operator-controller-manager-7f9b9ccb84-v5t4x\" (UID: \"7a4eb3e2-25fa-43e4-9e49-135b5c087014\") " pod="openstack-operators/openstack-operator-controller-manager-7f9b9ccb84-v5t4x" Dec 03 14:32:22 crc kubenswrapper[4751]: I1203 14:32:22.627618 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-webhook-certs\") pod \"openstack-operator-controller-manager-7f9b9ccb84-v5t4x\" (UID: \"7a4eb3e2-25fa-43e4-9e49-135b5c087014\") " pod="openstack-operators/openstack-operator-controller-manager-7f9b9ccb84-v5t4x" Dec 03 14:32:22 crc kubenswrapper[4751]: I1203 14:32:22.633691 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-metrics-certs\") pod \"openstack-operator-controller-manager-7f9b9ccb84-v5t4x\" (UID: \"7a4eb3e2-25fa-43e4-9e49-135b5c087014\") " pod="openstack-operators/openstack-operator-controller-manager-7f9b9ccb84-v5t4x" Dec 03 14:32:22 crc kubenswrapper[4751]: I1203 14:32:22.633737 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7a4eb3e2-25fa-43e4-9e49-135b5c087014-webhook-certs\") pod \"openstack-operator-controller-manager-7f9b9ccb84-v5t4x\" (UID: \"7a4eb3e2-25fa-43e4-9e49-135b5c087014\") " pod="openstack-operators/openstack-operator-controller-manager-7f9b9ccb84-v5t4x" Dec 03 14:32:22 crc kubenswrapper[4751]: I1203 14:32:22.856648 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-jhc4l" Dec 03 14:32:22 crc kubenswrapper[4751]: I1203 14:32:22.865086 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7f9b9ccb84-v5t4x" Dec 03 14:32:23 crc kubenswrapper[4751]: I1203 14:32:23.174059 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b"] Dec 03 14:32:23 crc kubenswrapper[4751]: I1203 14:32:23.368684 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-ppb75"] Dec 03 14:32:23 crc kubenswrapper[4751]: W1203 14:32:23.389915 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda54985ea_4d23_4a65_bd1a_1c9d059ea206.slice/crio-deb536a34489908d25e749a689323689b970fbe166395438d53a679532bea4a3 WatchSource:0}: Error finding container deb536a34489908d25e749a689323689b970fbe166395438d53a679532bea4a3: Status 404 returned error can't find the container with id deb536a34489908d25e749a689323689b970fbe166395438d53a679532bea4a3 Dec 03 14:32:23 crc kubenswrapper[4751]: I1203 14:32:23.550373 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b" event={"ID":"17b09c23-21ca-4060-840d-acbf71e22d55","Type":"ContainerStarted","Data":"c8b8b49c3d555a42176ad7834a88c4a5602c86bfe5bbfe490008ec5990c4782a"} Dec 03 14:32:23 crc kubenswrapper[4751]: I1203 14:32:23.557944 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-m4q9z" event={"ID":"ff012f7f-3431-472a-8b44-1fa7a47e74e1","Type":"ContainerStarted","Data":"eb5a98f0e666c43aab4c60438de3a80aac74494f70fbe5b0cfde4f7af51039ad"} Dec 03 14:32:23 crc kubenswrapper[4751]: I1203 14:32:23.559024 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-ppb75" 
event={"ID":"a54985ea-4d23-4a65-bd1a-1c9d059ea206","Type":"ContainerStarted","Data":"deb536a34489908d25e749a689323689b970fbe166395438d53a679532bea4a3"} Dec 03 14:32:23 crc kubenswrapper[4751]: I1203 14:32:23.913379 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7f9b9ccb84-v5t4x"] Dec 03 14:32:23 crc kubenswrapper[4751]: W1203 14:32:23.975617 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a4eb3e2_25fa_43e4_9e49_135b5c087014.slice/crio-4ca4ad4080e4de1736224594fd00fdcdadcd130fe539d2ae450141fe3d6ac9e3 WatchSource:0}: Error finding container 4ca4ad4080e4de1736224594fd00fdcdadcd130fe539d2ae450141fe3d6ac9e3: Status 404 returned error can't find the container with id 4ca4ad4080e4de1736224594fd00fdcdadcd130fe539d2ae450141fe3d6ac9e3 Dec 03 14:32:24 crc kubenswrapper[4751]: I1203 14:32:24.580775 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-pxjk5" event={"ID":"2f31e262-8f03-4689-bc29-5d9d8b33a2cc","Type":"ContainerStarted","Data":"6ca05e111868ddd162fb73ecf1ab34dde71e530e27363ba2c3c20ffa9e1d8aa6"} Dec 03 14:32:24 crc kubenswrapper[4751]: I1203 14:32:24.582803 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lgjvh" event={"ID":"adff5e75-192d-4a27-a477-aa74dab8dd95","Type":"ContainerStarted","Data":"fe8b7049ab456064b1692afa14ec9956d6d38613968c83901c087ceeff25407e"} Dec 03 14:32:24 crc kubenswrapper[4751]: I1203 14:32:24.585303 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-qgvqn" event={"ID":"5950fcf6-2983-4341-ba48-12c27801a57e","Type":"ContainerStarted","Data":"4f83e4b2c7d14e72772745cd4616e5ac62103ff278928cf96d604819edb8025e"} Dec 03 14:32:24 crc kubenswrapper[4751]: I1203 14:32:24.587907 
4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-zjnwr" event={"ID":"8adbadf1-f21d-4a09-acf7-d44a87bee356","Type":"ContainerStarted","Data":"14cdaa72f5b8f27246bf2abddcb3690ccc055c1e00140dc23a3b81ad4c5f56ae"} Dec 03 14:32:24 crc kubenswrapper[4751]: I1203 14:32:24.593347 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-n8td8" event={"ID":"618af04c-a37d-4d21-bdba-345c9a63be07","Type":"ContainerStarted","Data":"36d018f6edd252c8a84698772e14a9d681b2c96dfbb47572fee608c72bc0d6ef"} Dec 03 14:32:24 crc kubenswrapper[4751]: I1203 14:32:24.594857 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7f9b9ccb84-v5t4x" event={"ID":"7a4eb3e2-25fa-43e4-9e49-135b5c087014","Type":"ContainerStarted","Data":"4ca4ad4080e4de1736224594fd00fdcdadcd130fe539d2ae450141fe3d6ac9e3"} Dec 03 14:32:24 crc kubenswrapper[4751]: I1203 14:32:24.596511 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-w9fcj" event={"ID":"81e287a7-6973-4561-a67a-a8783b0cedf5","Type":"ContainerStarted","Data":"21bcea4899fe7902f914962ebf376b4d783366d347ef0e94eda739a0900b8612"} Dec 03 14:32:27 crc kubenswrapper[4751]: I1203 14:32:27.622716 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-k5j8b" event={"ID":"75a825ba-e08d-440f-866d-d32d2ae812f1","Type":"ContainerStarted","Data":"2b7c5c884786fcef62787579cba5cbe3df927c9e4f2987f394df24090803e9c1"} Dec 03 14:32:27 crc kubenswrapper[4751]: E1203 14:32:27.935117 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-sxl98" podUID="4cd34243-8404-4cf7-9185-c012700b5814" Dec 03 14:32:27 crc kubenswrapper[4751]: E1203 14:32:27.955648 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qswcx" podUID="26689286-a791-485e-b442-9e399ae7a79b" Dec 03 14:32:27 crc kubenswrapper[4751]: E1203 14:32:27.956086 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-7422h" podUID="b112bf8e-175b-4bc3-9840-6d134b4a1bce" Dec 03 14:32:27 crc kubenswrapper[4751]: E1203 14:32:27.956538 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4nvpk" podUID="b4cb50e3-a93e-49b0-ac9c-6551046dc0be" Dec 03 14:32:27 crc kubenswrapper[4751]: E1203 14:32:27.979699 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-78f8948974-gsp46" podUID="5434f233-b204-4db9-a93d-93d4342e4514" Dec 03 14:32:28 crc kubenswrapper[4751]: E1203 14:32:28.324634 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-wgjr8" podUID="7f29786e-1f3c-4c92-81ac-4b6110cf03a3" Dec 03 14:32:28 crc kubenswrapper[4751]: E1203 14:32:28.612831 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8hhdk" podUID="b5d6b394-fe97-4e70-9916-9c6791379931" Dec 03 14:32:28 crc kubenswrapper[4751]: E1203 14:32:28.631782 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-6shnx" podUID="a975003d-b7d2-4a95-8571-571bc082021d" Dec 03 14:32:28 crc kubenswrapper[4751]: I1203 14:32:28.647715 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-wgjr8" event={"ID":"7f29786e-1f3c-4c92-81ac-4b6110cf03a3","Type":"ContainerStarted","Data":"868e6b060a4f059ba2a60ce0407234e457f568e2e4ed7a2ce5fb4317cc540c01"} Dec 03 14:32:28 crc kubenswrapper[4751]: I1203 14:32:28.657287 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-k5j8b" event={"ID":"75a825ba-e08d-440f-866d-d32d2ae812f1","Type":"ContainerStarted","Data":"3b9593304bb605327732f76975403f3844a85da694c101ddc3c9441e68ef20ef"} Dec 03 14:32:28 crc kubenswrapper[4751]: I1203 14:32:28.658057 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-k5j8b" Dec 03 14:32:28 crc kubenswrapper[4751]: I1203 14:32:28.674249 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-gsp46" 
event={"ID":"5434f233-b204-4db9-a93d-93d4342e4514","Type":"ContainerStarted","Data":"90294e55f141580c04b17f38e1184962bc8955ffecfebc1f6ffb5fd471305cad"} Dec 03 14:32:28 crc kubenswrapper[4751]: I1203 14:32:28.679937 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-ppb75" event={"ID":"a54985ea-4d23-4a65-bd1a-1c9d059ea206","Type":"ContainerStarted","Data":"8c50f19098317dc8688c3008ea3adcdfe408ffb548bd1c5b5828bcff26af47e4"} Dec 03 14:32:28 crc kubenswrapper[4751]: I1203 14:32:28.690467 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4nvpk" event={"ID":"b4cb50e3-a93e-49b0-ac9c-6551046dc0be","Type":"ContainerStarted","Data":"623111295a9f532ed990f1724d3619ce4e9070c645e89b592469e0bf40e5b7f9"} Dec 03 14:32:28 crc kubenswrapper[4751]: I1203 14:32:28.696952 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-k5j8b" podStartSLOduration=7.219222686 podStartE2EDuration="38.696938187s" podCreationTimestamp="2025-12-03 14:31:50 +0000 UTC" firstStartedPulling="2025-12-03 14:31:51.775818805 +0000 UTC m=+1118.764174022" lastFinishedPulling="2025-12-03 14:32:23.253534306 +0000 UTC m=+1150.241889523" observedRunningTime="2025-12-03 14:32:28.696497585 +0000 UTC m=+1155.684852812" watchObservedRunningTime="2025-12-03 14:32:28.696938187 +0000 UTC m=+1155.685293404" Dec 03 14:32:28 crc kubenswrapper[4751]: I1203 14:32:28.699261 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-7422h" event={"ID":"b112bf8e-175b-4bc3-9840-6d134b4a1bce","Type":"ContainerStarted","Data":"636b16bd0d4894c34dd3a114f98b46233cca008880bfe5f28a81e1317e68b142"} Dec 03 14:32:28 crc kubenswrapper[4751]: I1203 14:32:28.713409 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b" event={"ID":"17b09c23-21ca-4060-840d-acbf71e22d55","Type":"ContainerStarted","Data":"d1e026c62840b64d58003f2b576df3b6902ecbfa981988e00f836f33f3f4dd1f"} Dec 03 14:32:28 crc kubenswrapper[4751]: I1203 14:32:28.714561 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8hhdk" event={"ID":"b5d6b394-fe97-4e70-9916-9c6791379931","Type":"ContainerStarted","Data":"7e1e06c287c5492559f7b5625d8fcc7a8b89ef4a3acf8ee399b5cac9eaff59d1"} Dec 03 14:32:28 crc kubenswrapper[4751]: I1203 14:32:28.718012 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7f9b9ccb84-v5t4x" event={"ID":"7a4eb3e2-25fa-43e4-9e49-135b5c087014","Type":"ContainerStarted","Data":"06afaaf0933b403b66210aaf605e6457432dd19d4336e621d8ee54f888428415"} Dec 03 14:32:28 crc kubenswrapper[4751]: I1203 14:32:28.727309 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7f9b9ccb84-v5t4x" Dec 03 14:32:28 crc kubenswrapper[4751]: I1203 14:32:28.739230 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-6shnx" event={"ID":"a975003d-b7d2-4a95-8571-571bc082021d","Type":"ContainerStarted","Data":"587730413e3608e8b4f7b6613e62e4abfa2a2dec87a740f9891c673807c65bd5"} Dec 03 14:32:28 crc kubenswrapper[4751]: I1203 14:32:28.757244 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-sxl98" event={"ID":"4cd34243-8404-4cf7-9185-c012700b5814","Type":"ContainerStarted","Data":"fefb1d378c3a8df8e6bb1283523597100d2ada3ee0734f7690a34e703cbba09a"} Dec 03 14:32:28 crc kubenswrapper[4751]: I1203 14:32:28.778189 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-9598fff97-l6gxb" event={"ID":"2011fe70-e44a-4b63-8064-e3234a639fb8","Type":"ContainerStarted","Data":"6a13661b048af537ffbaa224ba0075f7309d1087e381285638a6347b1a8ed658"} Dec 03 14:32:28 crc kubenswrapper[4751]: I1203 14:32:28.805222 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qswcx" event={"ID":"26689286-a791-485e-b442-9e399ae7a79b","Type":"ContainerStarted","Data":"5149a0ec0bec5274cfbaeb34b30a83a1bd2e773dd98abcacf309050fe5f8ffb3"} Dec 03 14:32:28 crc kubenswrapper[4751]: I1203 14:32:28.853682 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7f9b9ccb84-v5t4x" podStartSLOduration=38.853656576 podStartE2EDuration="38.853656576s" podCreationTimestamp="2025-12-03 14:31:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:32:28.850697624 +0000 UTC m=+1155.839052851" watchObservedRunningTime="2025-12-03 14:32:28.853656576 +0000 UTC m=+1155.842011793" Dec 03 14:32:29 crc kubenswrapper[4751]: E1203 14:32:29.149884 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7tsfx" podUID="cbafd52a-d603-4b8c-a056-9a2a749bee21" Dec 03 14:32:29 crc kubenswrapper[4751]: E1203 14:32:29.179763 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-8h62v" podUID="5c3add92-6cee-4980-903f-692cfd4cf87c" Dec 03 14:32:29 crc kubenswrapper[4751]: 
I1203 14:32:29.832787 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-7422h" event={"ID":"b112bf8e-175b-4bc3-9840-6d134b4a1bce","Type":"ContainerStarted","Data":"dac93c6cc331213d0216fe0de638db6a22e4973a9e74393e7271ba91bbb7cb3f"} Dec 03 14:32:29 crc kubenswrapper[4751]: I1203 14:32:29.834049 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-7422h" Dec 03 14:32:29 crc kubenswrapper[4751]: I1203 14:32:29.836093 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-n8td8" event={"ID":"618af04c-a37d-4d21-bdba-345c9a63be07","Type":"ContainerStarted","Data":"566687ef6d0e251de47563ca092777aec5a8340a0fa3b52ce0159a4cabb2299e"} Dec 03 14:32:29 crc kubenswrapper[4751]: I1203 14:32:29.837080 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-n8td8" Dec 03 14:32:29 crc kubenswrapper[4751]: I1203 14:32:29.843091 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-n8td8" Dec 03 14:32:29 crc kubenswrapper[4751]: I1203 14:32:29.854862 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-zjnwr" event={"ID":"8adbadf1-f21d-4a09-acf7-d44a87bee356","Type":"ContainerStarted","Data":"aa3719c1d0f807390986354761013997be61d07392f2d55f817c3b2b49786ce3"} Dec 03 14:32:29 crc kubenswrapper[4751]: I1203 14:32:29.856232 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-zjnwr" Dec 03 14:32:29 crc kubenswrapper[4751]: I1203 14:32:29.861663 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-zjnwr" Dec 03 14:32:29 crc kubenswrapper[4751]: I1203 14:32:29.870615 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-6shnx" event={"ID":"a975003d-b7d2-4a95-8571-571bc082021d","Type":"ContainerStarted","Data":"52b2859ba1722d8614baeb11f468ce9b5c8258d53df918dcc43ac719cbd92eca"} Dec 03 14:32:29 crc kubenswrapper[4751]: I1203 14:32:29.870680 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-6shnx" Dec 03 14:32:29 crc kubenswrapper[4751]: I1203 14:32:29.871171 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-7422h" podStartSLOduration=2.45774183 podStartE2EDuration="40.871154235s" podCreationTimestamp="2025-12-03 14:31:49 +0000 UTC" firstStartedPulling="2025-12-03 14:31:50.889975454 +0000 UTC m=+1117.878330671" lastFinishedPulling="2025-12-03 14:32:29.303387859 +0000 UTC m=+1156.291743076" observedRunningTime="2025-12-03 14:32:29.870054894 +0000 UTC m=+1156.858410111" watchObservedRunningTime="2025-12-03 14:32:29.871154235 +0000 UTC m=+1156.859509452" Dec 03 14:32:29 crc kubenswrapper[4751]: I1203 14:32:29.882161 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-w9fcj" event={"ID":"81e287a7-6973-4561-a67a-a8783b0cedf5","Type":"ContainerStarted","Data":"88a572f3685579f6f96d6e6355c26f4488a0e1eb38df5d26bedb73338c540dd5"} Dec 03 14:32:29 crc kubenswrapper[4751]: I1203 14:32:29.883138 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-w9fcj" Dec 03 14:32:29 crc kubenswrapper[4751]: I1203 14:32:29.893551 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-zjnwr" podStartSLOduration=4.677068957 podStartE2EDuration="40.893534641s" podCreationTimestamp="2025-12-03 14:31:49 +0000 UTC" firstStartedPulling="2025-12-03 14:31:51.416505584 +0000 UTC m=+1118.404860801" lastFinishedPulling="2025-12-03 14:32:27.632971268 +0000 UTC m=+1154.621326485" observedRunningTime="2025-12-03 14:32:29.892706538 +0000 UTC m=+1156.881061755" watchObservedRunningTime="2025-12-03 14:32:29.893534641 +0000 UTC m=+1156.881889868" Dec 03 14:32:29 crc kubenswrapper[4751]: I1203 14:32:29.897650 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-w9fcj" Dec 03 14:32:29 crc kubenswrapper[4751]: I1203 14:32:29.906118 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4nvpk" Dec 03 14:32:29 crc kubenswrapper[4751]: I1203 14:32:29.921001 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-n8td8" podStartSLOduration=4.662561737 podStartE2EDuration="40.920985318s" podCreationTimestamp="2025-12-03 14:31:49 +0000 UTC" firstStartedPulling="2025-12-03 14:31:51.354163806 +0000 UTC m=+1118.342519023" lastFinishedPulling="2025-12-03 14:32:27.612587387 +0000 UTC m=+1154.600942604" observedRunningTime="2025-12-03 14:32:29.917592434 +0000 UTC m=+1156.905947651" watchObservedRunningTime="2025-12-03 14:32:29.920985318 +0000 UTC m=+1156.909340535" Dec 03 14:32:29 crc kubenswrapper[4751]: I1203 14:32:29.924493 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-wgjr8" event={"ID":"7f29786e-1f3c-4c92-81ac-4b6110cf03a3","Type":"ContainerStarted","Data":"292dc52e18a4f8d88d22dec95540735323d7d1104ef0c178193c8a5d0907f5dd"} Dec 03 14:32:29 crc 
kubenswrapper[4751]: I1203 14:32:29.925223 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-wgjr8" Dec 03 14:32:29 crc kubenswrapper[4751]: I1203 14:32:29.939718 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lgjvh" event={"ID":"adff5e75-192d-4a27-a477-aa74dab8dd95","Type":"ContainerStarted","Data":"ee163e6fdd0a9b138b8bc351d8deef1e031a85c5fd309360af0f99cb35cd2020"} Dec 03 14:32:29 crc kubenswrapper[4751]: I1203 14:32:29.941259 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lgjvh" Dec 03 14:32:29 crc kubenswrapper[4751]: I1203 14:32:29.953568 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lgjvh" Dec 03 14:32:29 crc kubenswrapper[4751]: I1203 14:32:29.958237 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-9598fff97-l6gxb" event={"ID":"2011fe70-e44a-4b63-8064-e3234a639fb8","Type":"ContainerStarted","Data":"4feac4880cd62468b58c829ec33c81be5e9d535d979ee662a5fb7b8a29c7af0d"} Dec 03 14:32:29 crc kubenswrapper[4751]: I1203 14:32:29.958594 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-9598fff97-l6gxb" Dec 03 14:32:29 crc kubenswrapper[4751]: I1203 14:32:29.993769 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-qgvqn" event={"ID":"5950fcf6-2983-4341-ba48-12c27801a57e","Type":"ContainerStarted","Data":"dbf026f52d242125ce39c6a056d221cfefd0e84399724a91fef3e3802085260d"} Dec 03 14:32:29 crc kubenswrapper[4751]: I1203 14:32:29.995087 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-qgvqn" Dec 03 14:32:30 crc kubenswrapper[4751]: I1203 14:32:30.002222 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-6shnx" podStartSLOduration=2.778028426 podStartE2EDuration="41.002207596s" podCreationTimestamp="2025-12-03 14:31:49 +0000 UTC" firstStartedPulling="2025-12-03 14:31:51.119054767 +0000 UTC m=+1118.107409984" lastFinishedPulling="2025-12-03 14:32:29.343233937 +0000 UTC m=+1156.331589154" observedRunningTime="2025-12-03 14:32:29.9537066 +0000 UTC m=+1156.942061817" watchObservedRunningTime="2025-12-03 14:32:30.002207596 +0000 UTC m=+1156.990562813" Dec 03 14:32:30 crc kubenswrapper[4751]: I1203 14:32:30.007890 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-qgvqn" Dec 03 14:32:30 crc kubenswrapper[4751]: I1203 14:32:30.023558 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-ppb75" event={"ID":"a54985ea-4d23-4a65-bd1a-1c9d059ea206","Type":"ContainerStarted","Data":"c530359820eab2588bbf48ad0f8606dca90f46ea3f02deaaddb60560b1cf7a61"} Dec 03 14:32:30 crc kubenswrapper[4751]: I1203 14:32:30.024220 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-ppb75" Dec 03 14:32:30 crc kubenswrapper[4751]: I1203 14:32:30.043542 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-8h62v" event={"ID":"5c3add92-6cee-4980-903f-692cfd4cf87c","Type":"ContainerStarted","Data":"40279d7e67365c4fcca8eb3b158cff1b4c219e34af62e1691317aa897147c922"} Dec 03 14:32:30 crc kubenswrapper[4751]: I1203 14:32:30.044775 4751 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-w9fcj" podStartSLOduration=4.031591477 podStartE2EDuration="40.044755969s" podCreationTimestamp="2025-12-03 14:31:50 +0000 UTC" firstStartedPulling="2025-12-03 14:31:51.775781444 +0000 UTC m=+1118.764136661" lastFinishedPulling="2025-12-03 14:32:27.788945936 +0000 UTC m=+1154.777301153" observedRunningTime="2025-12-03 14:32:30.001235859 +0000 UTC m=+1156.989591096" watchObservedRunningTime="2025-12-03 14:32:30.044755969 +0000 UTC m=+1157.033111186" Dec 03 14:32:30 crc kubenswrapper[4751]: I1203 14:32:30.060880 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4nvpk" podStartSLOduration=3.064719586 podStartE2EDuration="41.060857192s" podCreationTimestamp="2025-12-03 14:31:49 +0000 UTC" firstStartedPulling="2025-12-03 14:31:51.411996729 +0000 UTC m=+1118.400351946" lastFinishedPulling="2025-12-03 14:32:29.408134335 +0000 UTC m=+1156.396489552" observedRunningTime="2025-12-03 14:32:30.044465611 +0000 UTC m=+1157.032820828" watchObservedRunningTime="2025-12-03 14:32:30.060857192 +0000 UTC m=+1157.049212409" Dec 03 14:32:30 crc kubenswrapper[4751]: I1203 14:32:30.064407 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b" event={"ID":"17b09c23-21ca-4060-840d-acbf71e22d55","Type":"ContainerStarted","Data":"81265e6c1ff0ace4233bd925d75030dfad3b4b0034219a56bc473ce74c57cfae"} Dec 03 14:32:30 crc kubenswrapper[4751]: I1203 14:32:30.065160 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b" Dec 03 14:32:30 crc kubenswrapper[4751]: I1203 14:32:30.083100 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-ppb75" 
podStartSLOduration=37.073335561 podStartE2EDuration="41.083077355s" podCreationTimestamp="2025-12-03 14:31:49 +0000 UTC" firstStartedPulling="2025-12-03 14:32:23.393114453 +0000 UTC m=+1150.381469670" lastFinishedPulling="2025-12-03 14:32:27.402856247 +0000 UTC m=+1154.391211464" observedRunningTime="2025-12-03 14:32:30.071918037 +0000 UTC m=+1157.060273254" watchObservedRunningTime="2025-12-03 14:32:30.083077355 +0000 UTC m=+1157.071432572" Dec 03 14:32:30 crc kubenswrapper[4751]: I1203 14:32:30.092050 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-m4q9z" event={"ID":"ff012f7f-3431-472a-8b44-1fa7a47e74e1","Type":"ContainerStarted","Data":"096b18475e111f907991fcf8b4c35fd1639933711f2350e395526b69020d1a3b"} Dec 03 14:32:30 crc kubenswrapper[4751]: I1203 14:32:30.093046 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-m4q9z" Dec 03 14:32:30 crc kubenswrapper[4751]: I1203 14:32:30.102515 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-m4q9z" Dec 03 14:32:30 crc kubenswrapper[4751]: I1203 14:32:30.104084 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8hhdk" event={"ID":"b5d6b394-fe97-4e70-9916-9c6791379931","Type":"ContainerStarted","Data":"0e5d451ed6f19cc61897d3f7bb18ffb4c079c4cf8f3f43c29e185da3fc567834"} Dec 03 14:32:30 crc kubenswrapper[4751]: I1203 14:32:30.104771 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8hhdk" Dec 03 14:32:30 crc kubenswrapper[4751]: I1203 14:32:30.112201 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-qgvqn" 
podStartSLOduration=4.356536869 podStartE2EDuration="40.112174586s" podCreationTimestamp="2025-12-03 14:31:50 +0000 UTC" firstStartedPulling="2025-12-03 14:31:51.767414133 +0000 UTC m=+1118.755769350" lastFinishedPulling="2025-12-03 14:32:27.52305185 +0000 UTC m=+1154.511407067" observedRunningTime="2025-12-03 14:32:30.110713136 +0000 UTC m=+1157.099068353" watchObservedRunningTime="2025-12-03 14:32:30.112174586 +0000 UTC m=+1157.100529823" Dec 03 14:32:30 crc kubenswrapper[4751]: I1203 14:32:30.119892 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-gsp46" event={"ID":"5434f233-b204-4db9-a93d-93d4342e4514","Type":"ContainerStarted","Data":"4c5aed350db83db4689181c3faf6a05c11ac00026678508a44749c514c8ee34d"} Dec 03 14:32:30 crc kubenswrapper[4751]: I1203 14:32:30.120214 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-gsp46" Dec 03 14:32:30 crc kubenswrapper[4751]: I1203 14:32:30.126943 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-sxl98" Dec 03 14:32:30 crc kubenswrapper[4751]: I1203 14:32:30.143638 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lgjvh" podStartSLOduration=4.325820524 podStartE2EDuration="40.143618263s" podCreationTimestamp="2025-12-03 14:31:50 +0000 UTC" firstStartedPulling="2025-12-03 14:31:51.883865311 +0000 UTC m=+1118.872220528" lastFinishedPulling="2025-12-03 14:32:27.70166304 +0000 UTC m=+1154.690018267" observedRunningTime="2025-12-03 14:32:30.136728573 +0000 UTC m=+1157.125083800" watchObservedRunningTime="2025-12-03 14:32:30.143618263 +0000 UTC m=+1157.131973480" Dec 03 14:32:30 crc kubenswrapper[4751]: I1203 14:32:30.162521 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-pxjk5" event={"ID":"2f31e262-8f03-4689-bc29-5d9d8b33a2cc","Type":"ContainerStarted","Data":"f442075d7369c85c37cf7bfc16a3d0120e234517c839231582928c5ad11ece9c"} Dec 03 14:32:30 crc kubenswrapper[4751]: I1203 14:32:30.162972 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-pxjk5" Dec 03 14:32:30 crc kubenswrapper[4751]: I1203 14:32:30.180471 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-pxjk5" Dec 03 14:32:30 crc kubenswrapper[4751]: I1203 14:32:30.184190 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-9598fff97-l6gxb" podStartSLOduration=8.682473498 podStartE2EDuration="40.18417098s" podCreationTimestamp="2025-12-03 14:31:50 +0000 UTC" firstStartedPulling="2025-12-03 14:31:51.766456427 +0000 UTC m=+1118.754811644" lastFinishedPulling="2025-12-03 14:32:23.268153909 +0000 UTC m=+1150.256509126" observedRunningTime="2025-12-03 14:32:30.180116299 +0000 UTC m=+1157.168471526" watchObservedRunningTime="2025-12-03 14:32:30.18417098 +0000 UTC m=+1157.172526207" Dec 03 14:32:30 crc kubenswrapper[4751]: I1203 14:32:30.208470 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7tsfx" event={"ID":"cbafd52a-d603-4b8c-a056-9a2a749bee21","Type":"ContainerStarted","Data":"b50c1545812def999f0b42b6bee1e4b05c647aedcb0d61f8aa9fb1fc4fbe6b57"} Dec 03 14:32:30 crc kubenswrapper[4751]: I1203 14:32:30.223249 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-wgjr8" podStartSLOduration=3.34937093 podStartE2EDuration="41.223230047s" podCreationTimestamp="2025-12-03 14:31:49 +0000 UTC" 
firstStartedPulling="2025-12-03 14:31:51.411997389 +0000 UTC m=+1118.400352606" lastFinishedPulling="2025-12-03 14:32:29.285856506 +0000 UTC m=+1156.274211723" observedRunningTime="2025-12-03 14:32:30.220131041 +0000 UTC m=+1157.208486268" watchObservedRunningTime="2025-12-03 14:32:30.223230047 +0000 UTC m=+1157.211585264" Dec 03 14:32:30 crc kubenswrapper[4751]: I1203 14:32:30.324947 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-gsp46" podStartSLOduration=2.947822001 podStartE2EDuration="40.324923149s" podCreationTimestamp="2025-12-03 14:31:50 +0000 UTC" firstStartedPulling="2025-12-03 14:31:51.76004726 +0000 UTC m=+1118.748402477" lastFinishedPulling="2025-12-03 14:32:29.137148408 +0000 UTC m=+1156.125503625" observedRunningTime="2025-12-03 14:32:30.318765099 +0000 UTC m=+1157.307120316" watchObservedRunningTime="2025-12-03 14:32:30.324923149 +0000 UTC m=+1157.313278366" Dec 03 14:32:30 crc kubenswrapper[4751]: I1203 14:32:30.346611 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-sxl98" podStartSLOduration=3.648409111 podStartE2EDuration="41.346590346s" podCreationTimestamp="2025-12-03 14:31:49 +0000 UTC" firstStartedPulling="2025-12-03 14:31:51.605248885 +0000 UTC m=+1118.593604102" lastFinishedPulling="2025-12-03 14:32:29.30343012 +0000 UTC m=+1156.291785337" observedRunningTime="2025-12-03 14:32:30.341770333 +0000 UTC m=+1157.330125560" watchObservedRunningTime="2025-12-03 14:32:30.346590346 +0000 UTC m=+1157.334945563" Dec 03 14:32:30 crc kubenswrapper[4751]: I1203 14:32:30.371221 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8hhdk" podStartSLOduration=3.194107361 podStartE2EDuration="41.371197074s" podCreationTimestamp="2025-12-03 14:31:49 +0000 UTC" 
firstStartedPulling="2025-12-03 14:31:51.353640871 +0000 UTC m=+1118.341996088" lastFinishedPulling="2025-12-03 14:32:29.530730584 +0000 UTC m=+1156.519085801" observedRunningTime="2025-12-03 14:32:30.367689588 +0000 UTC m=+1157.356044845" watchObservedRunningTime="2025-12-03 14:32:30.371197074 +0000 UTC m=+1157.359552291" Dec 03 14:32:30 crc kubenswrapper[4751]: I1203 14:32:30.470580 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b" podStartSLOduration=36.284053136 podStartE2EDuration="40.470558352s" podCreationTimestamp="2025-12-03 14:31:50 +0000 UTC" firstStartedPulling="2025-12-03 14:32:23.271155732 +0000 UTC m=+1150.259510949" lastFinishedPulling="2025-12-03 14:32:27.457660958 +0000 UTC m=+1154.446016165" observedRunningTime="2025-12-03 14:32:30.458927732 +0000 UTC m=+1157.447282969" watchObservedRunningTime="2025-12-03 14:32:30.470558352 +0000 UTC m=+1157.458913579" Dec 03 14:32:30 crc kubenswrapper[4751]: I1203 14:32:30.493778 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-m4q9z" podStartSLOduration=4.766329138 podStartE2EDuration="41.493757062s" podCreationTimestamp="2025-12-03 14:31:49 +0000 UTC" firstStartedPulling="2025-12-03 14:31:50.974230036 +0000 UTC m=+1117.962585253" lastFinishedPulling="2025-12-03 14:32:27.70165796 +0000 UTC m=+1154.690013177" observedRunningTime="2025-12-03 14:32:30.488433925 +0000 UTC m=+1157.476789152" watchObservedRunningTime="2025-12-03 14:32:30.493757062 +0000 UTC m=+1157.482112279" Dec 03 14:32:30 crc kubenswrapper[4751]: I1203 14:32:30.522151 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-pxjk5" podStartSLOduration=5.150704789 podStartE2EDuration="41.522129993s" podCreationTimestamp="2025-12-03 14:31:49 +0000 UTC" 
firstStartedPulling="2025-12-03 14:31:51.330309478 +0000 UTC m=+1118.318664695" lastFinishedPulling="2025-12-03 14:32:27.701734682 +0000 UTC m=+1154.690089899" observedRunningTime="2025-12-03 14:32:30.517683811 +0000 UTC m=+1157.506039028" watchObservedRunningTime="2025-12-03 14:32:30.522129993 +0000 UTC m=+1157.510485210" Dec 03 14:32:31 crc kubenswrapper[4751]: I1203 14:32:31.266515 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-8h62v" event={"ID":"5c3add92-6cee-4980-903f-692cfd4cf87c","Type":"ContainerStarted","Data":"a45d50d561a39de70071005f45b1c58232f7854f0905bcb7901b8bf2b3552b02"} Dec 03 14:32:31 crc kubenswrapper[4751]: I1203 14:32:31.266773 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-8h62v" Dec 03 14:32:31 crc kubenswrapper[4751]: I1203 14:32:31.272151 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7tsfx" event={"ID":"cbafd52a-d603-4b8c-a056-9a2a749bee21","Type":"ContainerStarted","Data":"57112e8a99e5c73802fc7f7e5d9ad361b1e655e81c230e87be87834b029d64e0"} Dec 03 14:32:31 crc kubenswrapper[4751]: I1203 14:32:31.272400 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7tsfx" Dec 03 14:32:31 crc kubenswrapper[4751]: I1203 14:32:31.275566 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4nvpk" event={"ID":"b4cb50e3-a93e-49b0-ac9c-6551046dc0be","Type":"ContainerStarted","Data":"ce68e20bd3fa16ecbdffdef202dff6889d417fda894383ed0a398df3a163feb6"} Dec 03 14:32:31 crc kubenswrapper[4751]: I1203 14:32:31.276815 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qswcx" 
event={"ID":"26689286-a791-485e-b442-9e399ae7a79b","Type":"ContainerStarted","Data":"c426ccf6ef9e946b08771051d506522a133533f4c218effd892125750fb44426"} Dec 03 14:32:31 crc kubenswrapper[4751]: I1203 14:32:31.277589 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qswcx" Dec 03 14:32:31 crc kubenswrapper[4751]: I1203 14:32:31.282543 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-sxl98" event={"ID":"4cd34243-8404-4cf7-9185-c012700b5814","Type":"ContainerStarted","Data":"5be1475584bb3cef43d9de2948139ef96b13585efad3a94796faf69451e80e97"} Dec 03 14:32:31 crc kubenswrapper[4751]: I1203 14:32:31.298182 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-8h62v" podStartSLOduration=2.416415346 podStartE2EDuration="41.298164947s" podCreationTimestamp="2025-12-03 14:31:50 +0000 UTC" firstStartedPulling="2025-12-03 14:31:51.58948339 +0000 UTC m=+1118.577838607" lastFinishedPulling="2025-12-03 14:32:30.471232991 +0000 UTC m=+1157.459588208" observedRunningTime="2025-12-03 14:32:31.293622642 +0000 UTC m=+1158.281977859" watchObservedRunningTime="2025-12-03 14:32:31.298164947 +0000 UTC m=+1158.286520164" Dec 03 14:32:31 crc kubenswrapper[4751]: I1203 14:32:31.335753 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qswcx" podStartSLOduration=4.019038864 podStartE2EDuration="42.335737713s" podCreationTimestamp="2025-12-03 14:31:49 +0000 UTC" firstStartedPulling="2025-12-03 14:31:51.329751723 +0000 UTC m=+1118.318106940" lastFinishedPulling="2025-12-03 14:32:29.646450572 +0000 UTC m=+1156.634805789" observedRunningTime="2025-12-03 14:32:31.318791386 +0000 UTC m=+1158.307146603" watchObservedRunningTime="2025-12-03 
14:32:31.335737713 +0000 UTC m=+1158.324092930" Dec 03 14:32:31 crc kubenswrapper[4751]: I1203 14:32:31.348484 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7tsfx" podStartSLOduration=2.108770229 podStartE2EDuration="41.348453213s" podCreationTimestamp="2025-12-03 14:31:50 +0000 UTC" firstStartedPulling="2025-12-03 14:31:51.633463202 +0000 UTC m=+1118.621818419" lastFinishedPulling="2025-12-03 14:32:30.873146186 +0000 UTC m=+1157.861501403" observedRunningTime="2025-12-03 14:32:31.342102878 +0000 UTC m=+1158.330458105" watchObservedRunningTime="2025-12-03 14:32:31.348453213 +0000 UTC m=+1158.336808430" Dec 03 14:32:32 crc kubenswrapper[4751]: I1203 14:32:32.294076 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-ppb75" Dec 03 14:32:32 crc kubenswrapper[4751]: I1203 14:32:32.871158 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7f9b9ccb84-v5t4x" Dec 03 14:32:33 crc kubenswrapper[4751]: I1203 14:32:33.296147 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4thwz" event={"ID":"ac5fb8ca-3372-4c92-a4d2-9ff4b543f94d","Type":"ContainerStarted","Data":"7b657640a7807ce4db6bddc92cfd683e207118f0b5355a60a3e5b1f42ff366bf"} Dec 03 14:32:33 crc kubenswrapper[4751]: I1203 14:32:33.314934 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4thwz" podStartSLOduration=2.471754072 podStartE2EDuration="43.314910222s" podCreationTimestamp="2025-12-03 14:31:50 +0000 UTC" firstStartedPulling="2025-12-03 14:31:51.898589007 +0000 UTC m=+1118.886944214" lastFinishedPulling="2025-12-03 14:32:32.741745147 +0000 UTC m=+1159.730100364" 
observedRunningTime="2025-12-03 14:32:33.31159334 +0000 UTC m=+1160.299948567" watchObservedRunningTime="2025-12-03 14:32:33.314910222 +0000 UTC m=+1160.303265479" Dec 03 14:32:36 crc kubenswrapper[4751]: I1203 14:32:36.402528 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b" Dec 03 14:32:40 crc kubenswrapper[4751]: I1203 14:32:40.015373 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-7422h" Dec 03 14:32:40 crc kubenswrapper[4751]: I1203 14:32:40.119760 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-6shnx" Dec 03 14:32:40 crc kubenswrapper[4751]: I1203 14:32:40.233240 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8hhdk" Dec 03 14:32:40 crc kubenswrapper[4751]: I1203 14:32:40.346970 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qswcx" Dec 03 14:32:40 crc kubenswrapper[4751]: I1203 14:32:40.432735 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-wgjr8" Dec 03 14:32:40 crc kubenswrapper[4751]: I1203 14:32:40.545619 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4nvpk" Dec 03 14:32:40 crc kubenswrapper[4751]: I1203 14:32:40.610915 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-sxl98" Dec 03 14:32:40 crc kubenswrapper[4751]: I1203 14:32:40.664105 4751 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7tsfx" Dec 03 14:32:40 crc kubenswrapper[4751]: I1203 14:32:40.708015 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-8h62v" Dec 03 14:32:40 crc kubenswrapper[4751]: I1203 14:32:40.827923 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-k5j8b" Dec 03 14:32:40 crc kubenswrapper[4751]: I1203 14:32:40.893714 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-gsp46" Dec 03 14:32:40 crc kubenswrapper[4751]: I1203 14:32:40.949552 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-9598fff97-l6gxb" Dec 03 14:32:56 crc kubenswrapper[4751]: I1203 14:32:56.132489 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-w7xrr"] Dec 03 14:32:56 crc kubenswrapper[4751]: I1203 14:32:56.134719 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-w7xrr" Dec 03 14:32:56 crc kubenswrapper[4751]: I1203 14:32:56.136315 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 03 14:32:56 crc kubenswrapper[4751]: I1203 14:32:56.136468 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 03 14:32:56 crc kubenswrapper[4751]: I1203 14:32:56.136548 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 03 14:32:56 crc kubenswrapper[4751]: I1203 14:32:56.136702 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-5nwd5" Dec 03 14:32:56 crc kubenswrapper[4751]: I1203 14:32:56.157988 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-w7xrr"] Dec 03 14:32:56 crc kubenswrapper[4751]: I1203 14:32:56.237349 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7b6hx"] Dec 03 14:32:56 crc kubenswrapper[4751]: I1203 14:32:56.238940 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-7b6hx" Dec 03 14:32:56 crc kubenswrapper[4751]: I1203 14:32:56.249620 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7b6hx"] Dec 03 14:32:56 crc kubenswrapper[4751]: I1203 14:32:56.251048 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 03 14:32:56 crc kubenswrapper[4751]: I1203 14:32:56.317440 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/064f0ca5-6cc9-414a-8c00-b7cd04c897e6-config\") pod \"dnsmasq-dns-675f4bcbfc-w7xrr\" (UID: \"064f0ca5-6cc9-414a-8c00-b7cd04c897e6\") " pod="openstack/dnsmasq-dns-675f4bcbfc-w7xrr" Dec 03 14:32:56 crc kubenswrapper[4751]: I1203 14:32:56.317588 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x92zn\" (UniqueName: \"kubernetes.io/projected/064f0ca5-6cc9-414a-8c00-b7cd04c897e6-kube-api-access-x92zn\") pod \"dnsmasq-dns-675f4bcbfc-w7xrr\" (UID: \"064f0ca5-6cc9-414a-8c00-b7cd04c897e6\") " pod="openstack/dnsmasq-dns-675f4bcbfc-w7xrr" Dec 03 14:32:56 crc kubenswrapper[4751]: I1203 14:32:56.419372 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x92zn\" (UniqueName: \"kubernetes.io/projected/064f0ca5-6cc9-414a-8c00-b7cd04c897e6-kube-api-access-x92zn\") pod \"dnsmasq-dns-675f4bcbfc-w7xrr\" (UID: \"064f0ca5-6cc9-414a-8c00-b7cd04c897e6\") " pod="openstack/dnsmasq-dns-675f4bcbfc-w7xrr" Dec 03 14:32:56 crc kubenswrapper[4751]: I1203 14:32:56.419448 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f787117-ce7c-4015-9c05-c682ca5db682-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-7b6hx\" (UID: \"9f787117-ce7c-4015-9c05-c682ca5db682\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7b6hx" Dec 
03 14:32:56 crc kubenswrapper[4751]: I1203 14:32:56.419472 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f787117-ce7c-4015-9c05-c682ca5db682-config\") pod \"dnsmasq-dns-78dd6ddcc-7b6hx\" (UID: \"9f787117-ce7c-4015-9c05-c682ca5db682\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7b6hx" Dec 03 14:32:56 crc kubenswrapper[4751]: I1203 14:32:56.419518 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6qgq\" (UniqueName: \"kubernetes.io/projected/9f787117-ce7c-4015-9c05-c682ca5db682-kube-api-access-c6qgq\") pod \"dnsmasq-dns-78dd6ddcc-7b6hx\" (UID: \"9f787117-ce7c-4015-9c05-c682ca5db682\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7b6hx" Dec 03 14:32:56 crc kubenswrapper[4751]: I1203 14:32:56.419549 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/064f0ca5-6cc9-414a-8c00-b7cd04c897e6-config\") pod \"dnsmasq-dns-675f4bcbfc-w7xrr\" (UID: \"064f0ca5-6cc9-414a-8c00-b7cd04c897e6\") " pod="openstack/dnsmasq-dns-675f4bcbfc-w7xrr" Dec 03 14:32:56 crc kubenswrapper[4751]: I1203 14:32:56.420384 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/064f0ca5-6cc9-414a-8c00-b7cd04c897e6-config\") pod \"dnsmasq-dns-675f4bcbfc-w7xrr\" (UID: \"064f0ca5-6cc9-414a-8c00-b7cd04c897e6\") " pod="openstack/dnsmasq-dns-675f4bcbfc-w7xrr" Dec 03 14:32:56 crc kubenswrapper[4751]: I1203 14:32:56.442082 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x92zn\" (UniqueName: \"kubernetes.io/projected/064f0ca5-6cc9-414a-8c00-b7cd04c897e6-kube-api-access-x92zn\") pod \"dnsmasq-dns-675f4bcbfc-w7xrr\" (UID: \"064f0ca5-6cc9-414a-8c00-b7cd04c897e6\") " pod="openstack/dnsmasq-dns-675f4bcbfc-w7xrr" Dec 03 14:32:56 crc kubenswrapper[4751]: I1203 
14:32:56.458683 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-w7xrr" Dec 03 14:32:56 crc kubenswrapper[4751]: I1203 14:32:56.520893 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f787117-ce7c-4015-9c05-c682ca5db682-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-7b6hx\" (UID: \"9f787117-ce7c-4015-9c05-c682ca5db682\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7b6hx" Dec 03 14:32:56 crc kubenswrapper[4751]: I1203 14:32:56.520953 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f787117-ce7c-4015-9c05-c682ca5db682-config\") pod \"dnsmasq-dns-78dd6ddcc-7b6hx\" (UID: \"9f787117-ce7c-4015-9c05-c682ca5db682\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7b6hx" Dec 03 14:32:56 crc kubenswrapper[4751]: I1203 14:32:56.521009 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6qgq\" (UniqueName: \"kubernetes.io/projected/9f787117-ce7c-4015-9c05-c682ca5db682-kube-api-access-c6qgq\") pod \"dnsmasq-dns-78dd6ddcc-7b6hx\" (UID: \"9f787117-ce7c-4015-9c05-c682ca5db682\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7b6hx" Dec 03 14:32:56 crc kubenswrapper[4751]: I1203 14:32:56.522543 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f787117-ce7c-4015-9c05-c682ca5db682-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-7b6hx\" (UID: \"9f787117-ce7c-4015-9c05-c682ca5db682\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7b6hx" Dec 03 14:32:56 crc kubenswrapper[4751]: I1203 14:32:56.523219 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f787117-ce7c-4015-9c05-c682ca5db682-config\") pod \"dnsmasq-dns-78dd6ddcc-7b6hx\" (UID: \"9f787117-ce7c-4015-9c05-c682ca5db682\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-7b6hx" Dec 03 14:32:56 crc kubenswrapper[4751]: I1203 14:32:56.538128 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6qgq\" (UniqueName: \"kubernetes.io/projected/9f787117-ce7c-4015-9c05-c682ca5db682-kube-api-access-c6qgq\") pod \"dnsmasq-dns-78dd6ddcc-7b6hx\" (UID: \"9f787117-ce7c-4015-9c05-c682ca5db682\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7b6hx" Dec 03 14:32:56 crc kubenswrapper[4751]: I1203 14:32:56.603837 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-7b6hx" Dec 03 14:32:56 crc kubenswrapper[4751]: I1203 14:32:56.896858 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-w7xrr"] Dec 03 14:32:57 crc kubenswrapper[4751]: I1203 14:32:57.031034 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7b6hx"] Dec 03 14:32:57 crc kubenswrapper[4751]: W1203 14:32:57.033884 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f787117_ce7c_4015_9c05_c682ca5db682.slice/crio-34d6d25b397f915ce7257f74940b7adb86160a32c87aebb6f79602401b065141 WatchSource:0}: Error finding container 34d6d25b397f915ce7257f74940b7adb86160a32c87aebb6f79602401b065141: Status 404 returned error can't find the container with id 34d6d25b397f915ce7257f74940b7adb86160a32c87aebb6f79602401b065141 Dec 03 14:32:57 crc kubenswrapper[4751]: I1203 14:32:57.485344 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-7b6hx" event={"ID":"9f787117-ce7c-4015-9c05-c682ca5db682","Type":"ContainerStarted","Data":"34d6d25b397f915ce7257f74940b7adb86160a32c87aebb6f79602401b065141"} Dec 03 14:32:57 crc kubenswrapper[4751]: I1203 14:32:57.486147 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-w7xrr" 
event={"ID":"064f0ca5-6cc9-414a-8c00-b7cd04c897e6","Type":"ContainerStarted","Data":"ad557b0120ebe400593e7008dcf85cd657fa8a2af84c69643522562fdbec9621"} Dec 03 14:32:59 crc kubenswrapper[4751]: I1203 14:32:59.270385 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-w7xrr"] Dec 03 14:32:59 crc kubenswrapper[4751]: I1203 14:32:59.350130 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-lv6pt"] Dec 03 14:32:59 crc kubenswrapper[4751]: I1203 14:32:59.351346 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-lv6pt" Dec 03 14:32:59 crc kubenswrapper[4751]: I1203 14:32:59.358284 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-lv6pt"] Dec 03 14:32:59 crc kubenswrapper[4751]: I1203 14:32:59.470761 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72619134-e92c-4c33-8e15-c01c0774d887-config\") pod \"dnsmasq-dns-666b6646f7-lv6pt\" (UID: \"72619134-e92c-4c33-8e15-c01c0774d887\") " pod="openstack/dnsmasq-dns-666b6646f7-lv6pt" Dec 03 14:32:59 crc kubenswrapper[4751]: I1203 14:32:59.470796 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/72619134-e92c-4c33-8e15-c01c0774d887-dns-svc\") pod \"dnsmasq-dns-666b6646f7-lv6pt\" (UID: \"72619134-e92c-4c33-8e15-c01c0774d887\") " pod="openstack/dnsmasq-dns-666b6646f7-lv6pt" Dec 03 14:32:59 crc kubenswrapper[4751]: I1203 14:32:59.470822 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2rts\" (UniqueName: \"kubernetes.io/projected/72619134-e92c-4c33-8e15-c01c0774d887-kube-api-access-m2rts\") pod \"dnsmasq-dns-666b6646f7-lv6pt\" (UID: \"72619134-e92c-4c33-8e15-c01c0774d887\") " 
pod="openstack/dnsmasq-dns-666b6646f7-lv6pt" Dec 03 14:32:59 crc kubenswrapper[4751]: I1203 14:32:59.571751 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72619134-e92c-4c33-8e15-c01c0774d887-config\") pod \"dnsmasq-dns-666b6646f7-lv6pt\" (UID: \"72619134-e92c-4c33-8e15-c01c0774d887\") " pod="openstack/dnsmasq-dns-666b6646f7-lv6pt" Dec 03 14:32:59 crc kubenswrapper[4751]: I1203 14:32:59.571816 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/72619134-e92c-4c33-8e15-c01c0774d887-dns-svc\") pod \"dnsmasq-dns-666b6646f7-lv6pt\" (UID: \"72619134-e92c-4c33-8e15-c01c0774d887\") " pod="openstack/dnsmasq-dns-666b6646f7-lv6pt" Dec 03 14:32:59 crc kubenswrapper[4751]: I1203 14:32:59.571851 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2rts\" (UniqueName: \"kubernetes.io/projected/72619134-e92c-4c33-8e15-c01c0774d887-kube-api-access-m2rts\") pod \"dnsmasq-dns-666b6646f7-lv6pt\" (UID: \"72619134-e92c-4c33-8e15-c01c0774d887\") " pod="openstack/dnsmasq-dns-666b6646f7-lv6pt" Dec 03 14:32:59 crc kubenswrapper[4751]: I1203 14:32:59.573148 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72619134-e92c-4c33-8e15-c01c0774d887-config\") pod \"dnsmasq-dns-666b6646f7-lv6pt\" (UID: \"72619134-e92c-4c33-8e15-c01c0774d887\") " pod="openstack/dnsmasq-dns-666b6646f7-lv6pt" Dec 03 14:32:59 crc kubenswrapper[4751]: I1203 14:32:59.573833 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/72619134-e92c-4c33-8e15-c01c0774d887-dns-svc\") pod \"dnsmasq-dns-666b6646f7-lv6pt\" (UID: \"72619134-e92c-4c33-8e15-c01c0774d887\") " pod="openstack/dnsmasq-dns-666b6646f7-lv6pt" Dec 03 14:32:59 crc kubenswrapper[4751]: I1203 14:32:59.624360 4751 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2rts\" (UniqueName: \"kubernetes.io/projected/72619134-e92c-4c33-8e15-c01c0774d887-kube-api-access-m2rts\") pod \"dnsmasq-dns-666b6646f7-lv6pt\" (UID: \"72619134-e92c-4c33-8e15-c01c0774d887\") " pod="openstack/dnsmasq-dns-666b6646f7-lv6pt" Dec 03 14:32:59 crc kubenswrapper[4751]: I1203 14:32:59.682270 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-lv6pt" Dec 03 14:32:59 crc kubenswrapper[4751]: I1203 14:32:59.707541 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7b6hx"] Dec 03 14:32:59 crc kubenswrapper[4751]: I1203 14:32:59.753946 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ntkbm"] Dec 03 14:32:59 crc kubenswrapper[4751]: I1203 14:32:59.755573 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ntkbm" Dec 03 14:32:59 crc kubenswrapper[4751]: I1203 14:32:59.765092 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ntkbm"] Dec 03 14:32:59 crc kubenswrapper[4751]: I1203 14:32:59.876503 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab23361b-079c-45ea-84a1-ee39f33d8578-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-ntkbm\" (UID: \"ab23361b-079c-45ea-84a1-ee39f33d8578\") " pod="openstack/dnsmasq-dns-57d769cc4f-ntkbm" Dec 03 14:32:59 crc kubenswrapper[4751]: I1203 14:32:59.876668 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c84xt\" (UniqueName: \"kubernetes.io/projected/ab23361b-079c-45ea-84a1-ee39f33d8578-kube-api-access-c84xt\") pod \"dnsmasq-dns-57d769cc4f-ntkbm\" (UID: \"ab23361b-079c-45ea-84a1-ee39f33d8578\") " pod="openstack/dnsmasq-dns-57d769cc4f-ntkbm" 
Dec 03 14:32:59 crc kubenswrapper[4751]: I1203 14:32:59.876712 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab23361b-079c-45ea-84a1-ee39f33d8578-config\") pod \"dnsmasq-dns-57d769cc4f-ntkbm\" (UID: \"ab23361b-079c-45ea-84a1-ee39f33d8578\") " pod="openstack/dnsmasq-dns-57d769cc4f-ntkbm" Dec 03 14:32:59 crc kubenswrapper[4751]: I1203 14:32:59.978431 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c84xt\" (UniqueName: \"kubernetes.io/projected/ab23361b-079c-45ea-84a1-ee39f33d8578-kube-api-access-c84xt\") pod \"dnsmasq-dns-57d769cc4f-ntkbm\" (UID: \"ab23361b-079c-45ea-84a1-ee39f33d8578\") " pod="openstack/dnsmasq-dns-57d769cc4f-ntkbm" Dec 03 14:32:59 crc kubenswrapper[4751]: I1203 14:32:59.978493 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab23361b-079c-45ea-84a1-ee39f33d8578-config\") pod \"dnsmasq-dns-57d769cc4f-ntkbm\" (UID: \"ab23361b-079c-45ea-84a1-ee39f33d8578\") " pod="openstack/dnsmasq-dns-57d769cc4f-ntkbm" Dec 03 14:32:59 crc kubenswrapper[4751]: I1203 14:32:59.978541 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab23361b-079c-45ea-84a1-ee39f33d8578-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-ntkbm\" (UID: \"ab23361b-079c-45ea-84a1-ee39f33d8578\") " pod="openstack/dnsmasq-dns-57d769cc4f-ntkbm" Dec 03 14:32:59 crc kubenswrapper[4751]: I1203 14:32:59.980179 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab23361b-079c-45ea-84a1-ee39f33d8578-config\") pod \"dnsmasq-dns-57d769cc4f-ntkbm\" (UID: \"ab23361b-079c-45ea-84a1-ee39f33d8578\") " pod="openstack/dnsmasq-dns-57d769cc4f-ntkbm" Dec 03 14:32:59 crc kubenswrapper[4751]: I1203 14:32:59.981106 4751 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab23361b-079c-45ea-84a1-ee39f33d8578-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-ntkbm\" (UID: \"ab23361b-079c-45ea-84a1-ee39f33d8578\") " pod="openstack/dnsmasq-dns-57d769cc4f-ntkbm" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.014550 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c84xt\" (UniqueName: \"kubernetes.io/projected/ab23361b-079c-45ea-84a1-ee39f33d8578-kube-api-access-c84xt\") pod \"dnsmasq-dns-57d769cc4f-ntkbm\" (UID: \"ab23361b-079c-45ea-84a1-ee39f33d8578\") " pod="openstack/dnsmasq-dns-57d769cc4f-ntkbm" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.152838 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ntkbm" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.362303 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-lv6pt"] Dec 03 14:33:00 crc kubenswrapper[4751]: W1203 14:33:00.389915 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72619134_e92c_4c33_8e15_c01c0774d887.slice/crio-b65eba5277f39ba60d000b7426feed8fcc62feae56b8e4501b64956cf239ae4e WatchSource:0}: Error finding container b65eba5277f39ba60d000b7426feed8fcc62feae56b8e4501b64956cf239ae4e: Status 404 returned error can't find the container with id b65eba5277f39ba60d000b7426feed8fcc62feae56b8e4501b64956cf239ae4e Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.518477 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.519858 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.524116 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.524722 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.525423 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.525506 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-4kzr6" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.525771 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.528927 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.530386 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.530541 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.540579 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-lv6pt" event={"ID":"72619134-e92c-4c33-8e15-c01c0774d887","Type":"ContainerStarted","Data":"b65eba5277f39ba60d000b7426feed8fcc62feae56b8e4501b64956cf239ae4e"} Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.688352 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/47b63367-ad69-4428-9c79-8eee86b817ac-server-conf\") pod \"rabbitmq-server-0\" 
(UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.688395 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz4z2\" (UniqueName: \"kubernetes.io/projected/47b63367-ad69-4428-9c79-8eee86b817ac-kube-api-access-bz4z2\") pod \"rabbitmq-server-0\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.688458 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-303f7ec4-fbf6-4051-991e-c91365045b60\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-303f7ec4-fbf6-4051-991e-c91365045b60\") pod \"rabbitmq-server-0\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.688551 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/47b63367-ad69-4428-9c79-8eee86b817ac-config-data\") pod \"rabbitmq-server-0\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.688583 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/47b63367-ad69-4428-9c79-8eee86b817ac-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.688634 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/47b63367-ad69-4428-9c79-8eee86b817ac-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"47b63367-ad69-4428-9c79-8eee86b817ac\") " pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.688656 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/47b63367-ad69-4428-9c79-8eee86b817ac-pod-info\") pod \"rabbitmq-server-0\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.688672 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/47b63367-ad69-4428-9c79-8eee86b817ac-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.688716 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/47b63367-ad69-4428-9c79-8eee86b817ac-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.688739 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/47b63367-ad69-4428-9c79-8eee86b817ac-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.688796 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/47b63367-ad69-4428-9c79-8eee86b817ac-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " 
pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.690853 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ntkbm"] Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.789923 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/47b63367-ad69-4428-9c79-8eee86b817ac-config-data\") pod \"rabbitmq-server-0\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.790378 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/47b63367-ad69-4428-9c79-8eee86b817ac-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.790541 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/47b63367-ad69-4428-9c79-8eee86b817ac-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.790594 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/47b63367-ad69-4428-9c79-8eee86b817ac-pod-info\") pod \"rabbitmq-server-0\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.790626 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/47b63367-ad69-4428-9c79-8eee86b817ac-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " 
pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.790670 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/47b63367-ad69-4428-9c79-8eee86b817ac-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.790722 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/47b63367-ad69-4428-9c79-8eee86b817ac-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.790764 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/47b63367-ad69-4428-9c79-8eee86b817ac-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.790807 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/47b63367-ad69-4428-9c79-8eee86b817ac-server-conf\") pod \"rabbitmq-server-0\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.790849 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz4z2\" (UniqueName: \"kubernetes.io/projected/47b63367-ad69-4428-9c79-8eee86b817ac-kube-api-access-bz4z2\") pod \"rabbitmq-server-0\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.790910 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-303f7ec4-fbf6-4051-991e-c91365045b60\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-303f7ec4-fbf6-4051-991e-c91365045b60\") pod \"rabbitmq-server-0\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.791140 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/47b63367-ad69-4428-9c79-8eee86b817ac-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.791179 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/47b63367-ad69-4428-9c79-8eee86b817ac-config-data\") pod \"rabbitmq-server-0\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.792222 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/47b63367-ad69-4428-9c79-8eee86b817ac-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.793115 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/47b63367-ad69-4428-9c79-8eee86b817ac-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.793804 4751 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.793907 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-303f7ec4-fbf6-4051-991e-c91365045b60\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-303f7ec4-fbf6-4051-991e-c91365045b60\") pod \"rabbitmq-server-0\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/db3758ae0881ee46f2f95476f1e867b818829daba4b9b83c7181ebfa4809f516/globalmount\"" pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.795742 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/47b63367-ad69-4428-9c79-8eee86b817ac-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.798670 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/47b63367-ad69-4428-9c79-8eee86b817ac-server-conf\") pod \"rabbitmq-server-0\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.799422 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/47b63367-ad69-4428-9c79-8eee86b817ac-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.800166 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/47b63367-ad69-4428-9c79-8eee86b817ac-pod-info\") pod \"rabbitmq-server-0\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " 
pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.815292 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/47b63367-ad69-4428-9c79-8eee86b817ac-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.822853 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz4z2\" (UniqueName: \"kubernetes.io/projected/47b63367-ad69-4428-9c79-8eee86b817ac-kube-api-access-bz4z2\") pod \"rabbitmq-server-0\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.836371 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.840299 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-303f7ec4-fbf6-4051-991e-c91365045b60\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-303f7ec4-fbf6-4051-991e-c91365045b60\") pod \"rabbitmq-server-0\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.840747 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.846622 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.846843 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.847086 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.847256 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-df6ct" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.847440 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.848918 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.851025 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.866080 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 03 14:33:00 crc kubenswrapper[4751]: I1203 14:33:00.879749 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.006530 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e1da8e9b-0799-4327-9e24-216c4a51fde2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.006586 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e1da8e9b-0799-4327-9e24-216c4a51fde2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.006636 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e1da8e9b-0799-4327-9e24-216c4a51fde2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.006668 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e1da8e9b-0799-4327-9e24-216c4a51fde2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: 
I1203 14:33:01.006690 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e1da8e9b-0799-4327-9e24-216c4a51fde2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.006734 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e1da8e9b-0799-4327-9e24-216c4a51fde2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.006761 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e1da8e9b-0799-4327-9e24-216c4a51fde2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.006879 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-cd51951f-d36a-49ac-969d-1444603d75a6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd51951f-d36a-49ac-969d-1444603d75a6\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.006955 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn2xq\" (UniqueName: \"kubernetes.io/projected/e1da8e9b-0799-4327-9e24-216c4a51fde2-kube-api-access-zn2xq\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.006985 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e1da8e9b-0799-4327-9e24-216c4a51fde2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.007013 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e1da8e9b-0799-4327-9e24-216c4a51fde2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.109885 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e1da8e9b-0799-4327-9e24-216c4a51fde2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.110998 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e1da8e9b-0799-4327-9e24-216c4a51fde2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.111045 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-cd51951f-d36a-49ac-969d-1444603d75a6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd51951f-d36a-49ac-969d-1444603d75a6\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.111646 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e1da8e9b-0799-4327-9e24-216c4a51fde2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.112259 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn2xq\" (UniqueName: \"kubernetes.io/projected/e1da8e9b-0799-4327-9e24-216c4a51fde2-kube-api-access-zn2xq\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.112394 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e1da8e9b-0799-4327-9e24-216c4a51fde2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.112487 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e1da8e9b-0799-4327-9e24-216c4a51fde2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.112555 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e1da8e9b-0799-4327-9e24-216c4a51fde2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.112598 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e1da8e9b-0799-4327-9e24-216c4a51fde2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.112702 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e1da8e9b-0799-4327-9e24-216c4a51fde2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.113750 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e1da8e9b-0799-4327-9e24-216c4a51fde2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.113785 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e1da8e9b-0799-4327-9e24-216c4a51fde2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.114710 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e1da8e9b-0799-4327-9e24-216c4a51fde2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.115049 4751 csi_attacher.go:380] kubernetes.io/csi: 
attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.115116 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-cd51951f-d36a-49ac-969d-1444603d75a6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd51951f-d36a-49ac-969d-1444603d75a6\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ee16707d3681dd52a4653a253214830682d6f3bd45ca60a3e117a974c1854fca/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.115147 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e1da8e9b-0799-4327-9e24-216c4a51fde2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.116611 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e1da8e9b-0799-4327-9e24-216c4a51fde2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.116986 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e1da8e9b-0799-4327-9e24-216c4a51fde2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.117126 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/e1da8e9b-0799-4327-9e24-216c4a51fde2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.117285 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e1da8e9b-0799-4327-9e24-216c4a51fde2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.118874 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e1da8e9b-0799-4327-9e24-216c4a51fde2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.128262 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e1da8e9b-0799-4327-9e24-216c4a51fde2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.131397 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn2xq\" (UniqueName: \"kubernetes.io/projected/e1da8e9b-0799-4327-9e24-216c4a51fde2-kube-api-access-zn2xq\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.148676 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-cd51951f-d36a-49ac-969d-1444603d75a6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd51951f-d36a-49ac-969d-1444603d75a6\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:01 crc kubenswrapper[4751]: I1203 14:33:01.210963 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.097758 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.107044 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.109437 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.109929 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.110668 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.113015 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-vxp7v" Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.116752 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.118971 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.231993 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a45965be-01f0-4c6d-9db8-08b5e5564c5a-kolla-config\") pod \"openstack-galera-0\" (UID: \"a45965be-01f0-4c6d-9db8-08b5e5564c5a\") " 
pod="openstack/openstack-galera-0" Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.232059 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a45965be-01f0-4c6d-9db8-08b5e5564c5a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a45965be-01f0-4c6d-9db8-08b5e5564c5a\") " pod="openstack/openstack-galera-0" Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.232171 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a45965be-01f0-4c6d-9db8-08b5e5564c5a-config-data-default\") pod \"openstack-galera-0\" (UID: \"a45965be-01f0-4c6d-9db8-08b5e5564c5a\") " pod="openstack/openstack-galera-0" Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.232242 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ad0999c2-8295-435d-aa20-40875f1e5b1c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ad0999c2-8295-435d-aa20-40875f1e5b1c\") pod \"openstack-galera-0\" (UID: \"a45965be-01f0-4c6d-9db8-08b5e5564c5a\") " pod="openstack/openstack-galera-0" Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.232338 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a45965be-01f0-4c6d-9db8-08b5e5564c5a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a45965be-01f0-4c6d-9db8-08b5e5564c5a\") " pod="openstack/openstack-galera-0" Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.232470 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz7v2\" (UniqueName: \"kubernetes.io/projected/a45965be-01f0-4c6d-9db8-08b5e5564c5a-kube-api-access-fz7v2\") pod \"openstack-galera-0\" (UID: 
\"a45965be-01f0-4c6d-9db8-08b5e5564c5a\") " pod="openstack/openstack-galera-0" Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.232813 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a45965be-01f0-4c6d-9db8-08b5e5564c5a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a45965be-01f0-4c6d-9db8-08b5e5564c5a\") " pod="openstack/openstack-galera-0" Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.232846 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a45965be-01f0-4c6d-9db8-08b5e5564c5a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a45965be-01f0-4c6d-9db8-08b5e5564c5a\") " pod="openstack/openstack-galera-0" Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.334305 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a45965be-01f0-4c6d-9db8-08b5e5564c5a-config-data-default\") pod \"openstack-galera-0\" (UID: \"a45965be-01f0-4c6d-9db8-08b5e5564c5a\") " pod="openstack/openstack-galera-0" Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.334377 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ad0999c2-8295-435d-aa20-40875f1e5b1c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ad0999c2-8295-435d-aa20-40875f1e5b1c\") pod \"openstack-galera-0\" (UID: \"a45965be-01f0-4c6d-9db8-08b5e5564c5a\") " pod="openstack/openstack-galera-0" Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.334397 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a45965be-01f0-4c6d-9db8-08b5e5564c5a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a45965be-01f0-4c6d-9db8-08b5e5564c5a\") " 
pod="openstack/openstack-galera-0" Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.334428 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz7v2\" (UniqueName: \"kubernetes.io/projected/a45965be-01f0-4c6d-9db8-08b5e5564c5a-kube-api-access-fz7v2\") pod \"openstack-galera-0\" (UID: \"a45965be-01f0-4c6d-9db8-08b5e5564c5a\") " pod="openstack/openstack-galera-0" Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.334462 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a45965be-01f0-4c6d-9db8-08b5e5564c5a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a45965be-01f0-4c6d-9db8-08b5e5564c5a\") " pod="openstack/openstack-galera-0" Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.334481 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a45965be-01f0-4c6d-9db8-08b5e5564c5a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a45965be-01f0-4c6d-9db8-08b5e5564c5a\") " pod="openstack/openstack-galera-0" Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.334501 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a45965be-01f0-4c6d-9db8-08b5e5564c5a-kolla-config\") pod \"openstack-galera-0\" (UID: \"a45965be-01f0-4c6d-9db8-08b5e5564c5a\") " pod="openstack/openstack-galera-0" Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.334518 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a45965be-01f0-4c6d-9db8-08b5e5564c5a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a45965be-01f0-4c6d-9db8-08b5e5564c5a\") " pod="openstack/openstack-galera-0" Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.335801 4751 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a45965be-01f0-4c6d-9db8-08b5e5564c5a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a45965be-01f0-4c6d-9db8-08b5e5564c5a\") " pod="openstack/openstack-galera-0" Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.336507 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a45965be-01f0-4c6d-9db8-08b5e5564c5a-config-data-default\") pod \"openstack-galera-0\" (UID: \"a45965be-01f0-4c6d-9db8-08b5e5564c5a\") " pod="openstack/openstack-galera-0" Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.337804 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a45965be-01f0-4c6d-9db8-08b5e5564c5a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a45965be-01f0-4c6d-9db8-08b5e5564c5a\") " pod="openstack/openstack-galera-0" Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.341418 4751 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.341454 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a45965be-01f0-4c6d-9db8-08b5e5564c5a-kolla-config\") pod \"openstack-galera-0\" (UID: \"a45965be-01f0-4c6d-9db8-08b5e5564c5a\") " pod="openstack/openstack-galera-0" Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.341465 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ad0999c2-8295-435d-aa20-40875f1e5b1c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ad0999c2-8295-435d-aa20-40875f1e5b1c\") pod \"openstack-galera-0\" (UID: \"a45965be-01f0-4c6d-9db8-08b5e5564c5a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3c6667fd376e2d22e879b54492f2fbdb5877e367336a63c989dea540e17df71d/globalmount\"" pod="openstack/openstack-galera-0" Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.356641 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz7v2\" (UniqueName: \"kubernetes.io/projected/a45965be-01f0-4c6d-9db8-08b5e5564c5a-kube-api-access-fz7v2\") pod \"openstack-galera-0\" (UID: \"a45965be-01f0-4c6d-9db8-08b5e5564c5a\") " pod="openstack/openstack-galera-0" Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.356740 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a45965be-01f0-4c6d-9db8-08b5e5564c5a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a45965be-01f0-4c6d-9db8-08b5e5564c5a\") " pod="openstack/openstack-galera-0" Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.357064 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a45965be-01f0-4c6d-9db8-08b5e5564c5a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"a45965be-01f0-4c6d-9db8-08b5e5564c5a\") " pod="openstack/openstack-galera-0" Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.390536 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ad0999c2-8295-435d-aa20-40875f1e5b1c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ad0999c2-8295-435d-aa20-40875f1e5b1c\") pod \"openstack-galera-0\" (UID: \"a45965be-01f0-4c6d-9db8-08b5e5564c5a\") " pod="openstack/openstack-galera-0" Dec 03 14:33:02 crc kubenswrapper[4751]: I1203 14:33:02.432931 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.507402 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.509560 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.512154 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.512881 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.513119 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.515072 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-2b4zr" Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.527551 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.678224 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fb88\" (UniqueName: \"kubernetes.io/projected/3dc63449-cac9-48bc-abb7-3ff350a408cf-kube-api-access-6fb88\") pod \"openstack-cell1-galera-0\" (UID: \"3dc63449-cac9-48bc-abb7-3ff350a408cf\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.678535 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dc63449-cac9-48bc-abb7-3ff350a408cf-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"3dc63449-cac9-48bc-abb7-3ff350a408cf\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.678559 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-28dc59fd-03bf-4a6d-8571-d474ef03ac00\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28dc59fd-03bf-4a6d-8571-d474ef03ac00\") pod \"openstack-cell1-galera-0\" (UID: \"3dc63449-cac9-48bc-abb7-3ff350a408cf\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.678590 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dc63449-cac9-48bc-abb7-3ff350a408cf-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"3dc63449-cac9-48bc-abb7-3ff350a408cf\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.678641 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3dc63449-cac9-48bc-abb7-3ff350a408cf-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"3dc63449-cac9-48bc-abb7-3ff350a408cf\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:33:03 crc 
kubenswrapper[4751]: I1203 14:33:03.678658 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3dc63449-cac9-48bc-abb7-3ff350a408cf-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"3dc63449-cac9-48bc-abb7-3ff350a408cf\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.678675 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3dc63449-cac9-48bc-abb7-3ff350a408cf-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"3dc63449-cac9-48bc-abb7-3ff350a408cf\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.678689 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3dc63449-cac9-48bc-abb7-3ff350a408cf-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"3dc63449-cac9-48bc-abb7-3ff350a408cf\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.779823 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3dc63449-cac9-48bc-abb7-3ff350a408cf-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"3dc63449-cac9-48bc-abb7-3ff350a408cf\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.779869 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3dc63449-cac9-48bc-abb7-3ff350a408cf-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"3dc63449-cac9-48bc-abb7-3ff350a408cf\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:33:03 crc 
kubenswrapper[4751]: I1203 14:33:03.779909 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3dc63449-cac9-48bc-abb7-3ff350a408cf-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"3dc63449-cac9-48bc-abb7-3ff350a408cf\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.779927 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3dc63449-cac9-48bc-abb7-3ff350a408cf-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"3dc63449-cac9-48bc-abb7-3ff350a408cf\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.780478 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3dc63449-cac9-48bc-abb7-3ff350a408cf-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"3dc63449-cac9-48bc-abb7-3ff350a408cf\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.780685 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3dc63449-cac9-48bc-abb7-3ff350a408cf-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"3dc63449-cac9-48bc-abb7-3ff350a408cf\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.780561 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fb88\" (UniqueName: \"kubernetes.io/projected/3dc63449-cac9-48bc-abb7-3ff350a408cf-kube-api-access-6fb88\") pod \"openstack-cell1-galera-0\" (UID: \"3dc63449-cac9-48bc-abb7-3ff350a408cf\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.780853 4751 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dc63449-cac9-48bc-abb7-3ff350a408cf-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"3dc63449-cac9-48bc-abb7-3ff350a408cf\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.780876 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-28dc59fd-03bf-4a6d-8571-d474ef03ac00\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28dc59fd-03bf-4a6d-8571-d474ef03ac00\") pod \"openstack-cell1-galera-0\" (UID: \"3dc63449-cac9-48bc-abb7-3ff350a408cf\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.780903 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dc63449-cac9-48bc-abb7-3ff350a408cf-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"3dc63449-cac9-48bc-abb7-3ff350a408cf\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.781001 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3dc63449-cac9-48bc-abb7-3ff350a408cf-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"3dc63449-cac9-48bc-abb7-3ff350a408cf\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.781769 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3dc63449-cac9-48bc-abb7-3ff350a408cf-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"3dc63449-cac9-48bc-abb7-3ff350a408cf\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.787581 4751 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dc63449-cac9-48bc-abb7-3ff350a408cf-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"3dc63449-cac9-48bc-abb7-3ff350a408cf\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.789230 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dc63449-cac9-48bc-abb7-3ff350a408cf-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"3dc63449-cac9-48bc-abb7-3ff350a408cf\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.798816 4751 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.798876 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-28dc59fd-03bf-4a6d-8571-d474ef03ac00\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28dc59fd-03bf-4a6d-8571-d474ef03ac00\") pod \"openstack-cell1-galera-0\" (UID: \"3dc63449-cac9-48bc-abb7-3ff350a408cf\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/163bf00e2c10ef6ee65265f1779846f4e47cb7cbef3141a019febf9ae2e290d8/globalmount\"" pod="openstack/openstack-cell1-galera-0" Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.803175 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fb88\" (UniqueName: \"kubernetes.io/projected/3dc63449-cac9-48bc-abb7-3ff350a408cf-kube-api-access-6fb88\") pod \"openstack-cell1-galera-0\" (UID: \"3dc63449-cac9-48bc-abb7-3ff350a408cf\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.832951 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-28dc59fd-03bf-4a6d-8571-d474ef03ac00\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28dc59fd-03bf-4a6d-8571-d474ef03ac00\") pod \"openstack-cell1-galera-0\" (UID: \"3dc63449-cac9-48bc-abb7-3ff350a408cf\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.931076 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.932163 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.938258 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-qz9x2" Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.938532 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.939063 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 03 14:33:03 crc kubenswrapper[4751]: I1203 14:33:03.958547 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 03 14:33:04 crc kubenswrapper[4751]: I1203 14:33:04.089666 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwgkv\" (UniqueName: \"kubernetes.io/projected/05d18e1b-04cd-4b4a-a728-bdbc9c2ab713-kube-api-access-gwgkv\") pod \"memcached-0\" (UID: \"05d18e1b-04cd-4b4a-a728-bdbc9c2ab713\") " pod="openstack/memcached-0" Dec 03 14:33:04 crc kubenswrapper[4751]: I1203 14:33:04.089727 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/05d18e1b-04cd-4b4a-a728-bdbc9c2ab713-kolla-config\") pod \"memcached-0\" (UID: \"05d18e1b-04cd-4b4a-a728-bdbc9c2ab713\") " pod="openstack/memcached-0" Dec 03 14:33:04 crc 
kubenswrapper[4751]: I1203 14:33:04.089757 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/05d18e1b-04cd-4b4a-a728-bdbc9c2ab713-memcached-tls-certs\") pod \"memcached-0\" (UID: \"05d18e1b-04cd-4b4a-a728-bdbc9c2ab713\") " pod="openstack/memcached-0" Dec 03 14:33:04 crc kubenswrapper[4751]: I1203 14:33:04.089815 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05d18e1b-04cd-4b4a-a728-bdbc9c2ab713-combined-ca-bundle\") pod \"memcached-0\" (UID: \"05d18e1b-04cd-4b4a-a728-bdbc9c2ab713\") " pod="openstack/memcached-0" Dec 03 14:33:04 crc kubenswrapper[4751]: I1203 14:33:04.089851 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/05d18e1b-04cd-4b4a-a728-bdbc9c2ab713-config-data\") pod \"memcached-0\" (UID: \"05d18e1b-04cd-4b4a-a728-bdbc9c2ab713\") " pod="openstack/memcached-0" Dec 03 14:33:04 crc kubenswrapper[4751]: I1203 14:33:04.127970 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 03 14:33:04 crc kubenswrapper[4751]: I1203 14:33:04.190943 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwgkv\" (UniqueName: \"kubernetes.io/projected/05d18e1b-04cd-4b4a-a728-bdbc9c2ab713-kube-api-access-gwgkv\") pod \"memcached-0\" (UID: \"05d18e1b-04cd-4b4a-a728-bdbc9c2ab713\") " pod="openstack/memcached-0" Dec 03 14:33:04 crc kubenswrapper[4751]: I1203 14:33:04.191008 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/05d18e1b-04cd-4b4a-a728-bdbc9c2ab713-kolla-config\") pod \"memcached-0\" (UID: \"05d18e1b-04cd-4b4a-a728-bdbc9c2ab713\") " pod="openstack/memcached-0" Dec 03 14:33:04 crc kubenswrapper[4751]: I1203 14:33:04.191064 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/05d18e1b-04cd-4b4a-a728-bdbc9c2ab713-memcached-tls-certs\") pod \"memcached-0\" (UID: \"05d18e1b-04cd-4b4a-a728-bdbc9c2ab713\") " pod="openstack/memcached-0" Dec 03 14:33:04 crc kubenswrapper[4751]: I1203 14:33:04.191114 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05d18e1b-04cd-4b4a-a728-bdbc9c2ab713-combined-ca-bundle\") pod \"memcached-0\" (UID: \"05d18e1b-04cd-4b4a-a728-bdbc9c2ab713\") " pod="openstack/memcached-0" Dec 03 14:33:04 crc kubenswrapper[4751]: I1203 14:33:04.191145 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/05d18e1b-04cd-4b4a-a728-bdbc9c2ab713-config-data\") pod \"memcached-0\" (UID: \"05d18e1b-04cd-4b4a-a728-bdbc9c2ab713\") " pod="openstack/memcached-0" Dec 03 14:33:04 crc kubenswrapper[4751]: I1203 14:33:04.192035 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/05d18e1b-04cd-4b4a-a728-bdbc9c2ab713-kolla-config\") pod \"memcached-0\" (UID: \"05d18e1b-04cd-4b4a-a728-bdbc9c2ab713\") " pod="openstack/memcached-0" Dec 03 14:33:04 crc kubenswrapper[4751]: I1203 14:33:04.192135 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/05d18e1b-04cd-4b4a-a728-bdbc9c2ab713-config-data\") pod \"memcached-0\" (UID: \"05d18e1b-04cd-4b4a-a728-bdbc9c2ab713\") " pod="openstack/memcached-0" Dec 03 14:33:04 crc kubenswrapper[4751]: I1203 14:33:04.194359 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/05d18e1b-04cd-4b4a-a728-bdbc9c2ab713-memcached-tls-certs\") pod \"memcached-0\" (UID: \"05d18e1b-04cd-4b4a-a728-bdbc9c2ab713\") " pod="openstack/memcached-0" Dec 03 14:33:04 crc kubenswrapper[4751]: I1203 14:33:04.195263 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05d18e1b-04cd-4b4a-a728-bdbc9c2ab713-combined-ca-bundle\") pod \"memcached-0\" (UID: \"05d18e1b-04cd-4b4a-a728-bdbc9c2ab713\") " pod="openstack/memcached-0" Dec 03 14:33:04 crc kubenswrapper[4751]: I1203 14:33:04.208780 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwgkv\" (UniqueName: \"kubernetes.io/projected/05d18e1b-04cd-4b4a-a728-bdbc9c2ab713-kube-api-access-gwgkv\") pod \"memcached-0\" (UID: \"05d18e1b-04cd-4b4a-a728-bdbc9c2ab713\") " pod="openstack/memcached-0" Dec 03 14:33:04 crc kubenswrapper[4751]: I1203 14:33:04.251921 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 03 14:33:04 crc kubenswrapper[4751]: W1203 14:33:04.386496 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab23361b_079c_45ea_84a1_ee39f33d8578.slice/crio-d6c8cd3f428edc104ed5ff84531d35174b9325d70d51e74cd95fe841bfb6e9b4 WatchSource:0}: Error finding container d6c8cd3f428edc104ed5ff84531d35174b9325d70d51e74cd95fe841bfb6e9b4: Status 404 returned error can't find the container with id d6c8cd3f428edc104ed5ff84531d35174b9325d70d51e74cd95fe841bfb6e9b4 Dec 03 14:33:04 crc kubenswrapper[4751]: I1203 14:33:04.597724 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ntkbm" event={"ID":"ab23361b-079c-45ea-84a1-ee39f33d8578","Type":"ContainerStarted","Data":"d6c8cd3f428edc104ed5ff84531d35174b9325d70d51e74cd95fe841bfb6e9b4"} Dec 03 14:33:05 crc kubenswrapper[4751]: I1203 14:33:05.860718 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 14:33:05 crc kubenswrapper[4751]: I1203 14:33:05.861930 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 14:33:05 crc kubenswrapper[4751]: I1203 14:33:05.869157 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-h4t4n" Dec 03 14:33:05 crc kubenswrapper[4751]: I1203 14:33:05.890302 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 14:33:05 crc kubenswrapper[4751]: I1203 14:33:05.931158 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbsbh\" (UniqueName: \"kubernetes.io/projected/80f132c3-7b27-4d3d-950e-9c6aa887b6a7-kube-api-access-pbsbh\") pod \"kube-state-metrics-0\" (UID: \"80f132c3-7b27-4d3d-950e-9c6aa887b6a7\") " pod="openstack/kube-state-metrics-0" Dec 03 14:33:06 crc kubenswrapper[4751]: I1203 14:33:06.036318 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbsbh\" (UniqueName: \"kubernetes.io/projected/80f132c3-7b27-4d3d-950e-9c6aa887b6a7-kube-api-access-pbsbh\") pod \"kube-state-metrics-0\" (UID: \"80f132c3-7b27-4d3d-950e-9c6aa887b6a7\") " pod="openstack/kube-state-metrics-0" Dec 03 14:33:06 crc kubenswrapper[4751]: I1203 14:33:06.072224 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbsbh\" (UniqueName: \"kubernetes.io/projected/80f132c3-7b27-4d3d-950e-9c6aa887b6a7-kube-api-access-pbsbh\") pod \"kube-state-metrics-0\" (UID: \"80f132c3-7b27-4d3d-950e-9c6aa887b6a7\") " pod="openstack/kube-state-metrics-0" Dec 03 14:33:06 crc kubenswrapper[4751]: I1203 14:33:06.185149 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 14:33:06 crc kubenswrapper[4751]: I1203 14:33:06.708012 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 03 14:33:06 crc kubenswrapper[4751]: I1203 14:33:06.711838 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Dec 03 14:33:06 crc kubenswrapper[4751]: I1203 14:33:06.713842 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Dec 03 14:33:06 crc kubenswrapper[4751]: I1203 14:33:06.715984 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Dec 03 14:33:06 crc kubenswrapper[4751]: I1203 14:33:06.716054 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Dec 03 14:33:06 crc kubenswrapper[4751]: I1203 14:33:06.716201 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Dec 03 14:33:06 crc kubenswrapper[4751]: I1203 14:33:06.716431 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-9lgsg" Dec 03 14:33:06 crc kubenswrapper[4751]: I1203 14:33:06.723890 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 03 14:33:06 crc kubenswrapper[4751]: I1203 14:33:06.755567 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a8a7a20c-88be-4cca-a10d-8ac9a898f090-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"a8a7a20c-88be-4cca-a10d-8ac9a898f090\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 14:33:06 crc kubenswrapper[4751]: I1203 14:33:06.755635 4751 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a8a7a20c-88be-4cca-a10d-8ac9a898f090-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"a8a7a20c-88be-4cca-a10d-8ac9a898f090\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 14:33:06 crc kubenswrapper[4751]: I1203 14:33:06.755762 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a8a7a20c-88be-4cca-a10d-8ac9a898f090-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"a8a7a20c-88be-4cca-a10d-8ac9a898f090\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 14:33:06 crc kubenswrapper[4751]: I1203 14:33:06.755792 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a8a7a20c-88be-4cca-a10d-8ac9a898f090-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"a8a7a20c-88be-4cca-a10d-8ac9a898f090\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 14:33:06 crc kubenswrapper[4751]: I1203 14:33:06.755819 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcr9c\" (UniqueName: \"kubernetes.io/projected/a8a7a20c-88be-4cca-a10d-8ac9a898f090-kube-api-access-pcr9c\") pod \"alertmanager-metric-storage-0\" (UID: \"a8a7a20c-88be-4cca-a10d-8ac9a898f090\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 14:33:06 crc kubenswrapper[4751]: I1203 14:33:06.755843 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/a8a7a20c-88be-4cca-a10d-8ac9a898f090-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"a8a7a20c-88be-4cca-a10d-8ac9a898f090\") " 
pod="openstack/alertmanager-metric-storage-0" Dec 03 14:33:06 crc kubenswrapper[4751]: I1203 14:33:06.755984 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a8a7a20c-88be-4cca-a10d-8ac9a898f090-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"a8a7a20c-88be-4cca-a10d-8ac9a898f090\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 14:33:06 crc kubenswrapper[4751]: I1203 14:33:06.857881 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a8a7a20c-88be-4cca-a10d-8ac9a898f090-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"a8a7a20c-88be-4cca-a10d-8ac9a898f090\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 14:33:06 crc kubenswrapper[4751]: I1203 14:33:06.858008 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a8a7a20c-88be-4cca-a10d-8ac9a898f090-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"a8a7a20c-88be-4cca-a10d-8ac9a898f090\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 14:33:06 crc kubenswrapper[4751]: I1203 14:33:06.858039 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a8a7a20c-88be-4cca-a10d-8ac9a898f090-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"a8a7a20c-88be-4cca-a10d-8ac9a898f090\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 14:33:06 crc kubenswrapper[4751]: I1203 14:33:06.858114 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a8a7a20c-88be-4cca-a10d-8ac9a898f090-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"a8a7a20c-88be-4cca-a10d-8ac9a898f090\") " pod="openstack/alertmanager-metric-storage-0" 
Dec 03 14:33:06 crc kubenswrapper[4751]: I1203 14:33:06.858142 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a8a7a20c-88be-4cca-a10d-8ac9a898f090-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"a8a7a20c-88be-4cca-a10d-8ac9a898f090\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 14:33:06 crc kubenswrapper[4751]: I1203 14:33:06.858165 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcr9c\" (UniqueName: \"kubernetes.io/projected/a8a7a20c-88be-4cca-a10d-8ac9a898f090-kube-api-access-pcr9c\") pod \"alertmanager-metric-storage-0\" (UID: \"a8a7a20c-88be-4cca-a10d-8ac9a898f090\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 14:33:06 crc kubenswrapper[4751]: I1203 14:33:06.858193 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/a8a7a20c-88be-4cca-a10d-8ac9a898f090-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"a8a7a20c-88be-4cca-a10d-8ac9a898f090\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 14:33:06 crc kubenswrapper[4751]: I1203 14:33:06.858729 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/a8a7a20c-88be-4cca-a10d-8ac9a898f090-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"a8a7a20c-88be-4cca-a10d-8ac9a898f090\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 14:33:06 crc kubenswrapper[4751]: I1203 14:33:06.870555 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a8a7a20c-88be-4cca-a10d-8ac9a898f090-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"a8a7a20c-88be-4cca-a10d-8ac9a898f090\") " pod="openstack/alertmanager-metric-storage-0" 
Dec 03 14:33:06 crc kubenswrapper[4751]: I1203 14:33:06.871926 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a8a7a20c-88be-4cca-a10d-8ac9a898f090-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"a8a7a20c-88be-4cca-a10d-8ac9a898f090\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 14:33:06 crc kubenswrapper[4751]: I1203 14:33:06.876437 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a8a7a20c-88be-4cca-a10d-8ac9a898f090-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"a8a7a20c-88be-4cca-a10d-8ac9a898f090\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 14:33:06 crc kubenswrapper[4751]: I1203 14:33:06.879234 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcr9c\" (UniqueName: \"kubernetes.io/projected/a8a7a20c-88be-4cca-a10d-8ac9a898f090-kube-api-access-pcr9c\") pod \"alertmanager-metric-storage-0\" (UID: \"a8a7a20c-88be-4cca-a10d-8ac9a898f090\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 14:33:06 crc kubenswrapper[4751]: I1203 14:33:06.881216 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a8a7a20c-88be-4cca-a10d-8ac9a898f090-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"a8a7a20c-88be-4cca-a10d-8ac9a898f090\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 14:33:06 crc kubenswrapper[4751]: I1203 14:33:06.882937 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a8a7a20c-88be-4cca-a10d-8ac9a898f090-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"a8a7a20c-88be-4cca-a10d-8ac9a898f090\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.032312 4751 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.168624 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.170776 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.172574 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.173284 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.173584 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.174083 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.174134 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-8skrs" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.174083 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.183221 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.268191 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9de5857f-8fe8-48e3-991b-7171fc510567-web-config\") pod 
\"prometheus-metric-storage-0\" (UID: \"9de5857f-8fe8-48e3-991b-7171fc510567\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.268229 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9de5857f-8fe8-48e3-991b-7171fc510567-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"9de5857f-8fe8-48e3-991b-7171fc510567\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.268277 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-15f06bf8-4444-4992-ae0b-8d3229a2a6c3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15f06bf8-4444-4992-ae0b-8d3229a2a6c3\") pod \"prometheus-metric-storage-0\" (UID: \"9de5857f-8fe8-48e3-991b-7171fc510567\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.268315 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9de5857f-8fe8-48e3-991b-7171fc510567-config\") pod \"prometheus-metric-storage-0\" (UID: \"9de5857f-8fe8-48e3-991b-7171fc510567\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.268364 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9de5857f-8fe8-48e3-991b-7171fc510567-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"9de5857f-8fe8-48e3-991b-7171fc510567\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.268391 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9de5857f-8fe8-48e3-991b-7171fc510567-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"9de5857f-8fe8-48e3-991b-7171fc510567\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.268427 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9de5857f-8fe8-48e3-991b-7171fc510567-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"9de5857f-8fe8-48e3-991b-7171fc510567\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.268445 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9jnj\" (UniqueName: \"kubernetes.io/projected/9de5857f-8fe8-48e3-991b-7171fc510567-kube-api-access-d9jnj\") pod \"prometheus-metric-storage-0\" (UID: \"9de5857f-8fe8-48e3-991b-7171fc510567\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.369829 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9de5857f-8fe8-48e3-991b-7171fc510567-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"9de5857f-8fe8-48e3-991b-7171fc510567\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.369900 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9de5857f-8fe8-48e3-991b-7171fc510567-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"9de5857f-8fe8-48e3-991b-7171fc510567\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.369952 4751 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9de5857f-8fe8-48e3-991b-7171fc510567-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"9de5857f-8fe8-48e3-991b-7171fc510567\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.369976 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9jnj\" (UniqueName: \"kubernetes.io/projected/9de5857f-8fe8-48e3-991b-7171fc510567-kube-api-access-d9jnj\") pod \"prometheus-metric-storage-0\" (UID: \"9de5857f-8fe8-48e3-991b-7171fc510567\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.370015 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9de5857f-8fe8-48e3-991b-7171fc510567-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"9de5857f-8fe8-48e3-991b-7171fc510567\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.370031 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9de5857f-8fe8-48e3-991b-7171fc510567-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"9de5857f-8fe8-48e3-991b-7171fc510567\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.370068 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-15f06bf8-4444-4992-ae0b-8d3229a2a6c3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15f06bf8-4444-4992-ae0b-8d3229a2a6c3\") pod \"prometheus-metric-storage-0\" (UID: \"9de5857f-8fe8-48e3-991b-7171fc510567\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 
14:33:07.370106 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9de5857f-8fe8-48e3-991b-7171fc510567-config\") pod \"prometheus-metric-storage-0\" (UID: \"9de5857f-8fe8-48e3-991b-7171fc510567\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.372028 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9de5857f-8fe8-48e3-991b-7171fc510567-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"9de5857f-8fe8-48e3-991b-7171fc510567\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.372789 4751 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.372830 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-15f06bf8-4444-4992-ae0b-8d3229a2a6c3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15f06bf8-4444-4992-ae0b-8d3229a2a6c3\") pod \"prometheus-metric-storage-0\" (UID: \"9de5857f-8fe8-48e3-991b-7171fc510567\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cb164b736cb7b8b7ab8aad6339ced870201e734e0ebc8bfeab076a5d319160df/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.373512 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9de5857f-8fe8-48e3-991b-7171fc510567-config\") pod \"prometheus-metric-storage-0\" (UID: \"9de5857f-8fe8-48e3-991b-7171fc510567\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.373670 4751 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9de5857f-8fe8-48e3-991b-7171fc510567-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"9de5857f-8fe8-48e3-991b-7171fc510567\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.375338 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9de5857f-8fe8-48e3-991b-7171fc510567-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"9de5857f-8fe8-48e3-991b-7171fc510567\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.376104 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9de5857f-8fe8-48e3-991b-7171fc510567-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"9de5857f-8fe8-48e3-991b-7171fc510567\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.377132 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9de5857f-8fe8-48e3-991b-7171fc510567-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"9de5857f-8fe8-48e3-991b-7171fc510567\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.397183 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9jnj\" (UniqueName: \"kubernetes.io/projected/9de5857f-8fe8-48e3-991b-7171fc510567-kube-api-access-d9jnj\") pod \"prometheus-metric-storage-0\" (UID: \"9de5857f-8fe8-48e3-991b-7171fc510567\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.409151 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-15f06bf8-4444-4992-ae0b-8d3229a2a6c3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15f06bf8-4444-4992-ae0b-8d3229a2a6c3\") pod \"prometheus-metric-storage-0\" (UID: \"9de5857f-8fe8-48e3-991b-7171fc510567\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:33:07 crc kubenswrapper[4751]: I1203 14:33:07.490248 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.402111 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-lqzrd"] Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.403672 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lqzrd" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.406055 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-fmftx" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.406443 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.406476 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.412880 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lqzrd"] Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.487880 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-dzz9c"] Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.504412 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-dzz9c" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.505786 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-dzz9c"] Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.505903 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7ab1fa90-b8eb-405d-803d-b9fd84939289-var-log-ovn\") pod \"ovn-controller-lqzrd\" (UID: \"7ab1fa90-b8eb-405d-803d-b9fd84939289\") " pod="openstack/ovn-controller-lqzrd" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.507597 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab1fa90-b8eb-405d-803d-b9fd84939289-combined-ca-bundle\") pod \"ovn-controller-lqzrd\" (UID: \"7ab1fa90-b8eb-405d-803d-b9fd84939289\") " pod="openstack/ovn-controller-lqzrd" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.507686 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ab1fa90-b8eb-405d-803d-b9fd84939289-scripts\") pod \"ovn-controller-lqzrd\" (UID: \"7ab1fa90-b8eb-405d-803d-b9fd84939289\") " pod="openstack/ovn-controller-lqzrd" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.507727 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pzpz\" (UniqueName: \"kubernetes.io/projected/7ab1fa90-b8eb-405d-803d-b9fd84939289-kube-api-access-8pzpz\") pod \"ovn-controller-lqzrd\" (UID: \"7ab1fa90-b8eb-405d-803d-b9fd84939289\") " pod="openstack/ovn-controller-lqzrd" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.508044 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/7ab1fa90-b8eb-405d-803d-b9fd84939289-var-run-ovn\") pod \"ovn-controller-lqzrd\" (UID: \"7ab1fa90-b8eb-405d-803d-b9fd84939289\") " pod="openstack/ovn-controller-lqzrd" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.508091 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7ab1fa90-b8eb-405d-803d-b9fd84939289-var-run\") pod \"ovn-controller-lqzrd\" (UID: \"7ab1fa90-b8eb-405d-803d-b9fd84939289\") " pod="openstack/ovn-controller-lqzrd" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.509038 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ab1fa90-b8eb-405d-803d-b9fd84939289-ovn-controller-tls-certs\") pod \"ovn-controller-lqzrd\" (UID: \"7ab1fa90-b8eb-405d-803d-b9fd84939289\") " pod="openstack/ovn-controller-lqzrd" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.611190 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ab1fa90-b8eb-405d-803d-b9fd84939289-scripts\") pod \"ovn-controller-lqzrd\" (UID: \"7ab1fa90-b8eb-405d-803d-b9fd84939289\") " pod="openstack/ovn-controller-lqzrd" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.611249 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pzpz\" (UniqueName: \"kubernetes.io/projected/7ab1fa90-b8eb-405d-803d-b9fd84939289-kube-api-access-8pzpz\") pod \"ovn-controller-lqzrd\" (UID: \"7ab1fa90-b8eb-405d-803d-b9fd84939289\") " pod="openstack/ovn-controller-lqzrd" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.611283 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3faee7be-8b53-42b6-90fd-ba62998f9ced-var-lib\") pod 
\"ovn-controller-ovs-dzz9c\" (UID: \"3faee7be-8b53-42b6-90fd-ba62998f9ced\") " pod="openstack/ovn-controller-ovs-dzz9c" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.611380 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3faee7be-8b53-42b6-90fd-ba62998f9ced-var-log\") pod \"ovn-controller-ovs-dzz9c\" (UID: \"3faee7be-8b53-42b6-90fd-ba62998f9ced\") " pod="openstack/ovn-controller-ovs-dzz9c" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.611403 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7ab1fa90-b8eb-405d-803d-b9fd84939289-var-run-ovn\") pod \"ovn-controller-lqzrd\" (UID: \"7ab1fa90-b8eb-405d-803d-b9fd84939289\") " pod="openstack/ovn-controller-lqzrd" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.611422 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7ab1fa90-b8eb-405d-803d-b9fd84939289-var-run\") pod \"ovn-controller-lqzrd\" (UID: \"7ab1fa90-b8eb-405d-803d-b9fd84939289\") " pod="openstack/ovn-controller-lqzrd" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.611442 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ab1fa90-b8eb-405d-803d-b9fd84939289-ovn-controller-tls-certs\") pod \"ovn-controller-lqzrd\" (UID: \"7ab1fa90-b8eb-405d-803d-b9fd84939289\") " pod="openstack/ovn-controller-lqzrd" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.611511 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3faee7be-8b53-42b6-90fd-ba62998f9ced-var-run\") pod \"ovn-controller-ovs-dzz9c\" (UID: \"3faee7be-8b53-42b6-90fd-ba62998f9ced\") " 
pod="openstack/ovn-controller-ovs-dzz9c" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.611543 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7ab1fa90-b8eb-405d-803d-b9fd84939289-var-log-ovn\") pod \"ovn-controller-lqzrd\" (UID: \"7ab1fa90-b8eb-405d-803d-b9fd84939289\") " pod="openstack/ovn-controller-lqzrd" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.611560 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3faee7be-8b53-42b6-90fd-ba62998f9ced-scripts\") pod \"ovn-controller-ovs-dzz9c\" (UID: \"3faee7be-8b53-42b6-90fd-ba62998f9ced\") " pod="openstack/ovn-controller-ovs-dzz9c" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.611579 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq4fx\" (UniqueName: \"kubernetes.io/projected/3faee7be-8b53-42b6-90fd-ba62998f9ced-kube-api-access-dq4fx\") pod \"ovn-controller-ovs-dzz9c\" (UID: \"3faee7be-8b53-42b6-90fd-ba62998f9ced\") " pod="openstack/ovn-controller-ovs-dzz9c" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.611604 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3faee7be-8b53-42b6-90fd-ba62998f9ced-etc-ovs\") pod \"ovn-controller-ovs-dzz9c\" (UID: \"3faee7be-8b53-42b6-90fd-ba62998f9ced\") " pod="openstack/ovn-controller-ovs-dzz9c" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.611619 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab1fa90-b8eb-405d-803d-b9fd84939289-combined-ca-bundle\") pod \"ovn-controller-lqzrd\" (UID: \"7ab1fa90-b8eb-405d-803d-b9fd84939289\") " pod="openstack/ovn-controller-lqzrd" Dec 03 14:33:09 crc 
kubenswrapper[4751]: I1203 14:33:09.611868 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7ab1fa90-b8eb-405d-803d-b9fd84939289-var-run-ovn\") pod \"ovn-controller-lqzrd\" (UID: \"7ab1fa90-b8eb-405d-803d-b9fd84939289\") " pod="openstack/ovn-controller-lqzrd" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.611968 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7ab1fa90-b8eb-405d-803d-b9fd84939289-var-run\") pod \"ovn-controller-lqzrd\" (UID: \"7ab1fa90-b8eb-405d-803d-b9fd84939289\") " pod="openstack/ovn-controller-lqzrd" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.612036 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7ab1fa90-b8eb-405d-803d-b9fd84939289-var-log-ovn\") pod \"ovn-controller-lqzrd\" (UID: \"7ab1fa90-b8eb-405d-803d-b9fd84939289\") " pod="openstack/ovn-controller-lqzrd" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.613470 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ab1fa90-b8eb-405d-803d-b9fd84939289-scripts\") pod \"ovn-controller-lqzrd\" (UID: \"7ab1fa90-b8eb-405d-803d-b9fd84939289\") " pod="openstack/ovn-controller-lqzrd" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.627579 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab1fa90-b8eb-405d-803d-b9fd84939289-combined-ca-bundle\") pod \"ovn-controller-lqzrd\" (UID: \"7ab1fa90-b8eb-405d-803d-b9fd84939289\") " pod="openstack/ovn-controller-lqzrd" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.628946 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7ab1fa90-b8eb-405d-803d-b9fd84939289-ovn-controller-tls-certs\") pod \"ovn-controller-lqzrd\" (UID: \"7ab1fa90-b8eb-405d-803d-b9fd84939289\") " pod="openstack/ovn-controller-lqzrd" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.629876 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pzpz\" (UniqueName: \"kubernetes.io/projected/7ab1fa90-b8eb-405d-803d-b9fd84939289-kube-api-access-8pzpz\") pod \"ovn-controller-lqzrd\" (UID: \"7ab1fa90-b8eb-405d-803d-b9fd84939289\") " pod="openstack/ovn-controller-lqzrd" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.713115 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3faee7be-8b53-42b6-90fd-ba62998f9ced-var-lib\") pod \"ovn-controller-ovs-dzz9c\" (UID: \"3faee7be-8b53-42b6-90fd-ba62998f9ced\") " pod="openstack/ovn-controller-ovs-dzz9c" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.713202 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3faee7be-8b53-42b6-90fd-ba62998f9ced-var-log\") pod \"ovn-controller-ovs-dzz9c\" (UID: \"3faee7be-8b53-42b6-90fd-ba62998f9ced\") " pod="openstack/ovn-controller-ovs-dzz9c" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.713267 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3faee7be-8b53-42b6-90fd-ba62998f9ced-var-run\") pod \"ovn-controller-ovs-dzz9c\" (UID: \"3faee7be-8b53-42b6-90fd-ba62998f9ced\") " pod="openstack/ovn-controller-ovs-dzz9c" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.713294 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3faee7be-8b53-42b6-90fd-ba62998f9ced-scripts\") pod \"ovn-controller-ovs-dzz9c\" (UID: 
\"3faee7be-8b53-42b6-90fd-ba62998f9ced\") " pod="openstack/ovn-controller-ovs-dzz9c" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.713309 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq4fx\" (UniqueName: \"kubernetes.io/projected/3faee7be-8b53-42b6-90fd-ba62998f9ced-kube-api-access-dq4fx\") pod \"ovn-controller-ovs-dzz9c\" (UID: \"3faee7be-8b53-42b6-90fd-ba62998f9ced\") " pod="openstack/ovn-controller-ovs-dzz9c" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.713350 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3faee7be-8b53-42b6-90fd-ba62998f9ced-etc-ovs\") pod \"ovn-controller-ovs-dzz9c\" (UID: \"3faee7be-8b53-42b6-90fd-ba62998f9ced\") " pod="openstack/ovn-controller-ovs-dzz9c" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.713742 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3faee7be-8b53-42b6-90fd-ba62998f9ced-etc-ovs\") pod \"ovn-controller-ovs-dzz9c\" (UID: \"3faee7be-8b53-42b6-90fd-ba62998f9ced\") " pod="openstack/ovn-controller-ovs-dzz9c" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.713909 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3faee7be-8b53-42b6-90fd-ba62998f9ced-var-lib\") pod \"ovn-controller-ovs-dzz9c\" (UID: \"3faee7be-8b53-42b6-90fd-ba62998f9ced\") " pod="openstack/ovn-controller-ovs-dzz9c" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.714064 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3faee7be-8b53-42b6-90fd-ba62998f9ced-var-log\") pod \"ovn-controller-ovs-dzz9c\" (UID: \"3faee7be-8b53-42b6-90fd-ba62998f9ced\") " pod="openstack/ovn-controller-ovs-dzz9c" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.714130 4751 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3faee7be-8b53-42b6-90fd-ba62998f9ced-var-run\") pod \"ovn-controller-ovs-dzz9c\" (UID: \"3faee7be-8b53-42b6-90fd-ba62998f9ced\") " pod="openstack/ovn-controller-ovs-dzz9c" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.717628 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3faee7be-8b53-42b6-90fd-ba62998f9ced-scripts\") pod \"ovn-controller-ovs-dzz9c\" (UID: \"3faee7be-8b53-42b6-90fd-ba62998f9ced\") " pod="openstack/ovn-controller-ovs-dzz9c" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.734243 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lqzrd" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.736525 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq4fx\" (UniqueName: \"kubernetes.io/projected/3faee7be-8b53-42b6-90fd-ba62998f9ced-kube-api-access-dq4fx\") pod \"ovn-controller-ovs-dzz9c\" (UID: \"3faee7be-8b53-42b6-90fd-ba62998f9ced\") " pod="openstack/ovn-controller-ovs-dzz9c" Dec 03 14:33:09 crc kubenswrapper[4751]: I1203 14:33:09.822854 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-dzz9c" Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.329948 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.331884 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.338407 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-7vk9f" Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.338612 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.338764 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.338915 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.339137 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.345594 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.442630 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw5z8\" (UniqueName: \"kubernetes.io/projected/c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b-kube-api-access-gw5z8\") pod \"ovsdbserver-nb-0\" (UID: \"c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.442697 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5125db0b-d450-4d03-8780-c1bdb036e47a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5125db0b-d450-4d03-8780-c1bdb036e47a\") pod \"ovsdbserver-nb-0\" (UID: \"c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.442726 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b-config\") pod \"ovsdbserver-nb-0\" (UID: \"c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.442749 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.442811 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.442838 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.442868 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.442883 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.543988 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.544041 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.544077 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.544116 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.544612 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" 
(UID: \"c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.544151 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw5z8\" (UniqueName: \"kubernetes.io/projected/c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b-kube-api-access-gw5z8\") pod \"ovsdbserver-nb-0\" (UID: \"c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.544985 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5125db0b-d450-4d03-8780-c1bdb036e47a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5125db0b-d450-4d03-8780-c1bdb036e47a\") pod \"ovsdbserver-nb-0\" (UID: \"c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.545014 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b-config\") pod \"ovsdbserver-nb-0\" (UID: \"c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.545038 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.546689 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.546892 4751 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b-config\") pod \"ovsdbserver-nb-0\" (UID: \"c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.550790 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.551050 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.552482 4751 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.552520 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5125db0b-d450-4d03-8780-c1bdb036e47a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5125db0b-d450-4d03-8780-c1bdb036e47a\") pod \"ovsdbserver-nb-0\" (UID: \"c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/48c0955a5a726f52003f0bd91772f0d985806da1355eb878babb33a94fa2f975/globalmount\"" pod="openstack/ovsdbserver-nb-0" Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.567571 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.568009 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw5z8\" (UniqueName: \"kubernetes.io/projected/c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b-kube-api-access-gw5z8\") pod \"ovsdbserver-nb-0\" (UID: \"c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.607487 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5125db0b-d450-4d03-8780-c1bdb036e47a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5125db0b-d450-4d03-8780-c1bdb036e47a\") pod \"ovsdbserver-nb-0\" (UID: \"c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:33:11 crc kubenswrapper[4751]: I1203 14:33:11.656691 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.216268 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-distributor-56cd74f89f-xg9ml"] Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.217553 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-xg9ml" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.219928 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-config" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.219998 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca-bundle" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.219927 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-http" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.220099 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-dockercfg-5ncps" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.220155 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-grpc" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.239939 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-56cd74f89f-xg9ml"] Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.314664 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/e2d4448e-9181-494b-bec0-12da338b184d-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-xg9ml\" (UID: \"e2d4448e-9181-494b-bec0-12da338b184d\") " 
pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-xg9ml" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.314725 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/e2d4448e-9181-494b-bec0-12da338b184d-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-xg9ml\" (UID: \"e2d4448e-9181-494b-bec0-12da338b184d\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-xg9ml" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.314971 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2d4448e-9181-494b-bec0-12da338b184d-config\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-xg9ml\" (UID: \"e2d4448e-9181-494b-bec0-12da338b184d\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-xg9ml" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.315038 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2d4448e-9181-494b-bec0-12da338b184d-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-xg9ml\" (UID: \"e2d4448e-9181-494b-bec0-12da338b184d\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-xg9ml" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.315094 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b947\" (UniqueName: \"kubernetes.io/projected/e2d4448e-9181-494b-bec0-12da338b184d-kube-api-access-7b947\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-xg9ml\" (UID: \"e2d4448e-9181-494b-bec0-12da338b184d\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-xg9ml" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 
14:33:14.399747 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-querier-548665d79b-8226l"] Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.400816 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-548665d79b-8226l" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.403194 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-http" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.408809 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-grpc" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.408948 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-loki-s3" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.418169 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-548665d79b-8226l"] Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.419072 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2d4448e-9181-494b-bec0-12da338b184d-config\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-xg9ml\" (UID: \"e2d4448e-9181-494b-bec0-12da338b184d\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-xg9ml" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.419111 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2d4448e-9181-494b-bec0-12da338b184d-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-xg9ml\" (UID: \"e2d4448e-9181-494b-bec0-12da338b184d\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-xg9ml" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.419163 4751 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b947\" (UniqueName: \"kubernetes.io/projected/e2d4448e-9181-494b-bec0-12da338b184d-kube-api-access-7b947\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-xg9ml\" (UID: \"e2d4448e-9181-494b-bec0-12da338b184d\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-xg9ml" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.419227 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/e2d4448e-9181-494b-bec0-12da338b184d-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-xg9ml\" (UID: \"e2d4448e-9181-494b-bec0-12da338b184d\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-xg9ml" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.419260 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/e2d4448e-9181-494b-bec0-12da338b184d-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-xg9ml\" (UID: \"e2d4448e-9181-494b-bec0-12da338b184d\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-xg9ml" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.420900 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2d4448e-9181-494b-bec0-12da338b184d-config\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-xg9ml\" (UID: \"e2d4448e-9181-494b-bec0-12da338b184d\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-xg9ml" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.425435 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e2d4448e-9181-494b-bec0-12da338b184d-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-xg9ml\" (UID: \"e2d4448e-9181-494b-bec0-12da338b184d\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-xg9ml" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.450781 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/e2d4448e-9181-494b-bec0-12da338b184d-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-xg9ml\" (UID: \"e2d4448e-9181-494b-bec0-12da338b184d\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-xg9ml" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.453978 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/e2d4448e-9181-494b-bec0-12da338b184d-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-xg9ml\" (UID: \"e2d4448e-9181-494b-bec0-12da338b184d\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-xg9ml" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.461700 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b947\" (UniqueName: \"kubernetes.io/projected/e2d4448e-9181-494b-bec0-12da338b184d-kube-api-access-7b947\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-xg9ml\" (UID: \"e2d4448e-9181-494b-bec0-12da338b184d\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-xg9ml" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.511725 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-779849886d-r2j44"] Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.512841 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-r2j44" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.516287 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-grpc" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.516539 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-http" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.527791 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/4797e85e-ad67-454b-b210-25f5481780c5-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-548665d79b-8226l\" (UID: \"4797e85e-ad67-454b-b210-25f5481780c5\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-8226l" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.527858 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6spx\" (UniqueName: \"kubernetes.io/projected/4797e85e-ad67-454b-b210-25f5481780c5-kube-api-access-q6spx\") pod \"cloudkitty-lokistack-querier-548665d79b-8226l\" (UID: \"4797e85e-ad67-454b-b210-25f5481780c5\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-8226l" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.527981 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4797e85e-ad67-454b-b210-25f5481780c5-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-548665d79b-8226l\" (UID: \"4797e85e-ad67-454b-b210-25f5481780c5\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-8226l" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.528010 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/4797e85e-ad67-454b-b210-25f5481780c5-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-548665d79b-8226l\" (UID: \"4797e85e-ad67-454b-b210-25f5481780c5\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-8226l" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.528038 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/4797e85e-ad67-454b-b210-25f5481780c5-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-548665d79b-8226l\" (UID: \"4797e85e-ad67-454b-b210-25f5481780c5\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-8226l" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.528092 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4797e85e-ad67-454b-b210-25f5481780c5-config\") pod \"cloudkitty-lokistack-querier-548665d79b-8226l\" (UID: \"4797e85e-ad67-454b-b210-25f5481780c5\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-8226l" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.530401 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-779849886d-r2j44"] Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.560259 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-xg9ml" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.629504 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6spx\" (UniqueName: \"kubernetes.io/projected/4797e85e-ad67-454b-b210-25f5481780c5-kube-api-access-q6spx\") pod \"cloudkitty-lokistack-querier-548665d79b-8226l\" (UID: \"4797e85e-ad67-454b-b210-25f5481780c5\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-8226l" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.629562 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa-config\") pod \"cloudkitty-lokistack-query-frontend-779849886d-r2j44\" (UID: \"ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-r2j44" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.629599 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-779849886d-r2j44\" (UID: \"ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-r2j44" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.629655 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4797e85e-ad67-454b-b210-25f5481780c5-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-548665d79b-8226l\" (UID: \"4797e85e-ad67-454b-b210-25f5481780c5\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-8226l" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.629676 4751 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/4797e85e-ad67-454b-b210-25f5481780c5-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-548665d79b-8226l\" (UID: \"4797e85e-ad67-454b-b210-25f5481780c5\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-8226l" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.629698 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/4797e85e-ad67-454b-b210-25f5481780c5-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-548665d79b-8226l\" (UID: \"4797e85e-ad67-454b-b210-25f5481780c5\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-8226l" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.629715 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-779849886d-r2j44\" (UID: \"ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-r2j44" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.629755 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4797e85e-ad67-454b-b210-25f5481780c5-config\") pod \"cloudkitty-lokistack-querier-548665d79b-8226l\" (UID: \"4797e85e-ad67-454b-b210-25f5481780c5\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-8226l" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.629783 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-779849886d-r2j44\" (UID: \"ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-r2j44" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.629803 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/4797e85e-ad67-454b-b210-25f5481780c5-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-548665d79b-8226l\" (UID: \"4797e85e-ad67-454b-b210-25f5481780c5\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-8226l" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.629838 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wz2j\" (UniqueName: \"kubernetes.io/projected/ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa-kube-api-access-8wz2j\") pod \"cloudkitty-lokistack-query-frontend-779849886d-r2j44\" (UID: \"ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-r2j44" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.631203 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4797e85e-ad67-454b-b210-25f5481780c5-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-548665d79b-8226l\" (UID: \"4797e85e-ad67-454b-b210-25f5481780c5\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-8226l" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.633200 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4797e85e-ad67-454b-b210-25f5481780c5-config\") pod \"cloudkitty-lokistack-querier-548665d79b-8226l\" (UID: \"4797e85e-ad67-454b-b210-25f5481780c5\") " 
pod="openstack/cloudkitty-lokistack-querier-548665d79b-8226l" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.636054 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/4797e85e-ad67-454b-b210-25f5481780c5-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-548665d79b-8226l\" (UID: \"4797e85e-ad67-454b-b210-25f5481780c5\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-8226l" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.643926 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/4797e85e-ad67-454b-b210-25f5481780c5-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-548665d79b-8226l\" (UID: \"4797e85e-ad67-454b-b210-25f5481780c5\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-8226l" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.647532 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/4797e85e-ad67-454b-b210-25f5481780c5-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-548665d79b-8226l\" (UID: \"4797e85e-ad67-454b-b210-25f5481780c5\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-8226l" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.655798 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws"] Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.656914 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.659659 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-dockercfg-h6mhz" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.660117 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6spx\" (UniqueName: \"kubernetes.io/projected/4797e85e-ad67-454b-b210-25f5481780c5-kube-api-access-q6spx\") pod \"cloudkitty-lokistack-querier-548665d79b-8226l\" (UID: \"4797e85e-ad67-454b-b210-25f5481780c5\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-8226l" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.661073 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.661213 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-http" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.661376 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-client-http" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.661488 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway-ca-bundle" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.661643 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.661748 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.678621 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws"] Dec 03 14:33:14 crc kubenswrapper[4751]: 
I1203 14:33:14.692155 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b"] Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.693413 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.707834 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b"] Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.731238 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa-config\") pod \"cloudkitty-lokistack-query-frontend-779849886d-r2j44\" (UID: \"ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-r2j44" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.731287 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-779849886d-r2j44\" (UID: \"ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-r2j44" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.731361 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-779849886d-r2j44\" (UID: \"ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-r2j44" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.731413 4751 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-779849886d-r2j44\" (UID: \"ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-r2j44" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.731440 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wz2j\" (UniqueName: \"kubernetes.io/projected/ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa-kube-api-access-8wz2j\") pod \"cloudkitty-lokistack-query-frontend-779849886d-r2j44\" (UID: \"ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-r2j44" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.733936 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa-config\") pod \"cloudkitty-lokistack-query-frontend-779849886d-r2j44\" (UID: \"ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-r2j44" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.735077 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-779849886d-r2j44\" (UID: \"ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-r2j44" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.736283 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: 
\"kubernetes.io/secret/ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-779849886d-r2j44\" (UID: \"ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-r2j44" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.739985 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-548665d79b-8226l" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.776522 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wz2j\" (UniqueName: \"kubernetes.io/projected/ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa-kube-api-access-8wz2j\") pod \"cloudkitty-lokistack-query-frontend-779849886d-r2j44\" (UID: \"ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-r2j44" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.776899 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-779849886d-r2j44\" (UID: \"ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-r2j44" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.834656 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1c24fdf-0c9e-458f-9803-87e9d6c3161f-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-bl8ws\" (UID: \"c1c24fdf-0c9e-458f-9803-87e9d6c3161f\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.834827 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a964492-a736-427e-b81a-d6d863d0eaaf-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-8d88b\" (UID: \"5a964492-a736-427e-b81a-d6d863d0eaaf\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.834909 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/5a964492-a736-427e-b81a-d6d863d0eaaf-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-76cc998948-8d88b\" (UID: \"5a964492-a736-427e-b81a-d6d863d0eaaf\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.834962 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1c24fdf-0c9e-458f-9803-87e9d6c3161f-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-bl8ws\" (UID: \"c1c24fdf-0c9e-458f-9803-87e9d6c3161f\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.835000 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/c1c24fdf-0c9e-458f-9803-87e9d6c3161f-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-76cc998948-bl8ws\" (UID: \"c1c24fdf-0c9e-458f-9803-87e9d6c3161f\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.835057 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: 
\"kubernetes.io/secret/c1c24fdf-0c9e-458f-9803-87e9d6c3161f-tls-secret\") pod \"cloudkitty-lokistack-gateway-76cc998948-bl8ws\" (UID: \"c1c24fdf-0c9e-458f-9803-87e9d6c3161f\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.835088 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/5a964492-a736-427e-b81a-d6d863d0eaaf-tenants\") pod \"cloudkitty-lokistack-gateway-76cc998948-8d88b\" (UID: \"5a964492-a736-427e-b81a-d6d863d0eaaf\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.835159 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/5a964492-a736-427e-b81a-d6d863d0eaaf-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-76cc998948-8d88b\" (UID: \"5a964492-a736-427e-b81a-d6d863d0eaaf\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.835188 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/5a964492-a736-427e-b81a-d6d863d0eaaf-rbac\") pod \"cloudkitty-lokistack-gateway-76cc998948-8d88b\" (UID: \"5a964492-a736-427e-b81a-d6d863d0eaaf\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.835224 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a964492-a736-427e-b81a-d6d863d0eaaf-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-8d88b\" (UID: \"5a964492-a736-427e-b81a-d6d863d0eaaf\") " 
pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.835273 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/c1c24fdf-0c9e-458f-9803-87e9d6c3161f-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-76cc998948-bl8ws\" (UID: \"c1c24fdf-0c9e-458f-9803-87e9d6c3161f\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.835293 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/c1c24fdf-0c9e-458f-9803-87e9d6c3161f-rbac\") pod \"cloudkitty-lokistack-gateway-76cc998948-bl8ws\" (UID: \"c1c24fdf-0c9e-458f-9803-87e9d6c3161f\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.835370 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/c1c24fdf-0c9e-458f-9803-87e9d6c3161f-tenants\") pod \"cloudkitty-lokistack-gateway-76cc998948-bl8ws\" (UID: \"c1c24fdf-0c9e-458f-9803-87e9d6c3161f\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.835400 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2shx\" (UniqueName: \"kubernetes.io/projected/c1c24fdf-0c9e-458f-9803-87e9d6c3161f-kube-api-access-p2shx\") pod \"cloudkitty-lokistack-gateway-76cc998948-bl8ws\" (UID: \"c1c24fdf-0c9e-458f-9803-87e9d6c3161f\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.835428 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1c24fdf-0c9e-458f-9803-87e9d6c3161f-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-bl8ws\" (UID: \"c1c24fdf-0c9e-458f-9803-87e9d6c3161f\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.835465 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a964492-a736-427e-b81a-d6d863d0eaaf-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-8d88b\" (UID: \"5a964492-a736-427e-b81a-d6d863d0eaaf\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.835532 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxvgf\" (UniqueName: \"kubernetes.io/projected/5a964492-a736-427e-b81a-d6d863d0eaaf-kube-api-access-dxvgf\") pod \"cloudkitty-lokistack-gateway-76cc998948-8d88b\" (UID: \"5a964492-a736-427e-b81a-d6d863d0eaaf\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.835560 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/5a964492-a736-427e-b81a-d6d863d0eaaf-tls-secret\") pod \"cloudkitty-lokistack-gateway-76cc998948-8d88b\" (UID: \"5a964492-a736-427e-b81a-d6d863d0eaaf\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.853306 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-r2j44" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.936688 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a964492-a736-427e-b81a-d6d863d0eaaf-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-8d88b\" (UID: \"5a964492-a736-427e-b81a-d6d863d0eaaf\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.936750 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/c1c24fdf-0c9e-458f-9803-87e9d6c3161f-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-76cc998948-bl8ws\" (UID: \"c1c24fdf-0c9e-458f-9803-87e9d6c3161f\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.936809 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/c1c24fdf-0c9e-458f-9803-87e9d6c3161f-rbac\") pod \"cloudkitty-lokistack-gateway-76cc998948-bl8ws\" (UID: \"c1c24fdf-0c9e-458f-9803-87e9d6c3161f\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.937358 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/c1c24fdf-0c9e-458f-9803-87e9d6c3161f-tenants\") pod \"cloudkitty-lokistack-gateway-76cc998948-bl8ws\" (UID: \"c1c24fdf-0c9e-458f-9803-87e9d6c3161f\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.937390 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-p2shx\" (UniqueName: \"kubernetes.io/projected/c1c24fdf-0c9e-458f-9803-87e9d6c3161f-kube-api-access-p2shx\") pod \"cloudkitty-lokistack-gateway-76cc998948-bl8ws\" (UID: \"c1c24fdf-0c9e-458f-9803-87e9d6c3161f\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.937413 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1c24fdf-0c9e-458f-9803-87e9d6c3161f-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-bl8ws\" (UID: \"c1c24fdf-0c9e-458f-9803-87e9d6c3161f\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.937441 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a964492-a736-427e-b81a-d6d863d0eaaf-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-8d88b\" (UID: \"5a964492-a736-427e-b81a-d6d863d0eaaf\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.937480 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxvgf\" (UniqueName: \"kubernetes.io/projected/5a964492-a736-427e-b81a-d6d863d0eaaf-kube-api-access-dxvgf\") pod \"cloudkitty-lokistack-gateway-76cc998948-8d88b\" (UID: \"5a964492-a736-427e-b81a-d6d863d0eaaf\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.937518 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/5a964492-a736-427e-b81a-d6d863d0eaaf-tls-secret\") pod \"cloudkitty-lokistack-gateway-76cc998948-8d88b\" (UID: 
\"5a964492-a736-427e-b81a-d6d863d0eaaf\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.937539 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1c24fdf-0c9e-458f-9803-87e9d6c3161f-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-bl8ws\" (UID: \"c1c24fdf-0c9e-458f-9803-87e9d6c3161f\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.937567 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a964492-a736-427e-b81a-d6d863d0eaaf-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-8d88b\" (UID: \"5a964492-a736-427e-b81a-d6d863d0eaaf\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.937624 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/5a964492-a736-427e-b81a-d6d863d0eaaf-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-76cc998948-8d88b\" (UID: \"5a964492-a736-427e-b81a-d6d863d0eaaf\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.937648 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1c24fdf-0c9e-458f-9803-87e9d6c3161f-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-bl8ws\" (UID: \"c1c24fdf-0c9e-458f-9803-87e9d6c3161f\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.937669 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/c1c24fdf-0c9e-458f-9803-87e9d6c3161f-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-76cc998948-bl8ws\" (UID: \"c1c24fdf-0c9e-458f-9803-87e9d6c3161f\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.937696 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/c1c24fdf-0c9e-458f-9803-87e9d6c3161f-tls-secret\") pod \"cloudkitty-lokistack-gateway-76cc998948-bl8ws\" (UID: \"c1c24fdf-0c9e-458f-9803-87e9d6c3161f\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.937714 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/5a964492-a736-427e-b81a-d6d863d0eaaf-tenants\") pod \"cloudkitty-lokistack-gateway-76cc998948-8d88b\" (UID: \"5a964492-a736-427e-b81a-d6d863d0eaaf\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.937714 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a964492-a736-427e-b81a-d6d863d0eaaf-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-8d88b\" (UID: \"5a964492-a736-427e-b81a-d6d863d0eaaf\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.937749 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/5a964492-a736-427e-b81a-d6d863d0eaaf-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-76cc998948-8d88b\" (UID: \"5a964492-a736-427e-b81a-d6d863d0eaaf\") " 
pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.937769 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/5a964492-a736-427e-b81a-d6d863d0eaaf-rbac\") pod \"cloudkitty-lokistack-gateway-76cc998948-8d88b\" (UID: \"5a964492-a736-427e-b81a-d6d863d0eaaf\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.937934 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/c1c24fdf-0c9e-458f-9803-87e9d6c3161f-rbac\") pod \"cloudkitty-lokistack-gateway-76cc998948-bl8ws\" (UID: \"c1c24fdf-0c9e-458f-9803-87e9d6c3161f\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.938365 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1c24fdf-0c9e-458f-9803-87e9d6c3161f-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-bl8ws\" (UID: \"c1c24fdf-0c9e-458f-9803-87e9d6c3161f\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.939006 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/5a964492-a736-427e-b81a-d6d863d0eaaf-rbac\") pod \"cloudkitty-lokistack-gateway-76cc998948-8d88b\" (UID: \"5a964492-a736-427e-b81a-d6d863d0eaaf\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.939012 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a964492-a736-427e-b81a-d6d863d0eaaf-cloudkitty-lokistack-ca-bundle\") pod 
\"cloudkitty-lokistack-gateway-76cc998948-8d88b\" (UID: \"5a964492-a736-427e-b81a-d6d863d0eaaf\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.939188 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/c1c24fdf-0c9e-458f-9803-87e9d6c3161f-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-76cc998948-bl8ws\" (UID: \"c1c24fdf-0c9e-458f-9803-87e9d6c3161f\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.939270 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1c24fdf-0c9e-458f-9803-87e9d6c3161f-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-bl8ws\" (UID: \"c1c24fdf-0c9e-458f-9803-87e9d6c3161f\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" Dec 03 14:33:14 crc kubenswrapper[4751]: E1203 14:33:14.939309 4751 secret.go:188] Couldn't get secret openstack/cloudkitty-lokistack-gateway-http: secret "cloudkitty-lokistack-gateway-http" not found Dec 03 14:33:14 crc kubenswrapper[4751]: E1203 14:33:14.939371 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a964492-a736-427e-b81a-d6d863d0eaaf-tls-secret podName:5a964492-a736-427e-b81a-d6d863d0eaaf nodeName:}" failed. No retries permitted until 2025-12-03 14:33:15.439357621 +0000 UTC m=+1202.427712838 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/5a964492-a736-427e-b81a-d6d863d0eaaf-tls-secret") pod "cloudkitty-lokistack-gateway-76cc998948-8d88b" (UID: "5a964492-a736-427e-b81a-d6d863d0eaaf") : secret "cloudkitty-lokistack-gateway-http" not found Dec 03 14:33:14 crc kubenswrapper[4751]: E1203 14:33:14.939538 4751 secret.go:188] Couldn't get secret openstack/cloudkitty-lokistack-gateway-http: secret "cloudkitty-lokistack-gateway-http" not found Dec 03 14:33:14 crc kubenswrapper[4751]: E1203 14:33:14.939567 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1c24fdf-0c9e-458f-9803-87e9d6c3161f-tls-secret podName:c1c24fdf-0c9e-458f-9803-87e9d6c3161f nodeName:}" failed. No retries permitted until 2025-12-03 14:33:15.439559946 +0000 UTC m=+1202.427915153 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/c1c24fdf-0c9e-458f-9803-87e9d6c3161f-tls-secret") pod "cloudkitty-lokistack-gateway-76cc998948-bl8ws" (UID: "c1c24fdf-0c9e-458f-9803-87e9d6c3161f") : secret "cloudkitty-lokistack-gateway-http" not found Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.940219 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1c24fdf-0c9e-458f-9803-87e9d6c3161f-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-bl8ws\" (UID: \"c1c24fdf-0c9e-458f-9803-87e9d6c3161f\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.940538 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/c1c24fdf-0c9e-458f-9803-87e9d6c3161f-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-76cc998948-bl8ws\" (UID: 
\"c1c24fdf-0c9e-458f-9803-87e9d6c3161f\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.940851 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/5a964492-a736-427e-b81a-d6d863d0eaaf-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-76cc998948-8d88b\" (UID: \"5a964492-a736-427e-b81a-d6d863d0eaaf\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.942266 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/5a964492-a736-427e-b81a-d6d863d0eaaf-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-76cc998948-8d88b\" (UID: \"5a964492-a736-427e-b81a-d6d863d0eaaf\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.942673 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a964492-a736-427e-b81a-d6d863d0eaaf-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-8d88b\" (UID: \"5a964492-a736-427e-b81a-d6d863d0eaaf\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.942825 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/5a964492-a736-427e-b81a-d6d863d0eaaf-tenants\") pod \"cloudkitty-lokistack-gateway-76cc998948-8d88b\" (UID: \"5a964492-a736-427e-b81a-d6d863d0eaaf\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.945044 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: 
\"kubernetes.io/secret/c1c24fdf-0c9e-458f-9803-87e9d6c3161f-tenants\") pod \"cloudkitty-lokistack-gateway-76cc998948-bl8ws\" (UID: \"c1c24fdf-0c9e-458f-9803-87e9d6c3161f\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.956218 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxvgf\" (UniqueName: \"kubernetes.io/projected/5a964492-a736-427e-b81a-d6d863d0eaaf-kube-api-access-dxvgf\") pod \"cloudkitty-lokistack-gateway-76cc998948-8d88b\" (UID: \"5a964492-a736-427e-b81a-d6d863d0eaaf\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" Dec 03 14:33:14 crc kubenswrapper[4751]: I1203 14:33:14.961989 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2shx\" (UniqueName: \"kubernetes.io/projected/c1c24fdf-0c9e-458f-9803-87e9d6c3161f-kube-api-access-p2shx\") pod \"cloudkitty-lokistack-gateway-76cc998948-bl8ws\" (UID: \"c1c24fdf-0c9e-458f-9803-87e9d6c3161f\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.122448 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.146587 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.153500 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.154042 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-ndfzl" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.154305 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.154547 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.181104 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.247403 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0046f111-cf94-402b-8981-659978aace04-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0046f111-cf94-402b-8981-659978aace04\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.247474 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7g88\" (UniqueName: \"kubernetes.io/projected/0046f111-cf94-402b-8981-659978aace04-kube-api-access-v7g88\") pod \"ovsdbserver-sb-0\" (UID: \"0046f111-cf94-402b-8981-659978aace04\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.247555 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0046f111-cf94-402b-8981-659978aace04-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: 
\"0046f111-cf94-402b-8981-659978aace04\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.247579 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0046f111-cf94-402b-8981-659978aace04-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0046f111-cf94-402b-8981-659978aace04\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.247797 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8a0bacdb-c71d-4efc-a9d4-0966bec4d143\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8a0bacdb-c71d-4efc-a9d4-0966bec4d143\") pod \"ovsdbserver-sb-0\" (UID: \"0046f111-cf94-402b-8981-659978aace04\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.247843 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0046f111-cf94-402b-8981-659978aace04-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0046f111-cf94-402b-8981-659978aace04\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.247889 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0046f111-cf94-402b-8981-659978aace04-config\") pod \"ovsdbserver-sb-0\" (UID: \"0046f111-cf94-402b-8981-659978aace04\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.247916 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0046f111-cf94-402b-8981-659978aace04-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"0046f111-cf94-402b-8981-659978aace04\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.349697 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0046f111-cf94-402b-8981-659978aace04-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0046f111-cf94-402b-8981-659978aace04\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.349742 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0046f111-cf94-402b-8981-659978aace04-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0046f111-cf94-402b-8981-659978aace04\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.349827 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8a0bacdb-c71d-4efc-a9d4-0966bec4d143\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8a0bacdb-c71d-4efc-a9d4-0966bec4d143\") pod \"ovsdbserver-sb-0\" (UID: \"0046f111-cf94-402b-8981-659978aace04\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.349876 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0046f111-cf94-402b-8981-659978aace04-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0046f111-cf94-402b-8981-659978aace04\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.350928 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0046f111-cf94-402b-8981-659978aace04-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0046f111-cf94-402b-8981-659978aace04\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 
14:33:15.349906 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0046f111-cf94-402b-8981-659978aace04-config\") pod \"ovsdbserver-sb-0\" (UID: \"0046f111-cf94-402b-8981-659978aace04\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.351212 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0046f111-cf94-402b-8981-659978aace04-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0046f111-cf94-402b-8981-659978aace04\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.351244 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0046f111-cf94-402b-8981-659978aace04-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0046f111-cf94-402b-8981-659978aace04\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.351726 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7g88\" (UniqueName: \"kubernetes.io/projected/0046f111-cf94-402b-8981-659978aace04-kube-api-access-v7g88\") pod \"ovsdbserver-sb-0\" (UID: \"0046f111-cf94-402b-8981-659978aace04\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.351996 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0046f111-cf94-402b-8981-659978aace04-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0046f111-cf94-402b-8981-659978aace04\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.352507 4751 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.352575 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8a0bacdb-c71d-4efc-a9d4-0966bec4d143\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8a0bacdb-c71d-4efc-a9d4-0966bec4d143\") pod \"ovsdbserver-sb-0\" (UID: \"0046f111-cf94-402b-8981-659978aace04\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/888240d3eb73325e838936168ccdf6e9489ad0942dacc6398085670879abec6f/globalmount\"" pod="openstack/ovsdbserver-sb-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.354083 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0046f111-cf94-402b-8981-659978aace04-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0046f111-cf94-402b-8981-659978aace04\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.354223 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0046f111-cf94-402b-8981-659978aace04-config\") pod \"ovsdbserver-sb-0\" (UID: \"0046f111-cf94-402b-8981-659978aace04\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.355914 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0046f111-cf94-402b-8981-659978aace04-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0046f111-cf94-402b-8981-659978aace04\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.356156 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0046f111-cf94-402b-8981-659978aace04-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0046f111-cf94-402b-8981-659978aace04\") " 
pod="openstack/ovsdbserver-sb-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.373707 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7g88\" (UniqueName: \"kubernetes.io/projected/0046f111-cf94-402b-8981-659978aace04-kube-api-access-v7g88\") pod \"ovsdbserver-sb-0\" (UID: \"0046f111-cf94-402b-8981-659978aace04\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.390922 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.391956 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.393006 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8a0bacdb-c71d-4efc-a9d4-0966bec4d143\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8a0bacdb-c71d-4efc-a9d4-0966bec4d143\") pod \"ovsdbserver-sb-0\" (UID: \"0046f111-cf94-402b-8981-659978aace04\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.396688 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-grpc" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.396940 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-http" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.403842 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.453693 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/c1c24fdf-0c9e-458f-9803-87e9d6c3161f-tls-secret\") pod \"cloudkitty-lokistack-gateway-76cc998948-bl8ws\" (UID: 
\"c1c24fdf-0c9e-458f-9803-87e9d6c3161f\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.453828 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/5a964492-a736-427e-b81a-d6d863d0eaaf-tls-secret\") pod \"cloudkitty-lokistack-gateway-76cc998948-8d88b\" (UID: \"5a964492-a736-427e-b81a-d6d863d0eaaf\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.456962 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/5a964492-a736-427e-b81a-d6d863d0eaaf-tls-secret\") pod \"cloudkitty-lokistack-gateway-76cc998948-8d88b\" (UID: \"5a964492-a736-427e-b81a-d6d863d0eaaf\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.457253 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/c1c24fdf-0c9e-458f-9803-87e9d6c3161f-tls-secret\") pod \"cloudkitty-lokistack-gateway-76cc998948-bl8ws\" (UID: \"c1c24fdf-0c9e-458f-9803-87e9d6c3161f\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.482311 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.489449 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.491238 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.496000 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-http" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.501003 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-grpc" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.528292 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.555467 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"04053d51-dddf-43e3-a230-9ac729dec435\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.555517 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/04053d51-dddf-43e3-a230-9ac729dec435-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"04053d51-dddf-43e3-a230-9ac729dec435\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.555551 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04053d51-dddf-43e3-a230-9ac729dec435-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"04053d51-dddf-43e3-a230-9ac729dec435\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.555649 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/04053d51-dddf-43e3-a230-9ac729dec435-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"04053d51-dddf-43e3-a230-9ac729dec435\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.555831 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/04053d51-dddf-43e3-a230-9ac729dec435-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"04053d51-dddf-43e3-a230-9ac729dec435\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.555953 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6xk5\" (UniqueName: \"kubernetes.io/projected/04053d51-dddf-43e3-a230-9ac729dec435-kube-api-access-x6xk5\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"04053d51-dddf-43e3-a230-9ac729dec435\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.555989 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"04053d51-dddf-43e3-a230-9ac729dec435\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.556013 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04053d51-dddf-43e3-a230-9ac729dec435-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"04053d51-dddf-43e3-a230-9ac729dec435\") " 
pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.616835 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.622802 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.628156 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.633432 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.636436 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-grpc" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.636653 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-http" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.636658 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.657682 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/04053d51-dddf-43e3-a230-9ac729dec435-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"04053d51-dddf-43e3-a230-9ac729dec435\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.657760 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: 
\"kubernetes.io/secret/34351ff3-ea5e-403c-9d04-ca6777287cff-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"34351ff3-ea5e-403c-9d04-ca6777287cff\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.657809 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6xk5\" (UniqueName: \"kubernetes.io/projected/04053d51-dddf-43e3-a230-9ac729dec435-kube-api-access-x6xk5\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"04053d51-dddf-43e3-a230-9ac729dec435\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.657842 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"04053d51-dddf-43e3-a230-9ac729dec435\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.657869 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04053d51-dddf-43e3-a230-9ac729dec435-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"04053d51-dddf-43e3-a230-9ac729dec435\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.657903 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/34351ff3-ea5e-403c-9d04-ca6777287cff-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"34351ff3-ea5e-403c-9d04-ca6777287cff\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.657950 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"04053d51-dddf-43e3-a230-9ac729dec435\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.657980 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/04053d51-dddf-43e3-a230-9ac729dec435-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"04053d51-dddf-43e3-a230-9ac729dec435\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.658008 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34351ff3-ea5e-403c-9d04-ca6777287cff-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"34351ff3-ea5e-403c-9d04-ca6777287cff\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.658043 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"34351ff3-ea5e-403c-9d04-ca6777287cff\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.658070 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mvt6\" (UniqueName: \"kubernetes.io/projected/34351ff3-ea5e-403c-9d04-ca6777287cff-kube-api-access-4mvt6\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"34351ff3-ea5e-403c-9d04-ca6777287cff\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.658103 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04053d51-dddf-43e3-a230-9ac729dec435-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"04053d51-dddf-43e3-a230-9ac729dec435\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.658150 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/04053d51-dddf-43e3-a230-9ac729dec435-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"04053d51-dddf-43e3-a230-9ac729dec435\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.658182 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34351ff3-ea5e-403c-9d04-ca6777287cff-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"34351ff3-ea5e-403c-9d04-ca6777287cff\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.658229 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/34351ff3-ea5e-403c-9d04-ca6777287cff-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"34351ff3-ea5e-403c-9d04-ca6777287cff\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.658416 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"04053d51-dddf-43e3-a230-9ac729dec435\") device mount path \"/mnt/openstack/pv03\"" 
pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.658454 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"04053d51-dddf-43e3-a230-9ac729dec435\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.661594 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04053d51-dddf-43e3-a230-9ac729dec435-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"04053d51-dddf-43e3-a230-9ac729dec435\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.661866 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04053d51-dddf-43e3-a230-9ac729dec435-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"04053d51-dddf-43e3-a230-9ac729dec435\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.677592 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6xk5\" (UniqueName: \"kubernetes.io/projected/04053d51-dddf-43e3-a230-9ac729dec435-kube-api-access-x6xk5\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"04053d51-dddf-43e3-a230-9ac729dec435\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.679517 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/04053d51-dddf-43e3-a230-9ac729dec435-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: 
\"04053d51-dddf-43e3-a230-9ac729dec435\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.680561 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/04053d51-dddf-43e3-a230-9ac729dec435-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"04053d51-dddf-43e3-a230-9ac729dec435\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.680878 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/04053d51-dddf-43e3-a230-9ac729dec435-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"04053d51-dddf-43e3-a230-9ac729dec435\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.688206 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"04053d51-dddf-43e3-a230-9ac729dec435\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.697412 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"04053d51-dddf-43e3-a230-9ac729dec435\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.731379 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.760093 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34351ff3-ea5e-403c-9d04-ca6777287cff-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"34351ff3-ea5e-403c-9d04-ca6777287cff\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.760165 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"34351ff3-ea5e-403c-9d04-ca6777287cff\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.760182 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mvt6\" (UniqueName: \"kubernetes.io/projected/34351ff3-ea5e-403c-9d04-ca6777287cff-kube-api-access-4mvt6\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"34351ff3-ea5e-403c-9d04-ca6777287cff\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.760204 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"85623735-2d6a-4d53-ac14-e4cd714ecc7b\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.760257 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34351ff3-ea5e-403c-9d04-ca6777287cff-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: 
\"34351ff3-ea5e-403c-9d04-ca6777287cff\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.760275 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/85623735-2d6a-4d53-ac14-e4cd714ecc7b-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"85623735-2d6a-4d53-ac14-e4cd714ecc7b\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.760306 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85623735-2d6a-4d53-ac14-e4cd714ecc7b-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"85623735-2d6a-4d53-ac14-e4cd714ecc7b\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.760338 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44h8q\" (UniqueName: \"kubernetes.io/projected/85623735-2d6a-4d53-ac14-e4cd714ecc7b-kube-api-access-44h8q\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"85623735-2d6a-4d53-ac14-e4cd714ecc7b\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.760361 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/34351ff3-ea5e-403c-9d04-ca6777287cff-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"34351ff3-ea5e-403c-9d04-ca6777287cff\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.760381 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/85623735-2d6a-4d53-ac14-e4cd714ecc7b-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"85623735-2d6a-4d53-ac14-e4cd714ecc7b\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.760408 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/85623735-2d6a-4d53-ac14-e4cd714ecc7b-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"85623735-2d6a-4d53-ac14-e4cd714ecc7b\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.760436 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/34351ff3-ea5e-403c-9d04-ca6777287cff-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"34351ff3-ea5e-403c-9d04-ca6777287cff\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.760483 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/34351ff3-ea5e-403c-9d04-ca6777287cff-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"34351ff3-ea5e-403c-9d04-ca6777287cff\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.760504 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85623735-2d6a-4d53-ac14-e4cd714ecc7b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"85623735-2d6a-4d53-ac14-e4cd714ecc7b\") " 
pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.760643 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"34351ff3-ea5e-403c-9d04-ca6777287cff\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.761547 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34351ff3-ea5e-403c-9d04-ca6777287cff-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"34351ff3-ea5e-403c-9d04-ca6777287cff\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.762662 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34351ff3-ea5e-403c-9d04-ca6777287cff-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"34351ff3-ea5e-403c-9d04-ca6777287cff\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.764567 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/34351ff3-ea5e-403c-9d04-ca6777287cff-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"34351ff3-ea5e-403c-9d04-ca6777287cff\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.765136 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/34351ff3-ea5e-403c-9d04-ca6777287cff-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: 
\"34351ff3-ea5e-403c-9d04-ca6777287cff\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.765792 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/34351ff3-ea5e-403c-9d04-ca6777287cff-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"34351ff3-ea5e-403c-9d04-ca6777287cff\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.778958 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mvt6\" (UniqueName: \"kubernetes.io/projected/34351ff3-ea5e-403c-9d04-ca6777287cff-kube-api-access-4mvt6\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"34351ff3-ea5e-403c-9d04-ca6777287cff\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.798139 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"34351ff3-ea5e-403c-9d04-ca6777287cff\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.832794 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.865455 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"85623735-2d6a-4d53-ac14-e4cd714ecc7b\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.865794 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/85623735-2d6a-4d53-ac14-e4cd714ecc7b-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"85623735-2d6a-4d53-ac14-e4cd714ecc7b\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.865913 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85623735-2d6a-4d53-ac14-e4cd714ecc7b-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"85623735-2d6a-4d53-ac14-e4cd714ecc7b\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.866017 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44h8q\" (UniqueName: \"kubernetes.io/projected/85623735-2d6a-4d53-ac14-e4cd714ecc7b-kube-api-access-44h8q\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"85623735-2d6a-4d53-ac14-e4cd714ecc7b\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.866126 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/85623735-2d6a-4d53-ac14-e4cd714ecc7b-cloudkitty-loki-s3\") pod 
\"cloudkitty-lokistack-index-gateway-0\" (UID: \"85623735-2d6a-4d53-ac14-e4cd714ecc7b\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.866248 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/85623735-2d6a-4d53-ac14-e4cd714ecc7b-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"85623735-2d6a-4d53-ac14-e4cd714ecc7b\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.866417 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85623735-2d6a-4d53-ac14-e4cd714ecc7b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"85623735-2d6a-4d53-ac14-e4cd714ecc7b\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.867630 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85623735-2d6a-4d53-ac14-e4cd714ecc7b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"85623735-2d6a-4d53-ac14-e4cd714ecc7b\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.867835 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"85623735-2d6a-4d53-ac14-e4cd714ecc7b\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.869061 4751 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85623735-2d6a-4d53-ac14-e4cd714ecc7b-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"85623735-2d6a-4d53-ac14-e4cd714ecc7b\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.877866 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/85623735-2d6a-4d53-ac14-e4cd714ecc7b-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"85623735-2d6a-4d53-ac14-e4cd714ecc7b\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.881096 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/85623735-2d6a-4d53-ac14-e4cd714ecc7b-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"85623735-2d6a-4d53-ac14-e4cd714ecc7b\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.886156 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/85623735-2d6a-4d53-ac14-e4cd714ecc7b-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"85623735-2d6a-4d53-ac14-e4cd714ecc7b\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.914763 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44h8q\" (UniqueName: \"kubernetes.io/projected/85623735-2d6a-4d53-ac14-e4cd714ecc7b-kube-api-access-44h8q\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"85623735-2d6a-4d53-ac14-e4cd714ecc7b\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 14:33:15 crc 
kubenswrapper[4751]: I1203 14:33:15.946662 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"85623735-2d6a-4d53-ac14-e4cd714ecc7b\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 14:33:15 crc kubenswrapper[4751]: I1203 14:33:15.970714 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 14:33:25 crc kubenswrapper[4751]: E1203 14:33:25.281380 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 03 14:33:25 crc kubenswrapper[4751]: E1203 14:33:25.282114 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x92zn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-w7xrr_openstack(064f0ca5-6cc9-414a-8c00-b7cd04c897e6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:33:25 crc kubenswrapper[4751]: E1203 14:33:25.283370 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-w7xrr" podUID="064f0ca5-6cc9-414a-8c00-b7cd04c897e6" Dec 03 14:33:25 crc kubenswrapper[4751]: E1203 14:33:25.304432 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 03 14:33:25 crc kubenswrapper[4751]: E1203 14:33:25.304856 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m2rts,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullP
olicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-lv6pt_openstack(72619134-e92c-4c33-8e15-c01c0774d887): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:33:25 crc kubenswrapper[4751]: E1203 14:33:25.306086 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-lv6pt" podUID="72619134-e92c-4c33-8e15-c01c0774d887" Dec 03 14:33:25 crc kubenswrapper[4751]: E1203 14:33:25.336299 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 03 14:33:25 crc kubenswrapper[4751]: E1203 14:33:25.336484 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c84xt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-ntkbm_openstack(ab23361b-079c-45ea-84a1-ee39f33d8578): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:33:25 crc kubenswrapper[4751]: E1203 14:33:25.337576 4751 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-ntkbm" podUID="ab23361b-079c-45ea-84a1-ee39f33d8578" Dec 03 14:33:25 crc kubenswrapper[4751]: E1203 14:33:25.342544 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 03 14:33:25 crc kubenswrapper[4751]: E1203 14:33:25.342691 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c6qgq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-7b6hx_openstack(9f787117-ce7c-4015-9c05-c682ca5db682): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:33:25 crc kubenswrapper[4751]: E1203 14:33:25.345111 4751 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-7b6hx" podUID="9f787117-ce7c-4015-9c05-c682ca5db682" Dec 03 14:33:25 crc kubenswrapper[4751]: I1203 14:33:25.677142 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 14:33:25 crc kubenswrapper[4751]: I1203 14:33:25.891693 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"3dc63449-cac9-48bc-abb7-3ff350a408cf","Type":"ContainerStarted","Data":"64fc1db55a01a63c01a96c8f6a5b7a3f4d6ec5505c17ba09511073e326f8790f"} Dec 03 14:33:25 crc kubenswrapper[4751]: I1203 14:33:25.911034 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 03 14:33:25 crc kubenswrapper[4751]: E1203 14:33:25.941537 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-ntkbm" podUID="ab23361b-079c-45ea-84a1-ee39f33d8578" Dec 03 14:33:25 crc kubenswrapper[4751]: E1203 14:33:25.941599 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-lv6pt" podUID="72619134-e92c-4c33-8e15-c01c0774d887" Dec 03 14:33:26 crc kubenswrapper[4751]: W1203 14:33:26.112454 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda45965be_01f0_4c6d_9db8_08b5e5564c5a.slice/crio-72bad347b1f5fe6a2f16004fc251412c0bbfea98a1d647ccd37e62b5d67a0c9c WatchSource:0}: Error 
finding container 72bad347b1f5fe6a2f16004fc251412c0bbfea98a1d647ccd37e62b5d67a0c9c: Status 404 returned error can't find the container with id 72bad347b1f5fe6a2f16004fc251412c0bbfea98a1d647ccd37e62b5d67a0c9c Dec 03 14:33:26 crc kubenswrapper[4751]: I1203 14:33:26.671740 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 14:33:26 crc kubenswrapper[4751]: I1203 14:33:26.731734 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 03 14:33:26 crc kubenswrapper[4751]: W1203 14:33:26.736087 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1da8e9b_0799_4327_9e24_216c4a51fde2.slice/crio-716307fec7da66fb79e65eeee577d64d532055f9324eb1c9e6b495b8e85e94df WatchSource:0}: Error finding container 716307fec7da66fb79e65eeee577d64d532055f9324eb1c9e6b495b8e85e94df: Status 404 returned error can't find the container with id 716307fec7da66fb79e65eeee577d64d532055f9324eb1c9e6b495b8e85e94df Dec 03 14:33:26 crc kubenswrapper[4751]: I1203 14:33:26.745543 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 14:33:26 crc kubenswrapper[4751]: I1203 14:33:26.752751 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 14:33:26 crc kubenswrapper[4751]: I1203 14:33:26.808020 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-w7xrr" Dec 03 14:33:26 crc kubenswrapper[4751]: I1203 14:33:26.823518 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-7b6hx" Dec 03 14:33:26 crc kubenswrapper[4751]: I1203 14:33:26.835949 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 14:33:26 crc kubenswrapper[4751]: I1203 14:33:26.868381 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f787117-ce7c-4015-9c05-c682ca5db682-dns-svc\") pod \"9f787117-ce7c-4015-9c05-c682ca5db682\" (UID: \"9f787117-ce7c-4015-9c05-c682ca5db682\") " Dec 03 14:33:26 crc kubenswrapper[4751]: I1203 14:33:26.868545 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f787117-ce7c-4015-9c05-c682ca5db682-config\") pod \"9f787117-ce7c-4015-9c05-c682ca5db682\" (UID: \"9f787117-ce7c-4015-9c05-c682ca5db682\") " Dec 03 14:33:26 crc kubenswrapper[4751]: I1203 14:33:26.868606 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x92zn\" (UniqueName: \"kubernetes.io/projected/064f0ca5-6cc9-414a-8c00-b7cd04c897e6-kube-api-access-x92zn\") pod \"064f0ca5-6cc9-414a-8c00-b7cd04c897e6\" (UID: \"064f0ca5-6cc9-414a-8c00-b7cd04c897e6\") " Dec 03 14:33:26 crc kubenswrapper[4751]: I1203 14:33:26.868657 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6qgq\" (UniqueName: \"kubernetes.io/projected/9f787117-ce7c-4015-9c05-c682ca5db682-kube-api-access-c6qgq\") pod \"9f787117-ce7c-4015-9c05-c682ca5db682\" (UID: \"9f787117-ce7c-4015-9c05-c682ca5db682\") " Dec 03 14:33:26 crc kubenswrapper[4751]: I1203 14:33:26.868703 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/064f0ca5-6cc9-414a-8c00-b7cd04c897e6-config\") pod \"064f0ca5-6cc9-414a-8c00-b7cd04c897e6\" (UID: \"064f0ca5-6cc9-414a-8c00-b7cd04c897e6\") " Dec 03 14:33:26 crc 
kubenswrapper[4751]: I1203 14:33:26.868970 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f787117-ce7c-4015-9c05-c682ca5db682-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9f787117-ce7c-4015-9c05-c682ca5db682" (UID: "9f787117-ce7c-4015-9c05-c682ca5db682"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:33:26 crc kubenswrapper[4751]: I1203 14:33:26.869589 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f787117-ce7c-4015-9c05-c682ca5db682-config" (OuterVolumeSpecName: "config") pod "9f787117-ce7c-4015-9c05-c682ca5db682" (UID: "9f787117-ce7c-4015-9c05-c682ca5db682"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:33:26 crc kubenswrapper[4751]: I1203 14:33:26.869749 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/064f0ca5-6cc9-414a-8c00-b7cd04c897e6-config" (OuterVolumeSpecName: "config") pod "064f0ca5-6cc9-414a-8c00-b7cd04c897e6" (UID: "064f0ca5-6cc9-414a-8c00-b7cd04c897e6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:33:26 crc kubenswrapper[4751]: I1203 14:33:26.874505 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/064f0ca5-6cc9-414a-8c00-b7cd04c897e6-kube-api-access-x92zn" (OuterVolumeSpecName: "kube-api-access-x92zn") pod "064f0ca5-6cc9-414a-8c00-b7cd04c897e6" (UID: "064f0ca5-6cc9-414a-8c00-b7cd04c897e6"). InnerVolumeSpecName "kube-api-access-x92zn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:33:26 crc kubenswrapper[4751]: I1203 14:33:26.875060 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f787117-ce7c-4015-9c05-c682ca5db682-kube-api-access-c6qgq" (OuterVolumeSpecName: "kube-api-access-c6qgq") pod "9f787117-ce7c-4015-9c05-c682ca5db682" (UID: "9f787117-ce7c-4015-9c05-c682ca5db682"). InnerVolumeSpecName "kube-api-access-c6qgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:33:26 crc kubenswrapper[4751]: I1203 14:33:26.899518 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"80f132c3-7b27-4d3d-950e-9c6aa887b6a7","Type":"ContainerStarted","Data":"17ecc36cc88f6b75871140183dc5b60c8a66fa6f3f7bbf503e944444b8215309"} Dec 03 14:33:26 crc kubenswrapper[4751]: I1203 14:33:26.900690 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"05d18e1b-04cd-4b4a-a728-bdbc9c2ab713","Type":"ContainerStarted","Data":"437e8005404fa66b9c7723b17c51222f046da554f9644b58b1fee216c51bfbaf"} Dec 03 14:33:26 crc kubenswrapper[4751]: I1203 14:33:26.901689 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e1da8e9b-0799-4327-9e24-216c4a51fde2","Type":"ContainerStarted","Data":"716307fec7da66fb79e65eeee577d64d532055f9324eb1c9e6b495b8e85e94df"} Dec 03 14:33:26 crc kubenswrapper[4751]: I1203 14:33:26.903010 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9de5857f-8fe8-48e3-991b-7171fc510567","Type":"ContainerStarted","Data":"335fa198823f6346aec8e3791f5248ac12a0eafe2f8d7c0122d9dc8c51cf8839"} Dec 03 14:33:26 crc kubenswrapper[4751]: I1203 14:33:26.905569 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-w7xrr" 
event={"ID":"064f0ca5-6cc9-414a-8c00-b7cd04c897e6","Type":"ContainerDied","Data":"ad557b0120ebe400593e7008dcf85cd657fa8a2af84c69643522562fdbec9621"} Dec 03 14:33:26 crc kubenswrapper[4751]: I1203 14:33:26.905625 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-w7xrr" Dec 03 14:33:26 crc kubenswrapper[4751]: I1203 14:33:26.911304 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a45965be-01f0-4c6d-9db8-08b5e5564c5a","Type":"ContainerStarted","Data":"72bad347b1f5fe6a2f16004fc251412c0bbfea98a1d647ccd37e62b5d67a0c9c"} Dec 03 14:33:26 crc kubenswrapper[4751]: I1203 14:33:26.913234 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-7b6hx" Dec 03 14:33:26 crc kubenswrapper[4751]: I1203 14:33:26.913248 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-7b6hx" event={"ID":"9f787117-ce7c-4015-9c05-c682ca5db682","Type":"ContainerDied","Data":"34d6d25b397f915ce7257f74940b7adb86160a32c87aebb6f79602401b065141"} Dec 03 14:33:26 crc kubenswrapper[4751]: I1203 14:33:26.916053 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0046f111-cf94-402b-8981-659978aace04","Type":"ContainerStarted","Data":"59b91db206acc159ec4b9e9b736c6533da35c26dac76815c7811d643bc19cf29"} Dec 03 14:33:26 crc kubenswrapper[4751]: I1203 14:33:26.971180 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f787117-ce7c-4015-9c05-c682ca5db682-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:33:26 crc kubenswrapper[4751]: I1203 14:33:26.971206 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x92zn\" (UniqueName: \"kubernetes.io/projected/064f0ca5-6cc9-414a-8c00-b7cd04c897e6-kube-api-access-x92zn\") on node \"crc\" DevicePath \"\"" Dec 03 14:33:26 
crc kubenswrapper[4751]: I1203 14:33:26.971216 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6qgq\" (UniqueName: \"kubernetes.io/projected/9f787117-ce7c-4015-9c05-c682ca5db682-kube-api-access-c6qgq\") on node \"crc\" DevicePath \"\"" Dec 03 14:33:26 crc kubenswrapper[4751]: I1203 14:33:26.971311 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/064f0ca5-6cc9-414a-8c00-b7cd04c897e6-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:33:26 crc kubenswrapper[4751]: I1203 14:33:26.971340 4751 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f787117-ce7c-4015-9c05-c682ca5db682-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 14:33:26 crc kubenswrapper[4751]: I1203 14:33:26.976464 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7b6hx"] Dec 03 14:33:26 crc kubenswrapper[4751]: I1203 14:33:26.984069 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7b6hx"] Dec 03 14:33:27 crc kubenswrapper[4751]: I1203 14:33:27.017446 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-w7xrr"] Dec 03 14:33:27 crc kubenswrapper[4751]: I1203 14:33:27.024024 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-w7xrr"] Dec 03 14:33:27 crc kubenswrapper[4751]: I1203 14:33:27.346132 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="064f0ca5-6cc9-414a-8c00-b7cd04c897e6" path="/var/lib/kubelet/pods/064f0ca5-6cc9-414a-8c00-b7cd04c897e6/volumes" Dec 03 14:33:27 crc kubenswrapper[4751]: I1203 14:33:27.346516 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f787117-ce7c-4015-9c05-c682ca5db682" path="/var/lib/kubelet/pods/9f787117-ce7c-4015-9c05-c682ca5db682/volumes" Dec 03 14:33:27 crc kubenswrapper[4751]: I1203 14:33:27.346850 4751 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Dec 03 14:33:27 crc kubenswrapper[4751]: I1203 14:33:27.347581 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 03 14:33:27 crc kubenswrapper[4751]: W1203 14:33:27.370342 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2d4448e_9181_494b_bec0_12da338b184d.slice/crio-dce23f3a80842e6b5ce93d38d3c47b82394a2c5116c5684d378d3bc23c71fce1 WatchSource:0}: Error finding container dce23f3a80842e6b5ce93d38d3c47b82394a2c5116c5684d378d3bc23c71fce1: Status 404 returned error can't find the container with id dce23f3a80842e6b5ce93d38d3c47b82394a2c5116c5684d378d3bc23c71fce1 Dec 03 14:33:27 crc kubenswrapper[4751]: I1203 14:33:27.383774 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Dec 03 14:33:27 crc kubenswrapper[4751]: I1203 14:33:27.389855 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 14:33:27 crc kubenswrapper[4751]: I1203 14:33:27.397861 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lqzrd"] Dec 03 14:33:27 crc kubenswrapper[4751]: I1203 14:33:27.409808 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b"] Dec 03 14:33:27 crc kubenswrapper[4751]: I1203 14:33:27.416651 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-779849886d-r2j44"] Dec 03 14:33:27 crc kubenswrapper[4751]: I1203 14:33:27.432824 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-56cd74f89f-xg9ml"] Dec 03 14:33:27 crc kubenswrapper[4751]: I1203 14:33:27.438532 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cloudkitty-lokistack-querier-548665d79b-8226l"] Dec 03 14:33:27 crc kubenswrapper[4751]: I1203 14:33:27.497374 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Dec 03 14:33:27 crc kubenswrapper[4751]: I1203 14:33:27.512968 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws"] Dec 03 14:33:27 crc kubenswrapper[4751]: I1203 14:33:27.610522 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 14:33:27 crc kubenswrapper[4751]: W1203 14:33:27.835145 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ab1fa90_b8eb_405d_803d_b9fd84939289.slice/crio-cadc90e2a3a3c2e6495823c6934031879e192e734e8108401ef8dae63930518b WatchSource:0}: Error finding container cadc90e2a3a3c2e6495823c6934031879e192e734e8108401ef8dae63930518b: Status 404 returned error can't find the container with id cadc90e2a3a3c2e6495823c6934031879e192e734e8108401ef8dae63930518b Dec 03 14:33:27 crc kubenswrapper[4751]: W1203 14:33:27.856920 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1c24fdf_0c9e_458f_9803_87e9d6c3161f.slice/crio-efbc3f05675260db5863769c1cb9523e492eb4afece6e2300780b35f090276f8 WatchSource:0}: Error finding container efbc3f05675260db5863769c1cb9523e492eb4afece6e2300780b35f090276f8: Status 404 returned error can't find the container with id efbc3f05675260db5863769c1cb9523e492eb4afece6e2300780b35f090276f8 Dec 03 14:33:27 crc kubenswrapper[4751]: W1203 14:33:27.861540 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04053d51_dddf_43e3_a230_9ac729dec435.slice/crio-949e854481ae74ed5e9362f8381e325fb7af180762ea44a46abf9d45aae8285d WatchSource:0}: Error finding container 
949e854481ae74ed5e9362f8381e325fb7af180762ea44a46abf9d45aae8285d: Status 404 returned error can't find the container with id 949e854481ae74ed5e9362f8381e325fb7af180762ea44a46abf9d45aae8285d Dec 03 14:33:27 crc kubenswrapper[4751]: W1203 14:33:27.865709 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0a4db08_d9ca_4d0a_aad6_33dd6f300c3b.slice/crio-60d40e486130cec04162ff9fa9b69280eeb7a361ca8cb1c31a1944fd6412bed5 WatchSource:0}: Error finding container 60d40e486130cec04162ff9fa9b69280eeb7a361ca8cb1c31a1944fd6412bed5: Status 404 returned error can't find the container with id 60d40e486130cec04162ff9fa9b69280eeb7a361ca8cb1c31a1944fd6412bed5 Dec 03 14:33:27 crc kubenswrapper[4751]: E1203 14:33:27.866139 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:loki-ingester,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:14f37195a4957e3848690d0ffe5422be55f7599b30dfe1ee0f97eb1118a10a51,Command:[],Args:[-target=ingester -config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml 
-config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:gossip-ring,HostPort:0,ContainerPort:7946,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:AWS_ACCESS_KEY_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_id,Optional:nil,},},},EnvVar{Name:AWS_ACCESS_KEY_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_secret,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:storage,ReadOnly:false,MountPath:/tmp/loki,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:wal,ReadOnly:false,MountPath:/tmp/wal,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ingester-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-loki-s3,ReadOnly:false,MountPath:/etc/storage/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ingester-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca
,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x6xk5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-ingester-0_openstack(04053d51-dddf-43e3-a230-9ac729dec435): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 14:33:27 crc kubenswrapper[4751]: E1203 14:33:27.867668 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-ingester\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="04053d51-dddf-43e3-a230-9ac729dec435" Dec 03 14:33:27 crc kubenswrapper[4751]: E1203 
14:33:27.870342 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovsdbserver-nb,Image:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nc4h7fh646hd4h76h558h8h68dhfchfh598h597h687h5b7hb5h649h5cbhc9h65ch57ch659h554h678h678hb5h5dh85h567h58bh7h5b8h679q,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-nb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gw5z8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,Recursiv
eReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-nb-0_openstack(c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 14:33:27 crc kubenswrapper[4751]: E1203 14:33:27.873001 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:openstack-network-exporter,Image:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,Command:[/app/openstack-network-exporter],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPENSTACK_NETWORK_EXPORTER_YAML,Value:/etc/config/openstack-network-exporter.yaml,ValueFrom:nil,},EnvVar{Name:CONFIG_HASH,Value:nc4h7fh646hd4h76h558h8h68dhfchfh598h597h687h5b7hb5h649h5cbhc9h65ch57ch659h554h678h678hb5h5dh85h567h58bh7h5b8h679q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovnmetrics.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovnmetrics.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gw5z8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil
,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-nb-0_openstack(c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 14:33:27 crc kubenswrapper[4751]: E1203 14:33:27.874102 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ovsdbserver-nb\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"openstack-network-exporter\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack/ovsdbserver-nb-0" podUID="c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b" Dec 03 14:33:27 crc kubenswrapper[4751]: I1203 14:33:27.933545 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"a8a7a20c-88be-4cca-a10d-8ac9a898f090","Type":"ContainerStarted","Data":"f9696dd23dbc05b3c52fd6d553e2390f60c53f35b92f8ea0e036ffaabdc73846"} Dec 03 14:33:27 crc kubenswrapper[4751]: I1203 14:33:27.935472 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"04053d51-dddf-43e3-a230-9ac729dec435","Type":"ContainerStarted","Data":"949e854481ae74ed5e9362f8381e325fb7af180762ea44a46abf9d45aae8285d"} Dec 03 14:33:27 crc kubenswrapper[4751]: E1203 14:33:27.936964 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-ingester\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:14f37195a4957e3848690d0ffe5422be55f7599b30dfe1ee0f97eb1118a10a51\\\"\"" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="04053d51-dddf-43e3-a230-9ac729dec435" Dec 03 14:33:27 crc kubenswrapper[4751]: I1203 14:33:27.938718 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"34351ff3-ea5e-403c-9d04-ca6777287cff","Type":"ContainerStarted","Data":"b1562dcd6fb16990078934adcdc93b2691a314d5160fdf63ed3b6fbbd36d66c3"} Dec 03 14:33:27 crc kubenswrapper[4751]: I1203 14:33:27.943596 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b","Type":"ContainerStarted","Data":"60d40e486130cec04162ff9fa9b69280eeb7a361ca8cb1c31a1944fd6412bed5"} Dec 03 14:33:27 crc kubenswrapper[4751]: I1203 14:33:27.946179 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"47b63367-ad69-4428-9c79-8eee86b817ac","Type":"ContainerStarted","Data":"92cf114a0ab17f58ea03830a26be5b6cc88351062787af4415153fa0b39929da"} Dec 03 14:33:27 crc kubenswrapper[4751]: I1203 14:33:27.948466 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-548665d79b-8226l" event={"ID":"4797e85e-ad67-454b-b210-25f5481780c5","Type":"ContainerStarted","Data":"401b7384e3881bcf7a93c26bc7c513f2bbb0b6e8536dda91ed5f675cd37ec72b"} Dec 03 14:33:27 crc kubenswrapper[4751]: I1203 14:33:27.950772 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"85623735-2d6a-4d53-ac14-e4cd714ecc7b","Type":"ContainerStarted","Data":"498ba899b999a77e2763381124c4ef307af92c9e3e49dddb70fb0a14c4d5582e"} Dec 03 14:33:27 crc kubenswrapper[4751]: E1203 14:33:27.951148 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified\\\"\", failed to \"StartContainer\" for \"openstack-network-exporter\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified\\\"\"]" 
pod="openstack/ovsdbserver-nb-0" podUID="c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b" Dec 03 14:33:27 crc kubenswrapper[4751]: I1203 14:33:27.952237 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lqzrd" event={"ID":"7ab1fa90-b8eb-405d-803d-b9fd84939289","Type":"ContainerStarted","Data":"cadc90e2a3a3c2e6495823c6934031879e192e734e8108401ef8dae63930518b"} Dec 03 14:33:27 crc kubenswrapper[4751]: I1203 14:33:27.954953 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" event={"ID":"5a964492-a736-427e-b81a-d6d863d0eaaf","Type":"ContainerStarted","Data":"f3cc92f82cb3582b73b2c1024272c8b4942e19f822d4ebb0ea07fdd059afff6a"} Dec 03 14:33:27 crc kubenswrapper[4751]: I1203 14:33:27.956794 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-r2j44" event={"ID":"ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa","Type":"ContainerStarted","Data":"9d67eef393d021435b22314f10d1c5addda8e873263baf846ceb09046fad615d"} Dec 03 14:33:27 crc kubenswrapper[4751]: I1203 14:33:27.972111 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" event={"ID":"c1c24fdf-0c9e-458f-9803-87e9d6c3161f","Type":"ContainerStarted","Data":"efbc3f05675260db5863769c1cb9523e492eb4afece6e2300780b35f090276f8"} Dec 03 14:33:27 crc kubenswrapper[4751]: I1203 14:33:27.973955 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-xg9ml" event={"ID":"e2d4448e-9181-494b-bec0-12da338b184d","Type":"ContainerStarted","Data":"dce23f3a80842e6b5ce93d38d3c47b82394a2c5116c5684d378d3bc23c71fce1"} Dec 03 14:33:28 crc kubenswrapper[4751]: I1203 14:33:28.684509 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-dzz9c"] Dec 03 14:33:28 crc kubenswrapper[4751]: W1203 14:33:28.937407 4751 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3faee7be_8b53_42b6_90fd_ba62998f9ced.slice/crio-82ca4d6da3f6657290ff87c57ced6bf644d13e4f6939af583a1121ed7073e175 WatchSource:0}: Error finding container 82ca4d6da3f6657290ff87c57ced6bf644d13e4f6939af583a1121ed7073e175: Status 404 returned error can't find the container with id 82ca4d6da3f6657290ff87c57ced6bf644d13e4f6939af583a1121ed7073e175 Dec 03 14:33:28 crc kubenswrapper[4751]: I1203 14:33:28.984229 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dzz9c" event={"ID":"3faee7be-8b53-42b6-90fd-ba62998f9ced","Type":"ContainerStarted","Data":"82ca4d6da3f6657290ff87c57ced6bf644d13e4f6939af583a1121ed7073e175"} Dec 03 14:33:28 crc kubenswrapper[4751]: E1203 14:33:28.986169 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-ingester\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:14f37195a4957e3848690d0ffe5422be55f7599b30dfe1ee0f97eb1118a10a51\\\"\"" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="04053d51-dddf-43e3-a230-9ac729dec435" Dec 03 14:33:28 crc kubenswrapper[4751]: E1203 14:33:28.986957 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified\\\"\", failed to \"StartContainer\" for \"openstack-network-exporter\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified\\\"\"]" pod="openstack/ovsdbserver-nb-0" podUID="c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.492090 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-98hst"] Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 
14:33:33.493641 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-98hst" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.495227 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.524870 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-98hst"] Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.598290 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/aa72a067-0544-4a0c-8750-c3d76221d4f2-ovn-rundir\") pod \"ovn-controller-metrics-98hst\" (UID: \"aa72a067-0544-4a0c-8750-c3d76221d4f2\") " pod="openstack/ovn-controller-metrics-98hst" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.598376 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/aa72a067-0544-4a0c-8750-c3d76221d4f2-ovs-rundir\") pod \"ovn-controller-metrics-98hst\" (UID: \"aa72a067-0544-4a0c-8750-c3d76221d4f2\") " pod="openstack/ovn-controller-metrics-98hst" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.598706 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa72a067-0544-4a0c-8750-c3d76221d4f2-config\") pod \"ovn-controller-metrics-98hst\" (UID: \"aa72a067-0544-4a0c-8750-c3d76221d4f2\") " pod="openstack/ovn-controller-metrics-98hst" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.598793 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa72a067-0544-4a0c-8750-c3d76221d4f2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-98hst\" 
(UID: \"aa72a067-0544-4a0c-8750-c3d76221d4f2\") " pod="openstack/ovn-controller-metrics-98hst" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.598840 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa72a067-0544-4a0c-8750-c3d76221d4f2-combined-ca-bundle\") pod \"ovn-controller-metrics-98hst\" (UID: \"aa72a067-0544-4a0c-8750-c3d76221d4f2\") " pod="openstack/ovn-controller-metrics-98hst" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.598955 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4465\" (UniqueName: \"kubernetes.io/projected/aa72a067-0544-4a0c-8750-c3d76221d4f2-kube-api-access-m4465\") pod \"ovn-controller-metrics-98hst\" (UID: \"aa72a067-0544-4a0c-8750-c3d76221d4f2\") " pod="openstack/ovn-controller-metrics-98hst" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.667440 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ntkbm"] Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.700349 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4465\" (UniqueName: \"kubernetes.io/projected/aa72a067-0544-4a0c-8750-c3d76221d4f2-kube-api-access-m4465\") pod \"ovn-controller-metrics-98hst\" (UID: \"aa72a067-0544-4a0c-8750-c3d76221d4f2\") " pod="openstack/ovn-controller-metrics-98hst" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.700437 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/aa72a067-0544-4a0c-8750-c3d76221d4f2-ovn-rundir\") pod \"ovn-controller-metrics-98hst\" (UID: \"aa72a067-0544-4a0c-8750-c3d76221d4f2\") " pod="openstack/ovn-controller-metrics-98hst" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.700484 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/aa72a067-0544-4a0c-8750-c3d76221d4f2-ovs-rundir\") pod \"ovn-controller-metrics-98hst\" (UID: \"aa72a067-0544-4a0c-8750-c3d76221d4f2\") " pod="openstack/ovn-controller-metrics-98hst" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.700546 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa72a067-0544-4a0c-8750-c3d76221d4f2-config\") pod \"ovn-controller-metrics-98hst\" (UID: \"aa72a067-0544-4a0c-8750-c3d76221d4f2\") " pod="openstack/ovn-controller-metrics-98hst" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.700613 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa72a067-0544-4a0c-8750-c3d76221d4f2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-98hst\" (UID: \"aa72a067-0544-4a0c-8750-c3d76221d4f2\") " pod="openstack/ovn-controller-metrics-98hst" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.700641 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa72a067-0544-4a0c-8750-c3d76221d4f2-combined-ca-bundle\") pod \"ovn-controller-metrics-98hst\" (UID: \"aa72a067-0544-4a0c-8750-c3d76221d4f2\") " pod="openstack/ovn-controller-metrics-98hst" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.701608 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/aa72a067-0544-4a0c-8750-c3d76221d4f2-ovs-rundir\") pod \"ovn-controller-metrics-98hst\" (UID: \"aa72a067-0544-4a0c-8750-c3d76221d4f2\") " pod="openstack/ovn-controller-metrics-98hst" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.701692 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/host-path/aa72a067-0544-4a0c-8750-c3d76221d4f2-ovn-rundir\") pod \"ovn-controller-metrics-98hst\" (UID: \"aa72a067-0544-4a0c-8750-c3d76221d4f2\") " pod="openstack/ovn-controller-metrics-98hst" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.702342 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa72a067-0544-4a0c-8750-c3d76221d4f2-config\") pod \"ovn-controller-metrics-98hst\" (UID: \"aa72a067-0544-4a0c-8750-c3d76221d4f2\") " pod="openstack/ovn-controller-metrics-98hst" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.707832 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-57s2g"] Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.708130 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa72a067-0544-4a0c-8750-c3d76221d4f2-combined-ca-bundle\") pod \"ovn-controller-metrics-98hst\" (UID: \"aa72a067-0544-4a0c-8750-c3d76221d4f2\") " pod="openstack/ovn-controller-metrics-98hst" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.709736 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-57s2g" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.709756 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa72a067-0544-4a0c-8750-c3d76221d4f2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-98hst\" (UID: \"aa72a067-0544-4a0c-8750-c3d76221d4f2\") " pod="openstack/ovn-controller-metrics-98hst" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.713514 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.719719 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-57s2g"] Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.731444 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4465\" (UniqueName: \"kubernetes.io/projected/aa72a067-0544-4a0c-8750-c3d76221d4f2-kube-api-access-m4465\") pod \"ovn-controller-metrics-98hst\" (UID: \"aa72a067-0544-4a0c-8750-c3d76221d4f2\") " pod="openstack/ovn-controller-metrics-98hst" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.803052 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbfwk\" (UniqueName: \"kubernetes.io/projected/a3682596-4b7c-48b0-abec-da171121c1f9-kube-api-access-kbfwk\") pod \"dnsmasq-dns-5bf47b49b7-57s2g\" (UID: \"a3682596-4b7c-48b0-abec-da171121c1f9\") " pod="openstack/dnsmasq-dns-5bf47b49b7-57s2g" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.803130 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3682596-4b7c-48b0-abec-da171121c1f9-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-57s2g\" (UID: \"a3682596-4b7c-48b0-abec-da171121c1f9\") " 
pod="openstack/dnsmasq-dns-5bf47b49b7-57s2g" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.803171 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3682596-4b7c-48b0-abec-da171121c1f9-config\") pod \"dnsmasq-dns-5bf47b49b7-57s2g\" (UID: \"a3682596-4b7c-48b0-abec-da171121c1f9\") " pod="openstack/dnsmasq-dns-5bf47b49b7-57s2g" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.803203 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3682596-4b7c-48b0-abec-da171121c1f9-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-57s2g\" (UID: \"a3682596-4b7c-48b0-abec-da171121c1f9\") " pod="openstack/dnsmasq-dns-5bf47b49b7-57s2g" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.827731 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-98hst" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.911178 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3682596-4b7c-48b0-abec-da171121c1f9-config\") pod \"dnsmasq-dns-5bf47b49b7-57s2g\" (UID: \"a3682596-4b7c-48b0-abec-da171121c1f9\") " pod="openstack/dnsmasq-dns-5bf47b49b7-57s2g" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.911240 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3682596-4b7c-48b0-abec-da171121c1f9-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-57s2g\" (UID: \"a3682596-4b7c-48b0-abec-da171121c1f9\") " pod="openstack/dnsmasq-dns-5bf47b49b7-57s2g" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.911381 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbfwk\" (UniqueName: 
\"kubernetes.io/projected/a3682596-4b7c-48b0-abec-da171121c1f9-kube-api-access-kbfwk\") pod \"dnsmasq-dns-5bf47b49b7-57s2g\" (UID: \"a3682596-4b7c-48b0-abec-da171121c1f9\") " pod="openstack/dnsmasq-dns-5bf47b49b7-57s2g" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.911463 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3682596-4b7c-48b0-abec-da171121c1f9-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-57s2g\" (UID: \"a3682596-4b7c-48b0-abec-da171121c1f9\") " pod="openstack/dnsmasq-dns-5bf47b49b7-57s2g" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.921251 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3682596-4b7c-48b0-abec-da171121c1f9-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-57s2g\" (UID: \"a3682596-4b7c-48b0-abec-da171121c1f9\") " pod="openstack/dnsmasq-dns-5bf47b49b7-57s2g" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.953352 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbfwk\" (UniqueName: \"kubernetes.io/projected/a3682596-4b7c-48b0-abec-da171121c1f9-kube-api-access-kbfwk\") pod \"dnsmasq-dns-5bf47b49b7-57s2g\" (UID: \"a3682596-4b7c-48b0-abec-da171121c1f9\") " pod="openstack/dnsmasq-dns-5bf47b49b7-57s2g" Dec 03 14:33:33 crc kubenswrapper[4751]: I1203 14:33:33.956391 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3682596-4b7c-48b0-abec-da171121c1f9-config\") pod \"dnsmasq-dns-5bf47b49b7-57s2g\" (UID: \"a3682596-4b7c-48b0-abec-da171121c1f9\") " pod="openstack/dnsmasq-dns-5bf47b49b7-57s2g" Dec 03 14:33:34 crc kubenswrapper[4751]: I1203 14:33:34.008659 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-lv6pt"] Dec 03 14:33:34 crc kubenswrapper[4751]: I1203 14:33:34.086478 4751 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-b8cts"] Dec 03 14:33:34 crc kubenswrapper[4751]: I1203 14:33:34.088019 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-b8cts" Dec 03 14:33:34 crc kubenswrapper[4751]: I1203 14:33:34.100665 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 03 14:33:34 crc kubenswrapper[4751]: I1203 14:33:34.127611 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-b8cts"] Dec 03 14:33:34 crc kubenswrapper[4751]: I1203 14:33:34.218338 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q429k\" (UniqueName: \"kubernetes.io/projected/9658e996-cafd-4e95-8412-4a3405de5127-kube-api-access-q429k\") pod \"dnsmasq-dns-8554648995-b8cts\" (UID: \"9658e996-cafd-4e95-8412-4a3405de5127\") " pod="openstack/dnsmasq-dns-8554648995-b8cts" Dec 03 14:33:34 crc kubenswrapper[4751]: I1203 14:33:34.218385 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9658e996-cafd-4e95-8412-4a3405de5127-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-b8cts\" (UID: \"9658e996-cafd-4e95-8412-4a3405de5127\") " pod="openstack/dnsmasq-dns-8554648995-b8cts" Dec 03 14:33:34 crc kubenswrapper[4751]: I1203 14:33:34.218407 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9658e996-cafd-4e95-8412-4a3405de5127-dns-svc\") pod \"dnsmasq-dns-8554648995-b8cts\" (UID: \"9658e996-cafd-4e95-8412-4a3405de5127\") " pod="openstack/dnsmasq-dns-8554648995-b8cts" Dec 03 14:33:34 crc kubenswrapper[4751]: I1203 14:33:34.218475 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9658e996-cafd-4e95-8412-4a3405de5127-config\") pod \"dnsmasq-dns-8554648995-b8cts\" (UID: \"9658e996-cafd-4e95-8412-4a3405de5127\") " pod="openstack/dnsmasq-dns-8554648995-b8cts" Dec 03 14:33:34 crc kubenswrapper[4751]: I1203 14:33:34.218667 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9658e996-cafd-4e95-8412-4a3405de5127-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-b8cts\" (UID: \"9658e996-cafd-4e95-8412-4a3405de5127\") " pod="openstack/dnsmasq-dns-8554648995-b8cts" Dec 03 14:33:34 crc kubenswrapper[4751]: I1203 14:33:34.309138 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3682596-4b7c-48b0-abec-da171121c1f9-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-57s2g\" (UID: \"a3682596-4b7c-48b0-abec-da171121c1f9\") " pod="openstack/dnsmasq-dns-5bf47b49b7-57s2g" Dec 03 14:33:34 crc kubenswrapper[4751]: I1203 14:33:34.320635 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9658e996-cafd-4e95-8412-4a3405de5127-config\") pod \"dnsmasq-dns-8554648995-b8cts\" (UID: \"9658e996-cafd-4e95-8412-4a3405de5127\") " pod="openstack/dnsmasq-dns-8554648995-b8cts" Dec 03 14:33:34 crc kubenswrapper[4751]: I1203 14:33:34.320757 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9658e996-cafd-4e95-8412-4a3405de5127-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-b8cts\" (UID: \"9658e996-cafd-4e95-8412-4a3405de5127\") " pod="openstack/dnsmasq-dns-8554648995-b8cts" Dec 03 14:33:34 crc kubenswrapper[4751]: I1203 14:33:34.320856 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q429k\" (UniqueName: 
\"kubernetes.io/projected/9658e996-cafd-4e95-8412-4a3405de5127-kube-api-access-q429k\") pod \"dnsmasq-dns-8554648995-b8cts\" (UID: \"9658e996-cafd-4e95-8412-4a3405de5127\") " pod="openstack/dnsmasq-dns-8554648995-b8cts" Dec 03 14:33:34 crc kubenswrapper[4751]: I1203 14:33:34.320878 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9658e996-cafd-4e95-8412-4a3405de5127-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-b8cts\" (UID: \"9658e996-cafd-4e95-8412-4a3405de5127\") " pod="openstack/dnsmasq-dns-8554648995-b8cts" Dec 03 14:33:34 crc kubenswrapper[4751]: I1203 14:33:34.320899 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9658e996-cafd-4e95-8412-4a3405de5127-dns-svc\") pod \"dnsmasq-dns-8554648995-b8cts\" (UID: \"9658e996-cafd-4e95-8412-4a3405de5127\") " pod="openstack/dnsmasq-dns-8554648995-b8cts" Dec 03 14:33:34 crc kubenswrapper[4751]: I1203 14:33:34.321827 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9658e996-cafd-4e95-8412-4a3405de5127-dns-svc\") pod \"dnsmasq-dns-8554648995-b8cts\" (UID: \"9658e996-cafd-4e95-8412-4a3405de5127\") " pod="openstack/dnsmasq-dns-8554648995-b8cts" Dec 03 14:33:34 crc kubenswrapper[4751]: I1203 14:33:34.322049 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9658e996-cafd-4e95-8412-4a3405de5127-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-b8cts\" (UID: \"9658e996-cafd-4e95-8412-4a3405de5127\") " pod="openstack/dnsmasq-dns-8554648995-b8cts" Dec 03 14:33:34 crc kubenswrapper[4751]: I1203 14:33:34.322627 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9658e996-cafd-4e95-8412-4a3405de5127-config\") pod 
\"dnsmasq-dns-8554648995-b8cts\" (UID: \"9658e996-cafd-4e95-8412-4a3405de5127\") " pod="openstack/dnsmasq-dns-8554648995-b8cts" Dec 03 14:33:34 crc kubenswrapper[4751]: I1203 14:33:34.323198 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9658e996-cafd-4e95-8412-4a3405de5127-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-b8cts\" (UID: \"9658e996-cafd-4e95-8412-4a3405de5127\") " pod="openstack/dnsmasq-dns-8554648995-b8cts" Dec 03 14:33:34 crc kubenswrapper[4751]: I1203 14:33:34.344591 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q429k\" (UniqueName: \"kubernetes.io/projected/9658e996-cafd-4e95-8412-4a3405de5127-kube-api-access-q429k\") pod \"dnsmasq-dns-8554648995-b8cts\" (UID: \"9658e996-cafd-4e95-8412-4a3405de5127\") " pod="openstack/dnsmasq-dns-8554648995-b8cts" Dec 03 14:33:34 crc kubenswrapper[4751]: I1203 14:33:34.402863 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-57s2g" Dec 03 14:33:34 crc kubenswrapper[4751]: I1203 14:33:34.459768 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-b8cts" Dec 03 14:33:35 crc kubenswrapper[4751]: I1203 14:33:35.819883 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:33:35 crc kubenswrapper[4751]: I1203 14:33:35.820291 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:33:40 crc kubenswrapper[4751]: E1203 14:33:40.518078 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:1133c973c7472c665f910a722e19c8e2e27accb34b90fab67f14548627ce9c62" Dec 03 14:33:40 crc kubenswrapper[4751]: E1203 14:33:40.518535 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init-config-reloader,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:1133c973c7472c665f910a722e19c8e2e27accb34b90fab67f14548627ce9c62,Command:[/bin/prometheus-config-reloader],Args:[--watch-interval=0 --listen-address=:8081 --config-file=/etc/prometheus/config/prometheus.yaml.gz --config-envsubst-file=/etc/prometheus/config_out/prometheus.env.yaml 
--watched-dir=/etc/prometheus/rules/prometheus-metric-storage-rulefiles-0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:reloader-init,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:SHARD,Value:0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/prometheus/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-out,ReadOnly:false,MountPath:/etc/prometheus/config_out,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-0,ReadOnly:false,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-0,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d9jnj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{}
,RestartPolicy:nil,} start failed in pod prometheus-metric-storage-0_openstack(9de5857f-8fe8-48e3-991b-7171fc510567): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 14:33:40 crc kubenswrapper[4751]: E1203 14:33:40.520017 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init-config-reloader\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/prometheus-metric-storage-0" podUID="9de5857f-8fe8-48e3-991b-7171fc510567" Dec 03 14:33:41 crc kubenswrapper[4751]: E1203 14:33:41.141943 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init-config-reloader\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:1133c973c7472c665f910a722e19c8e2e27accb34b90fab67f14548627ce9c62\\\"\"" pod="openstack/prometheus-metric-storage-0" podUID="9de5857f-8fe8-48e3-991b-7171fc510567" Dec 03 14:33:43 crc kubenswrapper[4751]: E1203 14:33:43.547099 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 03 14:33:43 crc kubenswrapper[4751]: E1203 14:33:43.547478 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 
's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bz4z2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,Stdin
Once:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(47b63367-ad69-4428-9c79-8eee86b817ac): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:33:43 crc kubenswrapper[4751]: E1203 14:33:43.548841 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="47b63367-ad69-4428-9c79-8eee86b817ac" Dec 03 14:33:43 crc kubenswrapper[4751]: E1203 14:33:43.596619 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Dec 03 14:33:43 crc kubenswrapper[4751]: E1203 14:33:43.596829 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6fb88,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
openstack-cell1-galera-0_openstack(3dc63449-cac9-48bc-abb7-3ff350a408cf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:33:43 crc kubenswrapper[4751]: E1203 14:33:43.598089 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="3dc63449-cac9-48bc-abb7-3ff350a408cf" Dec 03 14:33:44 crc kubenswrapper[4751]: E1203 14:33:44.163856 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="47b63367-ad69-4428-9c79-8eee86b817ac" Dec 03 14:33:44 crc kubenswrapper[4751]: E1203 14:33:44.164056 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="3dc63449-cac9-48bc-abb7-3ff350a408cf" Dec 03 14:33:45 crc kubenswrapper[4751]: E1203 14:33:45.215410 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = reading blob sha256:650ba1d1fb09dfb2a96481f2ebae84fbae4b3474d057dd6d1569ce094dc41659: Get \"https://quay.io/v2/podified-antelope-centos9/openstack-ovn-controller/blobs/sha256:650ba1d1fb09dfb2a96481f2ebae84fbae4b3474d057dd6d1569ce094dc41659\": context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified" Dec 03 14:33:45 crc kubenswrapper[4751]: E1203 14:33:45.215860 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ovn-controller,Image:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,Command:[ovn-controller --pidfile unix:/run/openvswitch/db.sock --certificate=/etc/pki/tls/certs/ovndb.crt --private-key=/etc/pki/tls/private/ovndb.key --ca-cert=/etc/pki/tls/certs/ovndbca.crt],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n567h88h68bh544hc5h74h694h698h84h99h97h58hb9h65fhbh5d6h558h88h598h99h599hf6hbch59bh95h5b7h55dh58ch5d7h5b9hf9h57cq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run-ovn,ReadOnly:false,MountPath:/var/run/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log-ovn,ReadOnly:false,MountPath:/var/log/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8pzpz,ReadOnly:true,MountPath:/var/run/secrets/kube
rnetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_liveness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_readiness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/share/ovn/scripts/ovn-ctl stop_controller],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-lqzrd_openstack(7ab1fa90-b8eb-405d-803d-b9fd84939289): ErrImagePull: rpc error: code = Canceled desc = reading blob sha256:650ba1d1fb09dfb2a96481f2ebae84fbae4b3474d057dd6d1569ce094dc41659: Get \"https://quay.io/v2/podified-antelope-centos9/openstack-ovn-controller/blobs/sha256:650ba1d1fb09dfb2a96481f2ebae84fbae4b3474d057dd6d1569ce094dc41659\": context canceled" logger="UnhandledError" Dec 03 14:33:45 crc kubenswrapper[4751]: E1203 14:33:45.217704 4751 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ErrImagePull: \"rpc error: code = Canceled desc = reading blob sha256:650ba1d1fb09dfb2a96481f2ebae84fbae4b3474d057dd6d1569ce094dc41659: Get \\\"https://quay.io/v2/podified-antelope-centos9/openstack-ovn-controller/blobs/sha256:650ba1d1fb09dfb2a96481f2ebae84fbae4b3474d057dd6d1569ce094dc41659\\\": context canceled\"" pod="openstack/ovn-controller-lqzrd" podUID="7ab1fa90-b8eb-405d-803d-b9fd84939289" Dec 03 14:33:45 crc kubenswrapper[4751]: E1203 14:33:45.237506 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Dec 03 14:33:45 crc kubenswrapper[4751]: E1203 14:33:45.237676 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fz7v2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(a45965be-01f0-4c6d-9db8-08b5e5564c5a): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:33:45 crc kubenswrapper[4751]: E1203 14:33:45.239103 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="a45965be-01f0-4c6d-9db8-08b5e5564c5a" Dec 03 14:33:46 crc kubenswrapper[4751]: E1203 14:33:46.176504 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified\\\"\"" pod="openstack/ovn-controller-lqzrd" podUID="7ab1fa90-b8eb-405d-803d-b9fd84939289" Dec 03 14:33:46 crc kubenswrapper[4751]: E1203 14:33:46.179105 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="a45965be-01f0-4c6d-9db8-08b5e5564c5a" Dec 03 14:33:48 crc kubenswrapper[4751]: E1203 14:33:48.435742 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:14f37195a4957e3848690d0ffe5422be55f7599b30dfe1ee0f97eb1118a10a51" Dec 03 14:33:48 crc kubenswrapper[4751]: E1203 14:33:48.436466 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:14f37195a4957e3848690d0ffe5422be55f7599b30dfe1ee0f97eb1118a10a51" Dec 03 14:33:48 crc 
kubenswrapper[4751]: E1203 14:33:48.436203 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:14f37195a4957e3848690d0ffe5422be55f7599b30dfe1ee0f97eb1118a10a51" Dec 03 14:33:48 crc kubenswrapper[4751]: E1203 14:33:48.436895 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:loki-query-frontend,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:14f37195a4957e3848690d0ffe5422be55f7599b30dfe1ee0f97eb1118a10a51,Command:[],Args:[-target=query-frontend -config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml -config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-query-frontend-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-query-frontend-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8wz2j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountProp
agation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-query-frontend-779849886d-r2j44_openstack(ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 14:33:48 crc kubenswrapper[4751]: E1203 14:33:48.436989 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:loki-index-gateway,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:14f37195a4957e3848690d0ffe5422be55f7599b30dfe1ee0f97eb1118a10a51,Command:[],Args:[-target=index-gateway -config.file=/etc/loki/config/config.yaml 
-runtime-config.file=/etc/loki/config/runtime-config.yaml -config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:AWS_ACCESS_KEY_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_id,Optional:nil,},},},EnvVar{Name:AWS_ACCESS_KEY_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_secret,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:storage,ReadOnly:false,MountPath:/tmp/loki,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-index-gateway-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-loki-s3,ReadOnly:false,MountPath:/etc/storage/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-index-gateway-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-44h8q,ReadOnly:true,MountPath:/var/ru
n/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-index-gateway-0_openstack(85623735-2d6a-4d53-ac14-e4cd714ecc7b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 14:33:48 crc kubenswrapper[4751]: E1203 14:33:48.437490 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:loki-distributor,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:14f37195a4957e3848690d0ffe5422be55f7599b30dfe1ee0f97eb1118a10a51,Command:[],Args:[-target=distributor -config.file=/etc/loki/config/config.yaml 
-runtime-config.file=/etc/loki/config/runtime-config.yaml -config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:gossip-ring,HostPort:0,ContainerPort:7946,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-distributor-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-distributor-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7b947,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 3101 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-distributor-56cd74f89f-xg9ml_openstack(e2d4448e-9181-494b-bec0-12da338b184d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 14:33:48 crc kubenswrapper[4751]: E1203 14:33:48.438778 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-query-frontend\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-r2j44" podUID="ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa" Dec 03 14:33:48 crc kubenswrapper[4751]: E1203 14:33:48.438804 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-distributor\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-xg9ml" podUID="e2d4448e-9181-494b-bec0-12da338b184d" Dec 03 14:33:48 crc kubenswrapper[4751]: E1203 14:33:48.438841 4751 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-index-gateway\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/cloudkitty-lokistack-index-gateway-0" podUID="85623735-2d6a-4d53-ac14-e4cd714ecc7b" Dec 03 14:33:48 crc kubenswrapper[4751]: E1203 14:33:48.443145 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:14f37195a4957e3848690d0ffe5422be55f7599b30dfe1ee0f97eb1118a10a51" Dec 03 14:33:48 crc kubenswrapper[4751]: E1203 14:33:48.443559 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:loki-querier,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:14f37195a4957e3848690d0ffe5422be55f7599b30dfe1ee0f97eb1118a10a51,Command:[],Args:[-target=querier -config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml 
-config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:gossip-ring,HostPort:0,ContainerPort:7946,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:AWS_ACCESS_KEY_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_id,Optional:nil,},},},EnvVar{Name:AWS_ACCESS_KEY_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_secret,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-querier-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-loki-s3,ReadOnly:false,MountPath:/etc/storage/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-querier-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q6spx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},Liven
essProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-querier-548665d79b-8226l_openstack(4797e85e-ad67-454b-b210-25f5481780c5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 14:33:48 crc kubenswrapper[4751]: E1203 14:33:48.445194 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-querier\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/cloudkitty-lokistack-querier-548665d79b-8226l" podUID="4797e85e-ad67-454b-b210-25f5481780c5" Dec 03 14:33:48 crc kubenswrapper[4751]: I1203 14:33:48.511580 4751 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-lv6pt" Dec 03 14:33:48 crc kubenswrapper[4751]: I1203 14:33:48.524376 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ntkbm" Dec 03 14:33:48 crc kubenswrapper[4751]: I1203 14:33:48.608018 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c84xt\" (UniqueName: \"kubernetes.io/projected/ab23361b-079c-45ea-84a1-ee39f33d8578-kube-api-access-c84xt\") pod \"ab23361b-079c-45ea-84a1-ee39f33d8578\" (UID: \"ab23361b-079c-45ea-84a1-ee39f33d8578\") " Dec 03 14:33:48 crc kubenswrapper[4751]: I1203 14:33:48.608078 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2rts\" (UniqueName: \"kubernetes.io/projected/72619134-e92c-4c33-8e15-c01c0774d887-kube-api-access-m2rts\") pod \"72619134-e92c-4c33-8e15-c01c0774d887\" (UID: \"72619134-e92c-4c33-8e15-c01c0774d887\") " Dec 03 14:33:48 crc kubenswrapper[4751]: I1203 14:33:48.608153 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/72619134-e92c-4c33-8e15-c01c0774d887-dns-svc\") pod \"72619134-e92c-4c33-8e15-c01c0774d887\" (UID: \"72619134-e92c-4c33-8e15-c01c0774d887\") " Dec 03 14:33:48 crc kubenswrapper[4751]: I1203 14:33:48.608184 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab23361b-079c-45ea-84a1-ee39f33d8578-config\") pod \"ab23361b-079c-45ea-84a1-ee39f33d8578\" (UID: \"ab23361b-079c-45ea-84a1-ee39f33d8578\") " Dec 03 14:33:48 crc kubenswrapper[4751]: I1203 14:33:48.608277 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab23361b-079c-45ea-84a1-ee39f33d8578-dns-svc\") pod \"ab23361b-079c-45ea-84a1-ee39f33d8578\" (UID: 
\"ab23361b-079c-45ea-84a1-ee39f33d8578\") " Dec 03 14:33:48 crc kubenswrapper[4751]: I1203 14:33:48.608316 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72619134-e92c-4c33-8e15-c01c0774d887-config\") pod \"72619134-e92c-4c33-8e15-c01c0774d887\" (UID: \"72619134-e92c-4c33-8e15-c01c0774d887\") " Dec 03 14:33:48 crc kubenswrapper[4751]: I1203 14:33:48.609223 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72619134-e92c-4c33-8e15-c01c0774d887-config" (OuterVolumeSpecName: "config") pod "72619134-e92c-4c33-8e15-c01c0774d887" (UID: "72619134-e92c-4c33-8e15-c01c0774d887"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:33:48 crc kubenswrapper[4751]: I1203 14:33:48.609284 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab23361b-079c-45ea-84a1-ee39f33d8578-config" (OuterVolumeSpecName: "config") pod "ab23361b-079c-45ea-84a1-ee39f33d8578" (UID: "ab23361b-079c-45ea-84a1-ee39f33d8578"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:33:48 crc kubenswrapper[4751]: I1203 14:33:48.609605 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72619134-e92c-4c33-8e15-c01c0774d887-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "72619134-e92c-4c33-8e15-c01c0774d887" (UID: "72619134-e92c-4c33-8e15-c01c0774d887"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:33:48 crc kubenswrapper[4751]: I1203 14:33:48.609637 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab23361b-079c-45ea-84a1-ee39f33d8578-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ab23361b-079c-45ea-84a1-ee39f33d8578" (UID: "ab23361b-079c-45ea-84a1-ee39f33d8578"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:33:48 crc kubenswrapper[4751]: I1203 14:33:48.614977 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab23361b-079c-45ea-84a1-ee39f33d8578-kube-api-access-c84xt" (OuterVolumeSpecName: "kube-api-access-c84xt") pod "ab23361b-079c-45ea-84a1-ee39f33d8578" (UID: "ab23361b-079c-45ea-84a1-ee39f33d8578"). InnerVolumeSpecName "kube-api-access-c84xt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:33:48 crc kubenswrapper[4751]: I1203 14:33:48.615054 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72619134-e92c-4c33-8e15-c01c0774d887-kube-api-access-m2rts" (OuterVolumeSpecName: "kube-api-access-m2rts") pod "72619134-e92c-4c33-8e15-c01c0774d887" (UID: "72619134-e92c-4c33-8e15-c01c0774d887"). InnerVolumeSpecName "kube-api-access-m2rts". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:33:48 crc kubenswrapper[4751]: I1203 14:33:48.710376 4751 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab23361b-079c-45ea-84a1-ee39f33d8578-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 14:33:48 crc kubenswrapper[4751]: I1203 14:33:48.710413 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72619134-e92c-4c33-8e15-c01c0774d887-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:33:48 crc kubenswrapper[4751]: I1203 14:33:48.710423 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c84xt\" (UniqueName: \"kubernetes.io/projected/ab23361b-079c-45ea-84a1-ee39f33d8578-kube-api-access-c84xt\") on node \"crc\" DevicePath \"\"" Dec 03 14:33:48 crc kubenswrapper[4751]: I1203 14:33:48.710431 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2rts\" (UniqueName: 
\"kubernetes.io/projected/72619134-e92c-4c33-8e15-c01c0774d887-kube-api-access-m2rts\") on node \"crc\" DevicePath \"\"" Dec 03 14:33:48 crc kubenswrapper[4751]: I1203 14:33:48.710439 4751 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/72619134-e92c-4c33-8e15-c01c0774d887-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 14:33:48 crc kubenswrapper[4751]: I1203 14:33:48.710447 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab23361b-079c-45ea-84a1-ee39f33d8578-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:33:49 crc kubenswrapper[4751]: I1203 14:33:49.202114 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ntkbm" Dec 03 14:33:49 crc kubenswrapper[4751]: I1203 14:33:49.202134 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ntkbm" event={"ID":"ab23361b-079c-45ea-84a1-ee39f33d8578","Type":"ContainerDied","Data":"d6c8cd3f428edc104ed5ff84531d35174b9325d70d51e74cd95fe841bfb6e9b4"} Dec 03 14:33:49 crc kubenswrapper[4751]: I1203 14:33:49.204264 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-lv6pt" Dec 03 14:33:49 crc kubenswrapper[4751]: E1203 14:33:49.205289 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-index-gateway\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:14f37195a4957e3848690d0ffe5422be55f7599b30dfe1ee0f97eb1118a10a51\\\"\"" pod="openstack/cloudkitty-lokistack-index-gateway-0" podUID="85623735-2d6a-4d53-ac14-e4cd714ecc7b" Dec 03 14:33:49 crc kubenswrapper[4751]: I1203 14:33:49.205947 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-lv6pt" event={"ID":"72619134-e92c-4c33-8e15-c01c0774d887","Type":"ContainerDied","Data":"b65eba5277f39ba60d000b7426feed8fcc62feae56b8e4501b64956cf239ae4e"} Dec 03 14:33:49 crc kubenswrapper[4751]: E1203 14:33:49.206562 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-querier\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:14f37195a4957e3848690d0ffe5422be55f7599b30dfe1ee0f97eb1118a10a51\\\"\"" pod="openstack/cloudkitty-lokistack-querier-548665d79b-8226l" podUID="4797e85e-ad67-454b-b210-25f5481780c5" Dec 03 14:33:49 crc kubenswrapper[4751]: E1203 14:33:49.207718 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-distributor\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:14f37195a4957e3848690d0ffe5422be55f7599b30dfe1ee0f97eb1118a10a51\\\"\"" pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-xg9ml" podUID="e2d4448e-9181-494b-bec0-12da338b184d" Dec 03 14:33:49 crc kubenswrapper[4751]: E1203 14:33:49.207953 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-query-frontend\" with ImagePullBackOff: 
\"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:14f37195a4957e3848690d0ffe5422be55f7599b30dfe1ee0f97eb1118a10a51\\\"\"" pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-r2j44" podUID="ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa" Dec 03 14:33:49 crc kubenswrapper[4751]: I1203 14:33:49.334756 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-lv6pt"] Dec 03 14:33:49 crc kubenswrapper[4751]: I1203 14:33:49.334797 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-lv6pt"] Dec 03 14:33:49 crc kubenswrapper[4751]: I1203 14:33:49.367021 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ntkbm"] Dec 03 14:33:49 crc kubenswrapper[4751]: I1203 14:33:49.374505 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ntkbm"] Dec 03 14:33:49 crc kubenswrapper[4751]: E1203 14:33:49.600526 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:710a1a5e486de5724469e55f29e9ff3f6cbef8cd4b2d21dfe254ede2b953c150" Dec 03 14:33:49 crc kubenswrapper[4751]: E1203 14:33:49.600856 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:gateway,Image:registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:710a1a5e486de5724469e55f29e9ff3f6cbef8cd4b2d21dfe254ede2b953c150,Command:[],Args:[--debug.name=lokistack-gateway --web.listen=0.0.0.0:8080 --web.internal.listen=0.0.0.0:8081 --web.healthchecks.url=https://localhost:8080 --log.level=warn --logs.read.endpoint=https://cloudkitty-lokistack-query-frontend-http.openstack.svc.cluster.local:3100 --logs.tail.endpoint=https://cloudkitty-lokistack-query-frontend-http.openstack.svc.cluster.local:3100 
--logs.write.endpoint=https://cloudkitty-lokistack-distributor-http.openstack.svc.cluster.local:3100 --logs.write-timeout=4m0s --rbac.config=/etc/lokistack-gateway/rbac.yaml --tenants.config=/etc/lokistack-gateway/tenants.yaml --server.read-timeout=48s --server.write-timeout=6m0s --tls.min-version=VersionTLS12 --tls.server.cert-file=/var/run/tls/http/server/tls.crt --tls.server.key-file=/var/run/tls/http/server/tls.key --tls.healthchecks.server-ca-file=/var/run/ca/server/service-ca.crt --tls.healthchecks.server-name=cloudkitty-lokistack-gateway-http.openstack.svc.cluster.local --tls.internal.server.cert-file=/var/run/tls/http/server/tls.crt --tls.internal.server.key-file=/var/run/tls/http/server/tls.key --tls.min-version=VersionTLS12 --tls.cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --logs.tls.ca-file=/var/run/ca/upstream/service-ca.crt --logs.tls.cert-file=/var/run/tls/http/upstream/tls.crt --logs.tls.key-file=/var/run/tls/http/upstream/tls.key 
--tls.client-auth-type=RequestClientCert],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},ContainerPort{Name:public,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rbac,ReadOnly:true,MountPath:/etc/lokistack-gateway/rbac.yaml,SubPath:rbac.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tenants,ReadOnly:true,MountPath:/etc/lokistack-gateway/tenants.yaml,SubPath:tenants.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:lokistack-gateway,ReadOnly:true,MountPath:/etc/lokistack-gateway/lokistack-gateway.rego,SubPath:lokistack-gateway.rego,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tls-secret,ReadOnly:true,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-gateway-client-http,ReadOnly:true,MountPath:/var/run/tls/http/upstream,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:true,MountPath:/var/run/ca/upstream,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-gateway-ca-bundle,ReadOnly:true,MountPath:/var/run/ca/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-ca-bundle,ReadOnly:false,MountPath:/var/run/tenants-ca/cloudkitty,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p2shx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/live,Port:{0 8081 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 8081 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:12,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-gateway-76cc998948-bl8ws_openstack(c1c24fdf-0c9e-458f-9803-87e9d6c3161f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 14:33:49 crc kubenswrapper[4751]: E1203 14:33:49.602191 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"gateway\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" podUID="c1c24fdf-0c9e-458f-9803-87e9d6c3161f" Dec 03 14:33:49 crc kubenswrapper[4751]: E1203 14:33:49.662241 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context 
canceled" image="registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:14f37195a4957e3848690d0ffe5422be55f7599b30dfe1ee0f97eb1118a10a51" Dec 03 14:33:49 crc kubenswrapper[4751]: E1203 14:33:49.662594 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:loki-compactor,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:14f37195a4957e3848690d0ffe5422be55f7599b30dfe1ee0f97eb1118a10a51,Command:[],Args:[-target=compactor -config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml -config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:AWS_ACCESS_KEY_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_id,Optional:nil,},},},EnvVar{Name:AWS_ACCESS_KEY_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_secret,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:storage,ReadOnly:false,MountPath:/tmp/loki,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-compactor-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lok
i-s3,ReadOnly:false,MountPath:/etc/storage/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-compactor-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4mvt6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-compactor-0_openstack(34351ff3-ea5e-403c-9d04-ca6777287cff): ErrImagePull: rpc error: code = Canceled desc 
= copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 14:33:49 crc kubenswrapper[4751]: E1203 14:33:49.663837 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-compactor\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/cloudkitty-lokistack-compactor-0" podUID="34351ff3-ea5e-403c-9d04-ca6777287cff" Dec 03 14:33:49 crc kubenswrapper[4751]: E1203 14:33:49.963969 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified" Dec 03 14:33:49 crc kubenswrapper[4751]: E1203 14:33:49.964417 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovsdbserver-sb,Image:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5f8h664h549h676h5ddh9ch66bh65dh5fch685h567h5f7h687hfh566h5b9h78hfbh659h5dbh696h5fbh65fh67fh579h5cchb4h59h5f4h5f4hb6h68dq,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-sb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-
bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v7g88,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof 
ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-sb-0_openstack(0046f111-cf94-402b-8981-659978aace04): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:33:50 crc kubenswrapper[4751]: E1203 14:33:50.215797 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"gateway\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:710a1a5e486de5724469e55f29e9ff3f6cbef8cd4b2d21dfe254ede2b953c150\\\"\"" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" podUID="c1c24fdf-0c9e-458f-9803-87e9d6c3161f" Dec 03 14:33:50 crc kubenswrapper[4751]: E1203 14:33:50.217548 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"loki-compactor\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:14f37195a4957e3848690d0ffe5422be55f7599b30dfe1ee0f97eb1118a10a51\\\"\"" pod="openstack/cloudkitty-lokistack-compactor-0" podUID="34351ff3-ea5e-403c-9d04-ca6777287cff" Dec 03 14:33:50 crc kubenswrapper[4751]: I1203 14:33:50.686040 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-57s2g"] Dec 03 14:33:50 crc kubenswrapper[4751]: E1203 14:33:50.902454 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Dec 03 14:33:50 crc kubenswrapper[4751]: E1203 14:33:50.902706 4751 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Dec 03 14:33:50 crc kubenswrapper[4751]: E1203 14:33:50.902888 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods 
--namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pbsbh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(80f132c3-7b27-4d3d-950e-9c6aa887b6a7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying 
config: context canceled" logger="UnhandledError" Dec 03 14:33:50 crc kubenswrapper[4751]: E1203 14:33:50.904364 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="80f132c3-7b27-4d3d-950e-9c6aa887b6a7" Dec 03 14:33:51 crc kubenswrapper[4751]: I1203 14:33:51.235694 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-57s2g" event={"ID":"a3682596-4b7c-48b0-abec-da171121c1f9","Type":"ContainerStarted","Data":"b97dca4a2c82d777c0ba4f314e57e61d0dbab381eb3305ec15e8c4bb9de2a662"} Dec 03 14:33:51 crc kubenswrapper[4751]: E1203 14:33:51.240625 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="80f132c3-7b27-4d3d-950e-9c6aa887b6a7" Dec 03 14:33:51 crc kubenswrapper[4751]: I1203 14:33:51.330133 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72619134-e92c-4c33-8e15-c01c0774d887" path="/var/lib/kubelet/pods/72619134-e92c-4c33-8e15-c01c0774d887/volumes" Dec 03 14:33:51 crc kubenswrapper[4751]: I1203 14:33:51.330528 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab23361b-079c-45ea-84a1-ee39f33d8578" path="/var/lib/kubelet/pods/ab23361b-079c-45ea-84a1-ee39f33d8578/volumes" Dec 03 14:33:51 crc kubenswrapper[4751]: I1203 14:33:51.354873 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-b8cts"] Dec 03 14:33:51 crc kubenswrapper[4751]: W1203 14:33:51.359918 4751 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9658e996_cafd_4e95_8412_4a3405de5127.slice/crio-28a06f798f01c11bbac36a622e4279d07cd8aa5f237f232dc5246a8d8301c341 WatchSource:0}: Error finding container 28a06f798f01c11bbac36a622e4279d07cd8aa5f237f232dc5246a8d8301c341: Status 404 returned error can't find the container with id 28a06f798f01c11bbac36a622e4279d07cd8aa5f237f232dc5246a8d8301c341 Dec 03 14:33:51 crc kubenswrapper[4751]: I1203 14:33:51.408669 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-98hst"] Dec 03 14:33:52 crc kubenswrapper[4751]: I1203 14:33:52.246304 4751 generic.go:334] "Generic (PLEG): container finished" podID="9658e996-cafd-4e95-8412-4a3405de5127" containerID="7f108f47b996a17de108176a8a8d9c781ac4a0dce762a4bb7cbe4e6edb01fa20" exitCode=0 Dec 03 14:33:52 crc kubenswrapper[4751]: I1203 14:33:52.246476 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-b8cts" event={"ID":"9658e996-cafd-4e95-8412-4a3405de5127","Type":"ContainerDied","Data":"7f108f47b996a17de108176a8a8d9c781ac4a0dce762a4bb7cbe4e6edb01fa20"} Dec 03 14:33:52 crc kubenswrapper[4751]: I1203 14:33:52.246714 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-b8cts" event={"ID":"9658e996-cafd-4e95-8412-4a3405de5127","Type":"ContainerStarted","Data":"28a06f798f01c11bbac36a622e4279d07cd8aa5f237f232dc5246a8d8301c341"} Dec 03 14:33:52 crc kubenswrapper[4751]: I1203 14:33:52.251020 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"04053d51-dddf-43e3-a230-9ac729dec435","Type":"ContainerStarted","Data":"13fae7e7eb67ed20d6b586b2bf5b92b54659f0ceefa56e6c0ac6f4558560df15"} Dec 03 14:33:52 crc kubenswrapper[4751]: I1203 14:33:52.251646 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 14:33:52 crc kubenswrapper[4751]: 
I1203 14:33:52.253122 4751 generic.go:334] "Generic (PLEG): container finished" podID="a3682596-4b7c-48b0-abec-da171121c1f9" containerID="7eee64bfc7012af8582b63ade90314213d34222c24a20c30b956f8a25fe35376" exitCode=0 Dec 03 14:33:52 crc kubenswrapper[4751]: I1203 14:33:52.253221 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-57s2g" event={"ID":"a3682596-4b7c-48b0-abec-da171121c1f9","Type":"ContainerDied","Data":"7eee64bfc7012af8582b63ade90314213d34222c24a20c30b956f8a25fe35376"} Dec 03 14:33:52 crc kubenswrapper[4751]: I1203 14:33:52.255358 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"05d18e1b-04cd-4b4a-a728-bdbc9c2ab713","Type":"ContainerStarted","Data":"1ef75f13be3331e7f42b43cf217d61e2b888176a9157213444a3af4830c613c9"} Dec 03 14:33:52 crc kubenswrapper[4751]: I1203 14:33:52.255480 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 03 14:33:52 crc kubenswrapper[4751]: I1203 14:33:52.265733 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" event={"ID":"5a964492-a736-427e-b81a-d6d863d0eaaf","Type":"ContainerStarted","Data":"2bd2448a5316ba975d37c68c0eddcba328245be890c02a82d6455ac31b0a5e19"} Dec 03 14:33:52 crc kubenswrapper[4751]: I1203 14:33:52.266154 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" Dec 03 14:33:52 crc kubenswrapper[4751]: I1203 14:33:52.280937 4751 generic.go:334] "Generic (PLEG): container finished" podID="3faee7be-8b53-42b6-90fd-ba62998f9ced" containerID="8a8681d89b09b975cd4de9f4ef74d6fba1f9f8356c8f424ea00015ea4d56057e" exitCode=0 Dec 03 14:33:52 crc kubenswrapper[4751]: I1203 14:33:52.281030 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dzz9c" 
event={"ID":"3faee7be-8b53-42b6-90fd-ba62998f9ced","Type":"ContainerDied","Data":"8a8681d89b09b975cd4de9f4ef74d6fba1f9f8356c8f424ea00015ea4d56057e"} Dec 03 14:33:52 crc kubenswrapper[4751]: I1203 14:33:52.283283 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b","Type":"ContainerStarted","Data":"d124d32ccba1bd38170eb196a6381e44f7a2d24c40ddf598362cc55d330ecb66"} Dec 03 14:33:52 crc kubenswrapper[4751]: I1203 14:33:52.289798 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-ingester-0" podStartSLOduration=15.262055003 podStartE2EDuration="38.289779744s" podCreationTimestamp="2025-12-03 14:33:14 +0000 UTC" firstStartedPulling="2025-12-03 14:33:27.86591882 +0000 UTC m=+1214.854274037" lastFinishedPulling="2025-12-03 14:33:50.893643551 +0000 UTC m=+1237.881998778" observedRunningTime="2025-12-03 14:33:52.285391033 +0000 UTC m=+1239.273746280" watchObservedRunningTime="2025-12-03 14:33:52.289779744 +0000 UTC m=+1239.278134961" Dec 03 14:33:52 crc kubenswrapper[4751]: I1203 14:33:52.292887 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-98hst" event={"ID":"aa72a067-0544-4a0c-8750-c3d76221d4f2","Type":"ContainerStarted","Data":"6837fe9c53a391fb0311a623cfba538e3d7fe083585ecbfa60873202ae8b6afd"} Dec 03 14:33:52 crc kubenswrapper[4751]: I1203 14:33:52.310846 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" podStartSLOduration=15.25687101 podStartE2EDuration="38.310825704s" podCreationTimestamp="2025-12-03 14:33:14 +0000 UTC" firstStartedPulling="2025-12-03 14:33:27.839631126 +0000 UTC m=+1214.827986343" lastFinishedPulling="2025-12-03 14:33:50.89358582 +0000 UTC m=+1237.881941037" observedRunningTime="2025-12-03 14:33:52.301516497 +0000 UTC m=+1239.289871724" watchObservedRunningTime="2025-12-03 
14:33:52.310825704 +0000 UTC m=+1239.299180921" Dec 03 14:33:52 crc kubenswrapper[4751]: I1203 14:33:52.313595 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-8d88b" Dec 03 14:33:52 crc kubenswrapper[4751]: I1203 14:33:52.406700 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=26.225762362 podStartE2EDuration="49.406680115s" podCreationTimestamp="2025-12-03 14:33:03 +0000 UTC" firstStartedPulling="2025-12-03 14:33:26.762827313 +0000 UTC m=+1213.751182530" lastFinishedPulling="2025-12-03 14:33:49.943745066 +0000 UTC m=+1236.932100283" observedRunningTime="2025-12-03 14:33:52.366612631 +0000 UTC m=+1239.354967858" watchObservedRunningTime="2025-12-03 14:33:52.406680115 +0000 UTC m=+1239.395035332" Dec 03 14:33:53 crc kubenswrapper[4751]: I1203 14:33:53.307600 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e1da8e9b-0799-4327-9e24-216c4a51fde2","Type":"ContainerStarted","Data":"5f35e9d06325ea3615762c01e04cf56308efaa687e0631b5a77396e8c64782f7"} Dec 03 14:33:55 crc kubenswrapper[4751]: I1203 14:33:55.367301 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"a8a7a20c-88be-4cca-a10d-8ac9a898f090","Type":"ContainerStarted","Data":"d413b98ba8fa260917836f4467fcbcf5e2d0719ced6a39881431b8bbdbf81154"} Dec 03 14:33:55 crc kubenswrapper[4751]: E1203 14:33:55.512591 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-sb-0" podUID="0046f111-cf94-402b-8981-659978aace04" Dec 03 14:33:56 crc kubenswrapper[4751]: I1203 14:33:56.376666 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"3dc63449-cac9-48bc-abb7-3ff350a408cf","Type":"ContainerStarted","Data":"0f96069925d709a650e9202cc05ffc71c5fd9b918ef225c25dfa185bc59f5fa0"} Dec 03 14:33:56 crc kubenswrapper[4751]: I1203 14:33:56.379189 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-98hst" event={"ID":"aa72a067-0544-4a0c-8750-c3d76221d4f2","Type":"ContainerStarted","Data":"cbdbae48cbfe8c96dcbffa574f867dd607f30fe3e84739241c014e7498495e76"} Dec 03 14:33:56 crc kubenswrapper[4751]: I1203 14:33:56.382523 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-b8cts" event={"ID":"9658e996-cafd-4e95-8412-4a3405de5127","Type":"ContainerStarted","Data":"482c6a008a4f547371e797fa2bc28fc9f634ebb002ff1e4b3e7a5b983db3e202"} Dec 03 14:33:56 crc kubenswrapper[4751]: I1203 14:33:56.383073 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-b8cts" Dec 03 14:33:56 crc kubenswrapper[4751]: I1203 14:33:56.385219 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0046f111-cf94-402b-8981-659978aace04","Type":"ContainerStarted","Data":"a8c42c4d0f1d6bac5c89814eb0cf8aed6fa13eea66f8cc56abc747a21c313c8e"} Dec 03 14:33:56 crc kubenswrapper[4751]: E1203 14:33:56.387168 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-sb-0" podUID="0046f111-cf94-402b-8981-659978aace04" Dec 03 14:33:56 crc kubenswrapper[4751]: I1203 14:33:56.387979 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-57s2g" event={"ID":"a3682596-4b7c-48b0-abec-da171121c1f9","Type":"ContainerStarted","Data":"b9f06e97893f557ff306869ad18ff41f8eef80d4434d4c827969514358009726"} Dec 03 14:33:56 crc 
kubenswrapper[4751]: I1203 14:33:56.388141 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bf47b49b7-57s2g" Dec 03 14:33:56 crc kubenswrapper[4751]: I1203 14:33:56.390656 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dzz9c" event={"ID":"3faee7be-8b53-42b6-90fd-ba62998f9ced","Type":"ContainerStarted","Data":"d6859a558086839fd5cfeb69697d587ce17bc115a0f388a08cea8169945a12e7"} Dec 03 14:33:56 crc kubenswrapper[4751]: I1203 14:33:56.390683 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dzz9c" event={"ID":"3faee7be-8b53-42b6-90fd-ba62998f9ced","Type":"ContainerStarted","Data":"754c01a3d29eae3ed6f3d2490c9f546f462aa8e5c6c53874ebca565ac57bd4f1"} Dec 03 14:33:56 crc kubenswrapper[4751]: I1203 14:33:56.391161 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-dzz9c" Dec 03 14:33:56 crc kubenswrapper[4751]: I1203 14:33:56.391267 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-dzz9c" Dec 03 14:33:56 crc kubenswrapper[4751]: I1203 14:33:56.393451 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b","Type":"ContainerStarted","Data":"66b30ee835508e289e45357eef0f225944a5f2e5a1d74f99fbbc3418da1c919a"} Dec 03 14:33:56 crc kubenswrapper[4751]: I1203 14:33:56.427316 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=19.111064172 podStartE2EDuration="46.427296149s" podCreationTimestamp="2025-12-03 14:33:10 +0000 UTC" firstStartedPulling="2025-12-03 14:33:27.870051484 +0000 UTC m=+1214.858406701" lastFinishedPulling="2025-12-03 14:33:55.186283461 +0000 UTC m=+1242.174638678" observedRunningTime="2025-12-03 14:33:56.419321239 +0000 UTC m=+1243.407676446" watchObservedRunningTime="2025-12-03 
14:33:56.427296149 +0000 UTC m=+1243.415651366" Dec 03 14:33:56 crc kubenswrapper[4751]: I1203 14:33:56.447289 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-98hst" podStartSLOduration=19.666743652 podStartE2EDuration="23.447273219s" podCreationTimestamp="2025-12-03 14:33:33 +0000 UTC" firstStartedPulling="2025-12-03 14:33:51.419378519 +0000 UTC m=+1238.407733736" lastFinishedPulling="2025-12-03 14:33:55.199908096 +0000 UTC m=+1242.188263303" observedRunningTime="2025-12-03 14:33:56.44475361 +0000 UTC m=+1243.433108847" watchObservedRunningTime="2025-12-03 14:33:56.447273219 +0000 UTC m=+1243.435628436" Dec 03 14:33:56 crc kubenswrapper[4751]: I1203 14:33:56.467072 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bf47b49b7-57s2g" podStartSLOduration=22.989798704000002 podStartE2EDuration="23.467053715s" podCreationTimestamp="2025-12-03 14:33:33 +0000 UTC" firstStartedPulling="2025-12-03 14:33:50.9345902 +0000 UTC m=+1237.922945417" lastFinishedPulling="2025-12-03 14:33:51.411845201 +0000 UTC m=+1238.400200428" observedRunningTime="2025-12-03 14:33:56.461107461 +0000 UTC m=+1243.449462678" watchObservedRunningTime="2025-12-03 14:33:56.467053715 +0000 UTC m=+1243.455408932" Dec 03 14:33:56 crc kubenswrapper[4751]: I1203 14:33:56.487269 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-b8cts" podStartSLOduration=23.487244451 podStartE2EDuration="23.487244451s" podCreationTimestamp="2025-12-03 14:33:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:33:56.478566732 +0000 UTC m=+1243.466921959" watchObservedRunningTime="2025-12-03 14:33:56.487244451 +0000 UTC m=+1243.475599668" Dec 03 14:33:56 crc kubenswrapper[4751]: I1203 14:33:56.509487 4751 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/ovn-controller-ovs-dzz9c" podStartSLOduration=26.032422382 podStartE2EDuration="47.509472593s" podCreationTimestamp="2025-12-03 14:33:09 +0000 UTC" firstStartedPulling="2025-12-03 14:33:28.94590615 +0000 UTC m=+1215.934261367" lastFinishedPulling="2025-12-03 14:33:50.422956361 +0000 UTC m=+1237.411311578" observedRunningTime="2025-12-03 14:33:56.509378681 +0000 UTC m=+1243.497733898" watchObservedRunningTime="2025-12-03 14:33:56.509472593 +0000 UTC m=+1243.497827870" Dec 03 14:33:56 crc kubenswrapper[4751]: I1203 14:33:56.657831 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 03 14:33:56 crc kubenswrapper[4751]: I1203 14:33:56.658082 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 03 14:33:56 crc kubenswrapper[4751]: I1203 14:33:56.700047 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 03 14:33:57 crc kubenswrapper[4751]: E1203 14:33:57.402257 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-sb-0" podUID="0046f111-cf94-402b-8981-659978aace04" Dec 03 14:33:57 crc kubenswrapper[4751]: I1203 14:33:57.439682 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 03 14:33:58 crc kubenswrapper[4751]: I1203 14:33:58.442339 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9de5857f-8fe8-48e3-991b-7171fc510567","Type":"ContainerStarted","Data":"a8f3c53b4b5ec7dd11766cf6c91eb7017eb00024f50e0bb96b916819e86aea84"} Dec 03 14:33:59 crc kubenswrapper[4751]: I1203 14:33:59.253596 4751 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/memcached-0" Dec 03 14:33:59 crc kubenswrapper[4751]: I1203 14:33:59.451617 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a45965be-01f0-4c6d-9db8-08b5e5564c5a","Type":"ContainerStarted","Data":"70b6d89a24f8431f7169261d896f69011d0e6ccc9f05845c1264d18a050cf01e"} Dec 03 14:33:59 crc kubenswrapper[4751]: I1203 14:33:59.454405 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lqzrd" event={"ID":"7ab1fa90-b8eb-405d-803d-b9fd84939289","Type":"ContainerStarted","Data":"62d659eb80be843f5aad1d58a6d13b1c7b044116e0fa08c4f6272a00186cfe79"} Dec 03 14:33:59 crc kubenswrapper[4751]: I1203 14:33:59.454679 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-lqzrd" Dec 03 14:33:59 crc kubenswrapper[4751]: I1203 14:33:59.506396 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-lqzrd" podStartSLOduration=20.082574957 podStartE2EDuration="50.506377317s" podCreationTimestamp="2025-12-03 14:33:09 +0000 UTC" firstStartedPulling="2025-12-03 14:33:27.841380594 +0000 UTC m=+1214.829735811" lastFinishedPulling="2025-12-03 14:33:58.265182944 +0000 UTC m=+1245.253538171" observedRunningTime="2025-12-03 14:33:59.501946175 +0000 UTC m=+1246.490301392" watchObservedRunningTime="2025-12-03 14:33:59.506377317 +0000 UTC m=+1246.494732534" Dec 03 14:34:00 crc kubenswrapper[4751]: E1203 14:34:00.218618 4751 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8a7a20c_88be_4cca_a10d_8ac9a898f090.slice/crio-d413b98ba8fa260917836f4467fcbcf5e2d0719ced6a39881431b8bbdbf81154.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8a7a20c_88be_4cca_a10d_8ac9a898f090.slice/crio-conmon-d413b98ba8fa260917836f4467fcbcf5e2d0719ced6a39881431b8bbdbf81154.scope\": RecentStats: unable to find data in memory cache]" Dec 03 14:34:00 crc kubenswrapper[4751]: I1203 14:34:00.463667 4751 generic.go:334] "Generic (PLEG): container finished" podID="a8a7a20c-88be-4cca-a10d-8ac9a898f090" containerID="d413b98ba8fa260917836f4467fcbcf5e2d0719ced6a39881431b8bbdbf81154" exitCode=0 Dec 03 14:34:00 crc kubenswrapper[4751]: I1203 14:34:00.463773 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"a8a7a20c-88be-4cca-a10d-8ac9a898f090","Type":"ContainerDied","Data":"d413b98ba8fa260917836f4467fcbcf5e2d0719ced6a39881431b8bbdbf81154"} Dec 03 14:34:01 crc kubenswrapper[4751]: I1203 14:34:01.502460 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-xg9ml" event={"ID":"e2d4448e-9181-494b-bec0-12da338b184d","Type":"ContainerStarted","Data":"ff804fc93a956e995a0afa5217b77594052876e511dfb8402ade0a3d2c51ff0c"} Dec 03 14:34:01 crc kubenswrapper[4751]: I1203 14:34:01.502905 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-xg9ml" Dec 03 14:34:01 crc kubenswrapper[4751]: I1203 14:34:01.531044 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"47b63367-ad69-4428-9c79-8eee86b817ac","Type":"ContainerStarted","Data":"dc5a97df8731908a544fde7ee0074065ca6236a194a83bc37dc69e7162e2641e"} Dec 03 14:34:01 crc kubenswrapper[4751]: I1203 14:34:01.534152 4751 generic.go:334] "Generic (PLEG): container finished" podID="3dc63449-cac9-48bc-abb7-3ff350a408cf" containerID="0f96069925d709a650e9202cc05ffc71c5fd9b918ef225c25dfa185bc59f5fa0" exitCode=0 Dec 03 14:34:01 crc kubenswrapper[4751]: I1203 14:34:01.534228 4751 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"3dc63449-cac9-48bc-abb7-3ff350a408cf","Type":"ContainerDied","Data":"0f96069925d709a650e9202cc05ffc71c5fd9b918ef225c25dfa185bc59f5fa0"} Dec 03 14:34:01 crc kubenswrapper[4751]: I1203 14:34:01.563015 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-548665d79b-8226l" event={"ID":"4797e85e-ad67-454b-b210-25f5481780c5","Type":"ContainerStarted","Data":"80a6c7615fb58a07b077d810a9306739b7952ccb1a64f14aced31ad1f7d4ba08"} Dec 03 14:34:01 crc kubenswrapper[4751]: I1203 14:34:01.563962 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-querier-548665d79b-8226l" Dec 03 14:34:01 crc kubenswrapper[4751]: I1203 14:34:01.581432 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"85623735-2d6a-4d53-ac14-e4cd714ecc7b","Type":"ContainerStarted","Data":"96413ca70c1209632b0571e3711d2b60c8a36cc98e624f37dfdd98bfee530f80"} Dec 03 14:34:01 crc kubenswrapper[4751]: I1203 14:34:01.584648 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 14:34:01 crc kubenswrapper[4751]: I1203 14:34:01.623022 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-xg9ml" podStartSLOduration=-9223371989.231771 podStartE2EDuration="47.623003893s" podCreationTimestamp="2025-12-03 14:33:14 +0000 UTC" firstStartedPulling="2025-12-03 14:33:27.4213908 +0000 UTC m=+1214.409746017" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:34:01.534075452 +0000 UTC m=+1248.522430669" watchObservedRunningTime="2025-12-03 14:34:01.623003893 +0000 UTC m=+1248.611359100" Dec 03 14:34:01 crc kubenswrapper[4751]: I1203 14:34:01.670275 4751 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/cloudkitty-lokistack-index-gateway-0" podStartSLOduration=-9223371989.184523 podStartE2EDuration="47.670253765s" podCreationTimestamp="2025-12-03 14:33:14 +0000 UTC" firstStartedPulling="2025-12-03 14:33:27.819136591 +0000 UTC m=+1214.807491798" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:34:01.655364515 +0000 UTC m=+1248.643719732" watchObservedRunningTime="2025-12-03 14:34:01.670253765 +0000 UTC m=+1248.658608982" Dec 03 14:34:01 crc kubenswrapper[4751]: I1203 14:34:01.679251 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-querier-548665d79b-8226l" podStartSLOduration=-9223371989.175543 podStartE2EDuration="47.679232302s" podCreationTimestamp="2025-12-03 14:33:14 +0000 UTC" firstStartedPulling="2025-12-03 14:33:27.444506927 +0000 UTC m=+1214.432862144" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:34:01.67442422 +0000 UTC m=+1248.662779447" watchObservedRunningTime="2025-12-03 14:34:01.679232302 +0000 UTC m=+1248.667587529" Dec 03 14:34:02 crc kubenswrapper[4751]: I1203 14:34:02.594169 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"3dc63449-cac9-48bc-abb7-3ff350a408cf","Type":"ContainerStarted","Data":"4ecca28cddcb8b835a87343eb7cd534e7912ad8807f2fb8b0a4e6f441b34a8ba"} Dec 03 14:34:02 crc kubenswrapper[4751]: I1203 14:34:02.597966 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"34351ff3-ea5e-403c-9d04-ca6777287cff","Type":"ContainerStarted","Data":"ab5dc761866b1f75d8d1475c4842b509a2d9ebc8b0c5171215b01e3cfec1d99e"} Dec 03 14:34:02 crc kubenswrapper[4751]: I1203 14:34:02.607518 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 14:34:02 crc kubenswrapper[4751]: I1203 14:34:02.619625 4751 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=30.558758966 podStartE2EDuration="1m0.619605456s" podCreationTimestamp="2025-12-03 14:33:02 +0000 UTC" firstStartedPulling="2025-12-03 14:33:25.72923683 +0000 UTC m=+1212.717592047" lastFinishedPulling="2025-12-03 14:33:55.79008332 +0000 UTC m=+1242.778438537" observedRunningTime="2025-12-03 14:34:02.6168404 +0000 UTC m=+1249.605195617" watchObservedRunningTime="2025-12-03 14:34:02.619605456 +0000 UTC m=+1249.607960673" Dec 03 14:34:02 crc kubenswrapper[4751]: I1203 14:34:02.656000 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-compactor-0" podStartSLOduration=-9223371988.198801 podStartE2EDuration="48.655974588s" podCreationTimestamp="2025-12-03 14:33:14 +0000 UTC" firstStartedPulling="2025-12-03 14:33:27.821681131 +0000 UTC m=+1214.810036348" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:34:02.642276121 +0000 UTC m=+1249.630631348" watchObservedRunningTime="2025-12-03 14:34:02.655974588 +0000 UTC m=+1249.644329805" Dec 03 14:34:03 crc kubenswrapper[4751]: I1203 14:34:03.608409 4751 generic.go:334] "Generic (PLEG): container finished" podID="a45965be-01f0-4c6d-9db8-08b5e5564c5a" containerID="70b6d89a24f8431f7169261d896f69011d0e6ccc9f05845c1264d18a050cf01e" exitCode=0 Dec 03 14:34:03 crc kubenswrapper[4751]: I1203 14:34:03.608616 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a45965be-01f0-4c6d-9db8-08b5e5564c5a","Type":"ContainerDied","Data":"70b6d89a24f8431f7169261d896f69011d0e6ccc9f05845c1264d18a050cf01e"} Dec 03 14:34:04 crc kubenswrapper[4751]: I1203 14:34:04.129068 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 03 14:34:04 crc kubenswrapper[4751]: I1203 14:34:04.129872 4751 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 03 14:34:04 crc kubenswrapper[4751]: I1203 14:34:04.404527 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bf47b49b7-57s2g" Dec 03 14:34:04 crc kubenswrapper[4751]: I1203 14:34:04.462486 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-b8cts" Dec 03 14:34:04 crc kubenswrapper[4751]: I1203 14:34:04.523303 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-57s2g"] Dec 03 14:34:04 crc kubenswrapper[4751]: I1203 14:34:04.621693 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a45965be-01f0-4c6d-9db8-08b5e5564c5a","Type":"ContainerStarted","Data":"dc9ee9572a5e2a8c12f0418d975a274eff8e3cde17126d6f8089177058fa632b"} Dec 03 14:34:04 crc kubenswrapper[4751]: I1203 14:34:04.638659 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"a8a7a20c-88be-4cca-a10d-8ac9a898f090","Type":"ContainerStarted","Data":"61fb691a51268479dfed2a17466b1b1d6e8575eb36e419792e3e7302c37fb7c4"} Dec 03 14:34:04 crc kubenswrapper[4751]: I1203 14:34:04.650620 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-r2j44" event={"ID":"ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa","Type":"ContainerStarted","Data":"449599aa725d7af86559509439f228c2d26664f1a4b449c29666e757542e4584"} Dec 03 14:34:04 crc kubenswrapper[4751]: I1203 14:34:04.651415 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-r2j44" Dec 03 14:34:04 crc kubenswrapper[4751]: I1203 14:34:04.655224 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371973.199572 
podStartE2EDuration="1m3.655203489s" podCreationTimestamp="2025-12-03 14:33:01 +0000 UTC" firstStartedPulling="2025-12-03 14:33:26.122147518 +0000 UTC m=+1213.110502735" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:34:04.647154467 +0000 UTC m=+1251.635509684" watchObservedRunningTime="2025-12-03 14:34:04.655203489 +0000 UTC m=+1251.643558706" Dec 03 14:34:04 crc kubenswrapper[4751]: I1203 14:34:04.664090 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" event={"ID":"c1c24fdf-0c9e-458f-9803-87e9d6c3161f","Type":"ContainerStarted","Data":"46b7ec9eb26edc451586c06c4675fa9010c0284eeeb117e7cfdb6ee1d1f06d6b"} Dec 03 14:34:04 crc kubenswrapper[4751]: I1203 14:34:04.665285 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" Dec 03 14:34:04 crc kubenswrapper[4751]: I1203 14:34:04.670677 4751 generic.go:334] "Generic (PLEG): container finished" podID="9de5857f-8fe8-48e3-991b-7171fc510567" containerID="a8f3c53b4b5ec7dd11766cf6c91eb7017eb00024f50e0bb96b916819e86aea84" exitCode=0 Dec 03 14:34:04 crc kubenswrapper[4751]: I1203 14:34:04.670774 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9de5857f-8fe8-48e3-991b-7171fc510567","Type":"ContainerDied","Data":"a8f3c53b4b5ec7dd11766cf6c91eb7017eb00024f50e0bb96b916819e86aea84"} Dec 03 14:34:04 crc kubenswrapper[4751]: I1203 14:34:04.670859 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bf47b49b7-57s2g" podUID="a3682596-4b7c-48b0-abec-da171121c1f9" containerName="dnsmasq-dns" containerID="cri-o://b9f06e97893f557ff306869ad18ff41f8eef80d4434d4c827969514358009726" gracePeriod=10 Dec 03 14:34:04 crc kubenswrapper[4751]: I1203 14:34:04.706899 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-r2j44" podStartSLOduration=-9223371986.147911 podStartE2EDuration="50.706864763s" podCreationTimestamp="2025-12-03 14:33:14 +0000 UTC" firstStartedPulling="2025-12-03 14:33:27.473852986 +0000 UTC m=+1214.462208203" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:34:04.688447295 +0000 UTC m=+1251.676802522" watchObservedRunningTime="2025-12-03 14:34:04.706864763 +0000 UTC m=+1251.695219980" Dec 03 14:34:04 crc kubenswrapper[4751]: I1203 14:34:04.729778 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" Dec 03 14:34:04 crc kubenswrapper[4751]: I1203 14:34:04.793762 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-bl8ws" podStartSLOduration=-9223371986.061035 podStartE2EDuration="50.793741417s" podCreationTimestamp="2025-12-03 14:33:14 +0000 UTC" firstStartedPulling="2025-12-03 14:33:27.862200148 +0000 UTC m=+1214.850555365" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:34:04.763974726 +0000 UTC m=+1251.752329943" watchObservedRunningTime="2025-12-03 14:34:04.793741417 +0000 UTC m=+1251.782096634" Dec 03 14:34:05 crc kubenswrapper[4751]: I1203 14:34:05.266489 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-57s2g" Dec 03 14:34:05 crc kubenswrapper[4751]: I1203 14:34:05.366725 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3682596-4b7c-48b0-abec-da171121c1f9-dns-svc\") pod \"a3682596-4b7c-48b0-abec-da171121c1f9\" (UID: \"a3682596-4b7c-48b0-abec-da171121c1f9\") " Dec 03 14:34:05 crc kubenswrapper[4751]: I1203 14:34:05.366872 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbfwk\" (UniqueName: \"kubernetes.io/projected/a3682596-4b7c-48b0-abec-da171121c1f9-kube-api-access-kbfwk\") pod \"a3682596-4b7c-48b0-abec-da171121c1f9\" (UID: \"a3682596-4b7c-48b0-abec-da171121c1f9\") " Dec 03 14:34:05 crc kubenswrapper[4751]: I1203 14:34:05.366941 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3682596-4b7c-48b0-abec-da171121c1f9-config\") pod \"a3682596-4b7c-48b0-abec-da171121c1f9\" (UID: \"a3682596-4b7c-48b0-abec-da171121c1f9\") " Dec 03 14:34:05 crc kubenswrapper[4751]: I1203 14:34:05.367033 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3682596-4b7c-48b0-abec-da171121c1f9-ovsdbserver-nb\") pod \"a3682596-4b7c-48b0-abec-da171121c1f9\" (UID: \"a3682596-4b7c-48b0-abec-da171121c1f9\") " Dec 03 14:34:05 crc kubenswrapper[4751]: I1203 14:34:05.372958 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3682596-4b7c-48b0-abec-da171121c1f9-kube-api-access-kbfwk" (OuterVolumeSpecName: "kube-api-access-kbfwk") pod "a3682596-4b7c-48b0-abec-da171121c1f9" (UID: "a3682596-4b7c-48b0-abec-da171121c1f9"). InnerVolumeSpecName "kube-api-access-kbfwk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:34:05 crc kubenswrapper[4751]: I1203 14:34:05.417102 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3682596-4b7c-48b0-abec-da171121c1f9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a3682596-4b7c-48b0-abec-da171121c1f9" (UID: "a3682596-4b7c-48b0-abec-da171121c1f9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:34:05 crc kubenswrapper[4751]: I1203 14:34:05.419732 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3682596-4b7c-48b0-abec-da171121c1f9-config" (OuterVolumeSpecName: "config") pod "a3682596-4b7c-48b0-abec-da171121c1f9" (UID: "a3682596-4b7c-48b0-abec-da171121c1f9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:34:05 crc kubenswrapper[4751]: I1203 14:34:05.425461 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3682596-4b7c-48b0-abec-da171121c1f9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a3682596-4b7c-48b0-abec-da171121c1f9" (UID: "a3682596-4b7c-48b0-abec-da171121c1f9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:34:05 crc kubenswrapper[4751]: I1203 14:34:05.468852 4751 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3682596-4b7c-48b0-abec-da171121c1f9-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:05 crc kubenswrapper[4751]: I1203 14:34:05.468887 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbfwk\" (UniqueName: \"kubernetes.io/projected/a3682596-4b7c-48b0-abec-da171121c1f9-kube-api-access-kbfwk\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:05 crc kubenswrapper[4751]: I1203 14:34:05.468899 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3682596-4b7c-48b0-abec-da171121c1f9-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:05 crc kubenswrapper[4751]: I1203 14:34:05.468907 4751 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3682596-4b7c-48b0-abec-da171121c1f9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:05 crc kubenswrapper[4751]: I1203 14:34:05.691363 4751 generic.go:334] "Generic (PLEG): container finished" podID="a3682596-4b7c-48b0-abec-da171121c1f9" containerID="b9f06e97893f557ff306869ad18ff41f8eef80d4434d4c827969514358009726" exitCode=0 Dec 03 14:34:05 crc kubenswrapper[4751]: I1203 14:34:05.691741 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-57s2g" event={"ID":"a3682596-4b7c-48b0-abec-da171121c1f9","Type":"ContainerDied","Data":"b9f06e97893f557ff306869ad18ff41f8eef80d4434d4c827969514358009726"} Dec 03 14:34:05 crc kubenswrapper[4751]: I1203 14:34:05.691788 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-57s2g" event={"ID":"a3682596-4b7c-48b0-abec-da171121c1f9","Type":"ContainerDied","Data":"b97dca4a2c82d777c0ba4f314e57e61d0dbab381eb3305ec15e8c4bb9de2a662"} Dec 03 
14:34:05 crc kubenswrapper[4751]: I1203 14:34:05.691806 4751 scope.go:117] "RemoveContainer" containerID="b9f06e97893f557ff306869ad18ff41f8eef80d4434d4c827969514358009726" Dec 03 14:34:05 crc kubenswrapper[4751]: I1203 14:34:05.691951 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-57s2g" Dec 03 14:34:05 crc kubenswrapper[4751]: I1203 14:34:05.820075 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:34:05 crc kubenswrapper[4751]: I1203 14:34:05.820437 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:34:05 crc kubenswrapper[4751]: I1203 14:34:05.931715 4751 scope.go:117] "RemoveContainer" containerID="7eee64bfc7012af8582b63ade90314213d34222c24a20c30b956f8a25fe35376" Dec 03 14:34:05 crc kubenswrapper[4751]: I1203 14:34:05.970424 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-57s2g"] Dec 03 14:34:05 crc kubenswrapper[4751]: I1203 14:34:05.977621 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-57s2g"] Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.077062 4751 scope.go:117] "RemoveContainer" containerID="b9f06e97893f557ff306869ad18ff41f8eef80d4434d4c827969514358009726" Dec 03 14:34:06 crc kubenswrapper[4751]: E1203 14:34:06.077550 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b9f06e97893f557ff306869ad18ff41f8eef80d4434d4c827969514358009726\": container with ID starting with b9f06e97893f557ff306869ad18ff41f8eef80d4434d4c827969514358009726 not found: ID does not exist" containerID="b9f06e97893f557ff306869ad18ff41f8eef80d4434d4c827969514358009726" Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.077587 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9f06e97893f557ff306869ad18ff41f8eef80d4434d4c827969514358009726"} err="failed to get container status \"b9f06e97893f557ff306869ad18ff41f8eef80d4434d4c827969514358009726\": rpc error: code = NotFound desc = could not find container \"b9f06e97893f557ff306869ad18ff41f8eef80d4434d4c827969514358009726\": container with ID starting with b9f06e97893f557ff306869ad18ff41f8eef80d4434d4c827969514358009726 not found: ID does not exist" Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.077608 4751 scope.go:117] "RemoveContainer" containerID="7eee64bfc7012af8582b63ade90314213d34222c24a20c30b956f8a25fe35376" Dec 03 14:34:06 crc kubenswrapper[4751]: E1203 14:34:06.077849 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eee64bfc7012af8582b63ade90314213d34222c24a20c30b956f8a25fe35376\": container with ID starting with 7eee64bfc7012af8582b63ade90314213d34222c24a20c30b956f8a25fe35376 not found: ID does not exist" containerID="7eee64bfc7012af8582b63ade90314213d34222c24a20c30b956f8a25fe35376" Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.077877 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eee64bfc7012af8582b63ade90314213d34222c24a20c30b956f8a25fe35376"} err="failed to get container status \"7eee64bfc7012af8582b63ade90314213d34222c24a20c30b956f8a25fe35376\": rpc error: code = NotFound desc = could not find container \"7eee64bfc7012af8582b63ade90314213d34222c24a20c30b956f8a25fe35376\": container with ID 
starting with 7eee64bfc7012af8582b63ade90314213d34222c24a20c30b956f8a25fe35376 not found: ID does not exist" Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.262728 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-jdm6k"] Dec 03 14:34:06 crc kubenswrapper[4751]: E1203 14:34:06.263090 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3682596-4b7c-48b0-abec-da171121c1f9" containerName="dnsmasq-dns" Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.263102 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3682596-4b7c-48b0-abec-da171121c1f9" containerName="dnsmasq-dns" Dec 03 14:34:06 crc kubenswrapper[4751]: E1203 14:34:06.263125 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3682596-4b7c-48b0-abec-da171121c1f9" containerName="init" Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.263131 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3682596-4b7c-48b0-abec-da171121c1f9" containerName="init" Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.263306 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3682596-4b7c-48b0-abec-da171121c1f9" containerName="dnsmasq-dns" Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.264459 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-jdm6k" Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.291349 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-jdm6k"] Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.292430 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/791ff53a-abb1-4fc7-bb6a-59e11d2b1b58-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-jdm6k\" (UID: \"791ff53a-abb1-4fc7-bb6a-59e11d2b1b58\") " pod="openstack/dnsmasq-dns-b8fbc5445-jdm6k" Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.292483 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/791ff53a-abb1-4fc7-bb6a-59e11d2b1b58-config\") pod \"dnsmasq-dns-b8fbc5445-jdm6k\" (UID: \"791ff53a-abb1-4fc7-bb6a-59e11d2b1b58\") " pod="openstack/dnsmasq-dns-b8fbc5445-jdm6k" Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.292587 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/791ff53a-abb1-4fc7-bb6a-59e11d2b1b58-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-jdm6k\" (UID: \"791ff53a-abb1-4fc7-bb6a-59e11d2b1b58\") " pod="openstack/dnsmasq-dns-b8fbc5445-jdm6k" Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.292687 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/791ff53a-abb1-4fc7-bb6a-59e11d2b1b58-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-jdm6k\" (UID: \"791ff53a-abb1-4fc7-bb6a-59e11d2b1b58\") " pod="openstack/dnsmasq-dns-b8fbc5445-jdm6k" Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.292714 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-pfvjx\" (UniqueName: \"kubernetes.io/projected/791ff53a-abb1-4fc7-bb6a-59e11d2b1b58-kube-api-access-pfvjx\") pod \"dnsmasq-dns-b8fbc5445-jdm6k\" (UID: \"791ff53a-abb1-4fc7-bb6a-59e11d2b1b58\") " pod="openstack/dnsmasq-dns-b8fbc5445-jdm6k" Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.395308 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/791ff53a-abb1-4fc7-bb6a-59e11d2b1b58-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-jdm6k\" (UID: \"791ff53a-abb1-4fc7-bb6a-59e11d2b1b58\") " pod="openstack/dnsmasq-dns-b8fbc5445-jdm6k" Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.395440 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/791ff53a-abb1-4fc7-bb6a-59e11d2b1b58-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-jdm6k\" (UID: \"791ff53a-abb1-4fc7-bb6a-59e11d2b1b58\") " pod="openstack/dnsmasq-dns-b8fbc5445-jdm6k" Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.395488 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfvjx\" (UniqueName: \"kubernetes.io/projected/791ff53a-abb1-4fc7-bb6a-59e11d2b1b58-kube-api-access-pfvjx\") pod \"dnsmasq-dns-b8fbc5445-jdm6k\" (UID: \"791ff53a-abb1-4fc7-bb6a-59e11d2b1b58\") " pod="openstack/dnsmasq-dns-b8fbc5445-jdm6k" Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.395508 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/791ff53a-abb1-4fc7-bb6a-59e11d2b1b58-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-jdm6k\" (UID: \"791ff53a-abb1-4fc7-bb6a-59e11d2b1b58\") " pod="openstack/dnsmasq-dns-b8fbc5445-jdm6k" Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.395535 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/791ff53a-abb1-4fc7-bb6a-59e11d2b1b58-config\") pod \"dnsmasq-dns-b8fbc5445-jdm6k\" (UID: \"791ff53a-abb1-4fc7-bb6a-59e11d2b1b58\") " pod="openstack/dnsmasq-dns-b8fbc5445-jdm6k" Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.409679 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/791ff53a-abb1-4fc7-bb6a-59e11d2b1b58-config\") pod \"dnsmasq-dns-b8fbc5445-jdm6k\" (UID: \"791ff53a-abb1-4fc7-bb6a-59e11d2b1b58\") " pod="openstack/dnsmasq-dns-b8fbc5445-jdm6k" Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.409728 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/791ff53a-abb1-4fc7-bb6a-59e11d2b1b58-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-jdm6k\" (UID: \"791ff53a-abb1-4fc7-bb6a-59e11d2b1b58\") " pod="openstack/dnsmasq-dns-b8fbc5445-jdm6k" Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.415938 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/791ff53a-abb1-4fc7-bb6a-59e11d2b1b58-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-jdm6k\" (UID: \"791ff53a-abb1-4fc7-bb6a-59e11d2b1b58\") " pod="openstack/dnsmasq-dns-b8fbc5445-jdm6k" Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.416137 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/791ff53a-abb1-4fc7-bb6a-59e11d2b1b58-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-jdm6k\" (UID: \"791ff53a-abb1-4fc7-bb6a-59e11d2b1b58\") " pod="openstack/dnsmasq-dns-b8fbc5445-jdm6k" Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.430811 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfvjx\" (UniqueName: \"kubernetes.io/projected/791ff53a-abb1-4fc7-bb6a-59e11d2b1b58-kube-api-access-pfvjx\") pod \"dnsmasq-dns-b8fbc5445-jdm6k\" (UID: 
\"791ff53a-abb1-4fc7-bb6a-59e11d2b1b58\") " pod="openstack/dnsmasq-dns-b8fbc5445-jdm6k" Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.532312 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.710179 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-jdm6k" Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.755356 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"a8a7a20c-88be-4cca-a10d-8ac9a898f090","Type":"ContainerStarted","Data":"2c60f1d1d9d113a4551e875b06a17521db93315ea9887943afb1c3defa824cf4"} Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.756881 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.759178 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.798088 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"80f132c3-7b27-4d3d-950e-9c6aa887b6a7","Type":"ContainerStarted","Data":"dd048db6124028ffa9366bf48ace771c675db3313b51ccfaf4839b11002fcf56"} Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.798337 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.867082 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=25.067750961 podStartE2EDuration="1m0.867065111s" podCreationTimestamp="2025-12-03 14:33:06 +0000 UTC" firstStartedPulling="2025-12-03 14:33:27.819081579 +0000 UTC m=+1214.807436796" 
lastFinishedPulling="2025-12-03 14:34:03.618395729 +0000 UTC m=+1250.606750946" observedRunningTime="2025-12-03 14:34:06.820666272 +0000 UTC m=+1253.809021499" watchObservedRunningTime="2025-12-03 14:34:06.867065111 +0000 UTC m=+1253.855420328" Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.883663 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=22.825342741 podStartE2EDuration="1m1.883644417s" podCreationTimestamp="2025-12-03 14:33:05 +0000 UTC" firstStartedPulling="2025-12-03 14:33:26.684464133 +0000 UTC m=+1213.672819350" lastFinishedPulling="2025-12-03 14:34:05.742765809 +0000 UTC m=+1252.731121026" observedRunningTime="2025-12-03 14:34:06.880044658 +0000 UTC m=+1253.868399875" watchObservedRunningTime="2025-12-03 14:34:06.883644417 +0000 UTC m=+1253.871999634" Dec 03 14:34:06 crc kubenswrapper[4751]: I1203 14:34:06.935508 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 03 14:34:07 crc kubenswrapper[4751]: W1203 14:34:07.321807 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod791ff53a_abb1_4fc7_bb6a_59e11d2b1b58.slice/crio-e6912e5ffe96e9119992697d67de9f17890129c6f3a90b5e96fed31cb03dbd42 WatchSource:0}: Error finding container e6912e5ffe96e9119992697d67de9f17890129c6f3a90b5e96fed31cb03dbd42: Status 404 returned error can't find the container with id e6912e5ffe96e9119992697d67de9f17890129c6f3a90b5e96fed31cb03dbd42 Dec 03 14:34:07 crc kubenswrapper[4751]: I1203 14:34:07.323739 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3682596-4b7c-48b0-abec-da171121c1f9" path="/var/lib/kubelet/pods/a3682596-4b7c-48b0-abec-da171121c1f9/volumes" Dec 03 14:34:07 crc kubenswrapper[4751]: I1203 14:34:07.324295 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-jdm6k"] 
Dec 03 14:34:07 crc kubenswrapper[4751]: I1203 14:34:07.397986 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 03 14:34:07 crc kubenswrapper[4751]: I1203 14:34:07.403737 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 03 14:34:07 crc kubenswrapper[4751]: I1203 14:34:07.408999 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-qqhkb" Dec 03 14:34:07 crc kubenswrapper[4751]: I1203 14:34:07.409178 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 03 14:34:07 crc kubenswrapper[4751]: I1203 14:34:07.409335 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 03 14:34:07 crc kubenswrapper[4751]: I1203 14:34:07.410611 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 03 14:34:07 crc kubenswrapper[4751]: I1203 14:34:07.426157 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 03 14:34:07 crc kubenswrapper[4751]: I1203 14:34:07.517825 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4bgn\" (UniqueName: \"kubernetes.io/projected/a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2-kube-api-access-v4bgn\") pod \"swift-storage-0\" (UID: \"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2\") " pod="openstack/swift-storage-0" Dec 03 14:34:07 crc kubenswrapper[4751]: I1203 14:34:07.517891 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-17ddfb66-175c-4e13-9331-0c6afa5115de\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17ddfb66-175c-4e13-9331-0c6afa5115de\") pod \"swift-storage-0\" (UID: \"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2\") " pod="openstack/swift-storage-0" Dec 03 14:34:07 crc kubenswrapper[4751]: 
I1203 14:34:07.517925 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2-etc-swift\") pod \"swift-storage-0\" (UID: \"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2\") " pod="openstack/swift-storage-0" Dec 03 14:34:07 crc kubenswrapper[4751]: I1203 14:34:07.517946 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2-lock\") pod \"swift-storage-0\" (UID: \"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2\") " pod="openstack/swift-storage-0" Dec 03 14:34:07 crc kubenswrapper[4751]: I1203 14:34:07.518007 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2-cache\") pod \"swift-storage-0\" (UID: \"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2\") " pod="openstack/swift-storage-0" Dec 03 14:34:07 crc kubenswrapper[4751]: I1203 14:34:07.620168 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4bgn\" (UniqueName: \"kubernetes.io/projected/a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2-kube-api-access-v4bgn\") pod \"swift-storage-0\" (UID: \"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2\") " pod="openstack/swift-storage-0" Dec 03 14:34:07 crc kubenswrapper[4751]: I1203 14:34:07.620671 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-17ddfb66-175c-4e13-9331-0c6afa5115de\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17ddfb66-175c-4e13-9331-0c6afa5115de\") pod \"swift-storage-0\" (UID: \"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2\") " pod="openstack/swift-storage-0" Dec 03 14:34:07 crc kubenswrapper[4751]: I1203 14:34:07.620786 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/projected/a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2-etc-swift\") pod \"swift-storage-0\" (UID: \"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2\") " pod="openstack/swift-storage-0" Dec 03 14:34:07 crc kubenswrapper[4751]: I1203 14:34:07.620900 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2-lock\") pod \"swift-storage-0\" (UID: \"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2\") " pod="openstack/swift-storage-0" Dec 03 14:34:07 crc kubenswrapper[4751]: I1203 14:34:07.621021 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2-cache\") pod \"swift-storage-0\" (UID: \"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2\") " pod="openstack/swift-storage-0" Dec 03 14:34:07 crc kubenswrapper[4751]: E1203 14:34:07.620941 4751 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 14:34:07 crc kubenswrapper[4751]: E1203 14:34:07.621707 4751 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 14:34:07 crc kubenswrapper[4751]: E1203 14:34:07.622304 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2-etc-swift podName:a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2 nodeName:}" failed. No retries permitted until 2025-12-03 14:34:08.122287222 +0000 UTC m=+1255.110642439 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2-etc-swift") pod "swift-storage-0" (UID: "a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2") : configmap "swift-ring-files" not found Dec 03 14:34:07 crc kubenswrapper[4751]: I1203 14:34:07.621366 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2-lock\") pod \"swift-storage-0\" (UID: \"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2\") " pod="openstack/swift-storage-0" Dec 03 14:34:07 crc kubenswrapper[4751]: I1203 14:34:07.621497 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2-cache\") pod \"swift-storage-0\" (UID: \"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2\") " pod="openstack/swift-storage-0" Dec 03 14:34:07 crc kubenswrapper[4751]: I1203 14:34:07.629185 4751 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 14:34:07 crc kubenswrapper[4751]: I1203 14:34:07.629232 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-17ddfb66-175c-4e13-9331-0c6afa5115de\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17ddfb66-175c-4e13-9331-0c6afa5115de\") pod \"swift-storage-0\" (UID: \"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0a56736807043a616a33f5c587b3544b9025bf88622c69243bc7b045013f40cd/globalmount\"" pod="openstack/swift-storage-0" Dec 03 14:34:07 crc kubenswrapper[4751]: I1203 14:34:07.639233 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4bgn\" (UniqueName: \"kubernetes.io/projected/a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2-kube-api-access-v4bgn\") pod \"swift-storage-0\" (UID: \"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2\") " pod="openstack/swift-storage-0" Dec 03 14:34:07 crc kubenswrapper[4751]: I1203 14:34:07.691179 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-17ddfb66-175c-4e13-9331-0c6afa5115de\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17ddfb66-175c-4e13-9331-0c6afa5115de\") pod \"swift-storage-0\" (UID: \"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2\") " pod="openstack/swift-storage-0" Dec 03 14:34:07 crc kubenswrapper[4751]: I1203 14:34:07.831447 4751 generic.go:334] "Generic (PLEG): container finished" podID="791ff53a-abb1-4fc7-bb6a-59e11d2b1b58" containerID="d5455fd4736f3358c46f6726abb1d727db7e38e6bf0a356841dab5b6d1ea12b4" exitCode=0 Dec 03 14:34:07 crc kubenswrapper[4751]: I1203 14:34:07.832901 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-jdm6k" event={"ID":"791ff53a-abb1-4fc7-bb6a-59e11d2b1b58","Type":"ContainerDied","Data":"d5455fd4736f3358c46f6726abb1d727db7e38e6bf0a356841dab5b6d1ea12b4"} Dec 03 14:34:07 crc kubenswrapper[4751]: I1203 14:34:07.832999 4751 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-jdm6k" event={"ID":"791ff53a-abb1-4fc7-bb6a-59e11d2b1b58","Type":"ContainerStarted","Data":"e6912e5ffe96e9119992697d67de9f17890129c6f3a90b5e96fed31cb03dbd42"} Dec 03 14:34:07 crc kubenswrapper[4751]: I1203 14:34:07.923037 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-68rzl"] Dec 03 14:34:07 crc kubenswrapper[4751]: I1203 14:34:07.924398 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-68rzl" Dec 03 14:34:07 crc kubenswrapper[4751]: I1203 14:34:07.927313 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 03 14:34:07 crc kubenswrapper[4751]: I1203 14:34:07.927650 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 03 14:34:07 crc kubenswrapper[4751]: I1203 14:34:07.927837 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 03 14:34:07 crc kubenswrapper[4751]: I1203 14:34:07.932130 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-68rzl"] Dec 03 14:34:08 crc kubenswrapper[4751]: I1203 14:34:08.028752 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-ring-data-devices\") pod \"swift-ring-rebalance-68rzl\" (UID: \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\") " pod="openstack/swift-ring-rebalance-68rzl" Dec 03 14:34:08 crc kubenswrapper[4751]: I1203 14:34:08.028805 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-scripts\") pod \"swift-ring-rebalance-68rzl\" (UID: \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\") " 
pod="openstack/swift-ring-rebalance-68rzl" Dec 03 14:34:08 crc kubenswrapper[4751]: I1203 14:34:08.028840 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlf99\" (UniqueName: \"kubernetes.io/projected/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-kube-api-access-dlf99\") pod \"swift-ring-rebalance-68rzl\" (UID: \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\") " pod="openstack/swift-ring-rebalance-68rzl" Dec 03 14:34:08 crc kubenswrapper[4751]: I1203 14:34:08.028946 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-etc-swift\") pod \"swift-ring-rebalance-68rzl\" (UID: \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\") " pod="openstack/swift-ring-rebalance-68rzl" Dec 03 14:34:08 crc kubenswrapper[4751]: I1203 14:34:08.029016 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-swiftconf\") pod \"swift-ring-rebalance-68rzl\" (UID: \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\") " pod="openstack/swift-ring-rebalance-68rzl" Dec 03 14:34:08 crc kubenswrapper[4751]: I1203 14:34:08.029036 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-dispersionconf\") pod \"swift-ring-rebalance-68rzl\" (UID: \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\") " pod="openstack/swift-ring-rebalance-68rzl" Dec 03 14:34:08 crc kubenswrapper[4751]: I1203 14:34:08.029113 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-combined-ca-bundle\") pod \"swift-ring-rebalance-68rzl\" (UID: 
\"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\") " pod="openstack/swift-ring-rebalance-68rzl" Dec 03 14:34:08 crc kubenswrapper[4751]: I1203 14:34:08.130957 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-etc-swift\") pod \"swift-ring-rebalance-68rzl\" (UID: \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\") " pod="openstack/swift-ring-rebalance-68rzl" Dec 03 14:34:08 crc kubenswrapper[4751]: I1203 14:34:08.131222 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-swiftconf\") pod \"swift-ring-rebalance-68rzl\" (UID: \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\") " pod="openstack/swift-ring-rebalance-68rzl" Dec 03 14:34:08 crc kubenswrapper[4751]: I1203 14:34:08.131244 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-dispersionconf\") pod \"swift-ring-rebalance-68rzl\" (UID: \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\") " pod="openstack/swift-ring-rebalance-68rzl" Dec 03 14:34:08 crc kubenswrapper[4751]: I1203 14:34:08.131273 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-combined-ca-bundle\") pod \"swift-ring-rebalance-68rzl\" (UID: \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\") " pod="openstack/swift-ring-rebalance-68rzl" Dec 03 14:34:08 crc kubenswrapper[4751]: I1203 14:34:08.131310 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2-etc-swift\") pod \"swift-storage-0\" (UID: \"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2\") " pod="openstack/swift-storage-0" Dec 03 14:34:08 crc kubenswrapper[4751]: 
I1203 14:34:08.131372 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-ring-data-devices\") pod \"swift-ring-rebalance-68rzl\" (UID: \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\") " pod="openstack/swift-ring-rebalance-68rzl" Dec 03 14:34:08 crc kubenswrapper[4751]: I1203 14:34:08.131395 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-scripts\") pod \"swift-ring-rebalance-68rzl\" (UID: \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\") " pod="openstack/swift-ring-rebalance-68rzl" Dec 03 14:34:08 crc kubenswrapper[4751]: I1203 14:34:08.131434 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlf99\" (UniqueName: \"kubernetes.io/projected/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-kube-api-access-dlf99\") pod \"swift-ring-rebalance-68rzl\" (UID: \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\") " pod="openstack/swift-ring-rebalance-68rzl" Dec 03 14:34:08 crc kubenswrapper[4751]: I1203 14:34:08.131437 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-etc-swift\") pod \"swift-ring-rebalance-68rzl\" (UID: \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\") " pod="openstack/swift-ring-rebalance-68rzl" Dec 03 14:34:08 crc kubenswrapper[4751]: E1203 14:34:08.131581 4751 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 14:34:08 crc kubenswrapper[4751]: E1203 14:34:08.131594 4751 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 14:34:08 crc kubenswrapper[4751]: E1203 14:34:08.131638 4751 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2-etc-swift podName:a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2 nodeName:}" failed. No retries permitted until 2025-12-03 14:34:09.131623926 +0000 UTC m=+1256.119979143 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2-etc-swift") pod "swift-storage-0" (UID: "a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2") : configmap "swift-ring-files" not found Dec 03 14:34:08 crc kubenswrapper[4751]: I1203 14:34:08.132352 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-ring-data-devices\") pod \"swift-ring-rebalance-68rzl\" (UID: \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\") " pod="openstack/swift-ring-rebalance-68rzl" Dec 03 14:34:08 crc kubenswrapper[4751]: I1203 14:34:08.133553 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-scripts\") pod \"swift-ring-rebalance-68rzl\" (UID: \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\") " pod="openstack/swift-ring-rebalance-68rzl" Dec 03 14:34:08 crc kubenswrapper[4751]: I1203 14:34:08.137371 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-swiftconf\") pod \"swift-ring-rebalance-68rzl\" (UID: \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\") " pod="openstack/swift-ring-rebalance-68rzl" Dec 03 14:34:08 crc kubenswrapper[4751]: I1203 14:34:08.137456 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-dispersionconf\") pod \"swift-ring-rebalance-68rzl\" (UID: \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\") " pod="openstack/swift-ring-rebalance-68rzl" Dec 
03 14:34:08 crc kubenswrapper[4751]: I1203 14:34:08.138080 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-combined-ca-bundle\") pod \"swift-ring-rebalance-68rzl\" (UID: \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\") " pod="openstack/swift-ring-rebalance-68rzl" Dec 03 14:34:08 crc kubenswrapper[4751]: I1203 14:34:08.152713 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlf99\" (UniqueName: \"kubernetes.io/projected/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-kube-api-access-dlf99\") pod \"swift-ring-rebalance-68rzl\" (UID: \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\") " pod="openstack/swift-ring-rebalance-68rzl" Dec 03 14:34:08 crc kubenswrapper[4751]: I1203 14:34:08.316144 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-68rzl" Dec 03 14:34:08 crc kubenswrapper[4751]: I1203 14:34:08.860502 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-jdm6k" event={"ID":"791ff53a-abb1-4fc7-bb6a-59e11d2b1b58","Type":"ContainerStarted","Data":"6c2e401c92d038de3a392d5e003f097d04bdb2900597efb2d2acdade9546d339"} Dec 03 14:34:08 crc kubenswrapper[4751]: I1203 14:34:08.861020 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-jdm6k" Dec 03 14:34:08 crc kubenswrapper[4751]: I1203 14:34:08.896369 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-jdm6k" podStartSLOduration=2.89634669 podStartE2EDuration="2.89634669s" podCreationTimestamp="2025-12-03 14:34:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:34:08.892400931 +0000 UTC m=+1255.880756168" watchObservedRunningTime="2025-12-03 14:34:08.89634669 +0000 UTC 
m=+1255.884701907" Dec 03 14:34:08 crc kubenswrapper[4751]: I1203 14:34:08.911984 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-68rzl"] Dec 03 14:34:08 crc kubenswrapper[4751]: W1203 14:34:08.912573 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0fedaa81_0c36_44fa_ab7b_b712759fc8d4.slice/crio-bba03347b16c2e6e27147069d38f83722fff844b6b9bfff65ccc74fb5257979e WatchSource:0}: Error finding container bba03347b16c2e6e27147069d38f83722fff844b6b9bfff65ccc74fb5257979e: Status 404 returned error can't find the container with id bba03347b16c2e6e27147069d38f83722fff844b6b9bfff65ccc74fb5257979e Dec 03 14:34:09 crc kubenswrapper[4751]: I1203 14:34:09.165758 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2-etc-swift\") pod \"swift-storage-0\" (UID: \"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2\") " pod="openstack/swift-storage-0" Dec 03 14:34:09 crc kubenswrapper[4751]: E1203 14:34:09.166011 4751 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 14:34:09 crc kubenswrapper[4751]: E1203 14:34:09.166158 4751 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 14:34:09 crc kubenswrapper[4751]: E1203 14:34:09.166230 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2-etc-swift podName:a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2 nodeName:}" failed. No retries permitted until 2025-12-03 14:34:11.166207776 +0000 UTC m=+1258.154562993 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2-etc-swift") pod "swift-storage-0" (UID: "a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2") : configmap "swift-ring-files" not found Dec 03 14:34:09 crc kubenswrapper[4751]: I1203 14:34:09.872271 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-68rzl" event={"ID":"0fedaa81-0c36-44fa-ab7b-b712759fc8d4","Type":"ContainerStarted","Data":"bba03347b16c2e6e27147069d38f83722fff844b6b9bfff65ccc74fb5257979e"} Dec 03 14:34:11 crc kubenswrapper[4751]: I1203 14:34:11.202698 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2-etc-swift\") pod \"swift-storage-0\" (UID: \"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2\") " pod="openstack/swift-storage-0" Dec 03 14:34:11 crc kubenswrapper[4751]: E1203 14:34:11.202945 4751 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 14:34:11 crc kubenswrapper[4751]: E1203 14:34:11.203135 4751 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 14:34:11 crc kubenswrapper[4751]: E1203 14:34:11.203205 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2-etc-swift podName:a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2 nodeName:}" failed. No retries permitted until 2025-12-03 14:34:15.203177509 +0000 UTC m=+1262.191532726 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2-etc-swift") pod "swift-storage-0" (UID: "a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2") : configmap "swift-ring-files" not found Dec 03 14:34:12 crc kubenswrapper[4751]: I1203 14:34:12.433397 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 03 14:34:12 crc kubenswrapper[4751]: I1203 14:34:12.433483 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 03 14:34:12 crc kubenswrapper[4751]: I1203 14:34:12.513899 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 03 14:34:13 crc kubenswrapper[4751]: I1203 14:34:13.005819 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 03 14:34:13 crc kubenswrapper[4751]: I1203 14:34:13.967709 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-78w4f"] Dec 03 14:34:13 crc kubenswrapper[4751]: I1203 14:34:13.969268 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-78w4f" Dec 03 14:34:13 crc kubenswrapper[4751]: I1203 14:34:13.982877 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-78w4f"] Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.087032 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-e7ba-account-create-update-nnclp"] Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.088811 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-e7ba-account-create-update-nnclp" Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.092897 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.096721 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-e7ba-account-create-update-nnclp"] Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.160840 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d58931ec-41c3-417e-b1b6-23d8855a0dbd-operator-scripts\") pod \"keystone-db-create-78w4f\" (UID: \"d58931ec-41c3-417e-b1b6-23d8855a0dbd\") " pod="openstack/keystone-db-create-78w4f" Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.160918 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4brr7\" (UniqueName: \"kubernetes.io/projected/d58931ec-41c3-417e-b1b6-23d8855a0dbd-kube-api-access-4brr7\") pod \"keystone-db-create-78w4f\" (UID: \"d58931ec-41c3-417e-b1b6-23d8855a0dbd\") " pod="openstack/keystone-db-create-78w4f" Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.181624 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-rrw72"] Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.182818 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-rrw72" Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.196011 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rrw72"] Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.262613 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d58931ec-41c3-417e-b1b6-23d8855a0dbd-operator-scripts\") pod \"keystone-db-create-78w4f\" (UID: \"d58931ec-41c3-417e-b1b6-23d8855a0dbd\") " pod="openstack/keystone-db-create-78w4f" Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.262700 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4brr7\" (UniqueName: \"kubernetes.io/projected/d58931ec-41c3-417e-b1b6-23d8855a0dbd-kube-api-access-4brr7\") pod \"keystone-db-create-78w4f\" (UID: \"d58931ec-41c3-417e-b1b6-23d8855a0dbd\") " pod="openstack/keystone-db-create-78w4f" Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.262789 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5413bb7-fc5f-47b1-b2c1-03b66cab3b92-operator-scripts\") pod \"keystone-e7ba-account-create-update-nnclp\" (UID: \"d5413bb7-fc5f-47b1-b2c1-03b66cab3b92\") " pod="openstack/keystone-e7ba-account-create-update-nnclp" Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.262845 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp6gb\" (UniqueName: \"kubernetes.io/projected/d5413bb7-fc5f-47b1-b2c1-03b66cab3b92-kube-api-access-sp6gb\") pod \"keystone-e7ba-account-create-update-nnclp\" (UID: \"d5413bb7-fc5f-47b1-b2c1-03b66cab3b92\") " pod="openstack/keystone-e7ba-account-create-update-nnclp" Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.263668 4751 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d58931ec-41c3-417e-b1b6-23d8855a0dbd-operator-scripts\") pod \"keystone-db-create-78w4f\" (UID: \"d58931ec-41c3-417e-b1b6-23d8855a0dbd\") " pod="openstack/keystone-db-create-78w4f" Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.285770 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-70be-account-create-update-rq9wc"] Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.287088 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-70be-account-create-update-rq9wc" Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.288364 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4brr7\" (UniqueName: \"kubernetes.io/projected/d58931ec-41c3-417e-b1b6-23d8855a0dbd-kube-api-access-4brr7\") pod \"keystone-db-create-78w4f\" (UID: \"d58931ec-41c3-417e-b1b6-23d8855a0dbd\") " pod="openstack/keystone-db-create-78w4f" Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.289946 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.293528 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-78w4f" Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.296748 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-70be-account-create-update-rq9wc"] Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.364433 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xn4w\" (UniqueName: \"kubernetes.io/projected/638fec0e-031b-4c73-828b-95157a9dd522-kube-api-access-7xn4w\") pod \"placement-db-create-rrw72\" (UID: \"638fec0e-031b-4c73-828b-95157a9dd522\") " pod="openstack/placement-db-create-rrw72" Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.364754 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5413bb7-fc5f-47b1-b2c1-03b66cab3b92-operator-scripts\") pod \"keystone-e7ba-account-create-update-nnclp\" (UID: \"d5413bb7-fc5f-47b1-b2c1-03b66cab3b92\") " pod="openstack/keystone-e7ba-account-create-update-nnclp" Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.364865 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/638fec0e-031b-4c73-828b-95157a9dd522-operator-scripts\") pod \"placement-db-create-rrw72\" (UID: \"638fec0e-031b-4c73-828b-95157a9dd522\") " pod="openstack/placement-db-create-rrw72" Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.365035 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp6gb\" (UniqueName: \"kubernetes.io/projected/d5413bb7-fc5f-47b1-b2c1-03b66cab3b92-kube-api-access-sp6gb\") pod \"keystone-e7ba-account-create-update-nnclp\" (UID: \"d5413bb7-fc5f-47b1-b2c1-03b66cab3b92\") " pod="openstack/keystone-e7ba-account-create-update-nnclp" Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.365518 4751 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5413bb7-fc5f-47b1-b2c1-03b66cab3b92-operator-scripts\") pod \"keystone-e7ba-account-create-update-nnclp\" (UID: \"d5413bb7-fc5f-47b1-b2c1-03b66cab3b92\") " pod="openstack/keystone-e7ba-account-create-update-nnclp" Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.386070 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp6gb\" (UniqueName: \"kubernetes.io/projected/d5413bb7-fc5f-47b1-b2c1-03b66cab3b92-kube-api-access-sp6gb\") pod \"keystone-e7ba-account-create-update-nnclp\" (UID: \"d5413bb7-fc5f-47b1-b2c1-03b66cab3b92\") " pod="openstack/keystone-e7ba-account-create-update-nnclp" Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.407357 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-e7ba-account-create-update-nnclp" Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.467366 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trrsc\" (UniqueName: \"kubernetes.io/projected/27b5d97a-cd82-45be-ae36-bd97f293b7cd-kube-api-access-trrsc\") pod \"placement-70be-account-create-update-rq9wc\" (UID: \"27b5d97a-cd82-45be-ae36-bd97f293b7cd\") " pod="openstack/placement-70be-account-create-update-rq9wc" Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.467416 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27b5d97a-cd82-45be-ae36-bd97f293b7cd-operator-scripts\") pod \"placement-70be-account-create-update-rq9wc\" (UID: \"27b5d97a-cd82-45be-ae36-bd97f293b7cd\") " pod="openstack/placement-70be-account-create-update-rq9wc" Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.467503 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7xn4w\" (UniqueName: \"kubernetes.io/projected/638fec0e-031b-4c73-828b-95157a9dd522-kube-api-access-7xn4w\") pod \"placement-db-create-rrw72\" (UID: \"638fec0e-031b-4c73-828b-95157a9dd522\") " pod="openstack/placement-db-create-rrw72" Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.467559 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/638fec0e-031b-4c73-828b-95157a9dd522-operator-scripts\") pod \"placement-db-create-rrw72\" (UID: \"638fec0e-031b-4c73-828b-95157a9dd522\") " pod="openstack/placement-db-create-rrw72" Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.468237 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/638fec0e-031b-4c73-828b-95157a9dd522-operator-scripts\") pod \"placement-db-create-rrw72\" (UID: \"638fec0e-031b-4c73-828b-95157a9dd522\") " pod="openstack/placement-db-create-rrw72" Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.488877 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xn4w\" (UniqueName: \"kubernetes.io/projected/638fec0e-031b-4c73-828b-95157a9dd522-kube-api-access-7xn4w\") pod \"placement-db-create-rrw72\" (UID: \"638fec0e-031b-4c73-828b-95157a9dd522\") " pod="openstack/placement-db-create-rrw72" Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.503724 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-rrw72" Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.568836 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trrsc\" (UniqueName: \"kubernetes.io/projected/27b5d97a-cd82-45be-ae36-bd97f293b7cd-kube-api-access-trrsc\") pod \"placement-70be-account-create-update-rq9wc\" (UID: \"27b5d97a-cd82-45be-ae36-bd97f293b7cd\") " pod="openstack/placement-70be-account-create-update-rq9wc" Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.568895 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27b5d97a-cd82-45be-ae36-bd97f293b7cd-operator-scripts\") pod \"placement-70be-account-create-update-rq9wc\" (UID: \"27b5d97a-cd82-45be-ae36-bd97f293b7cd\") " pod="openstack/placement-70be-account-create-update-rq9wc" Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.569672 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27b5d97a-cd82-45be-ae36-bd97f293b7cd-operator-scripts\") pod \"placement-70be-account-create-update-rq9wc\" (UID: \"27b5d97a-cd82-45be-ae36-bd97f293b7cd\") " pod="openstack/placement-70be-account-create-update-rq9wc" Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.585761 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trrsc\" (UniqueName: \"kubernetes.io/projected/27b5d97a-cd82-45be-ae36-bd97f293b7cd-kube-api-access-trrsc\") pod \"placement-70be-account-create-update-rq9wc\" (UID: \"27b5d97a-cd82-45be-ae36-bd97f293b7cd\") " pod="openstack/placement-70be-account-create-update-rq9wc" Dec 03 14:34:14 crc kubenswrapper[4751]: I1203 14:34:14.644709 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-70be-account-create-update-rq9wc" Dec 03 14:34:15 crc kubenswrapper[4751]: I1203 14:34:15.283015 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2-etc-swift\") pod \"swift-storage-0\" (UID: \"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2\") " pod="openstack/swift-storage-0" Dec 03 14:34:15 crc kubenswrapper[4751]: E1203 14:34:15.283215 4751 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 14:34:15 crc kubenswrapper[4751]: E1203 14:34:15.283502 4751 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 14:34:15 crc kubenswrapper[4751]: E1203 14:34:15.283565 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2-etc-swift podName:a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2 nodeName:}" failed. No retries permitted until 2025-12-03 14:34:23.283548758 +0000 UTC m=+1270.271903975 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2-etc-swift") pod "swift-storage-0" (UID: "a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2") : configmap "swift-ring-files" not found Dec 03 14:34:15 crc kubenswrapper[4751]: I1203 14:34:15.739687 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="04053d51-dddf-43e3-a230-9ac729dec435" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 03 14:34:15 crc kubenswrapper[4751]: I1203 14:34:15.982910 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 14:34:16 crc kubenswrapper[4751]: I1203 14:34:16.193266 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 03 14:34:16 crc kubenswrapper[4751]: I1203 14:34:16.712191 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-jdm6k" Dec 03 14:34:16 crc kubenswrapper[4751]: I1203 14:34:16.772632 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-b8cts"] Dec 03 14:34:16 crc kubenswrapper[4751]: I1203 14:34:16.772918 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-b8cts" podUID="9658e996-cafd-4e95-8412-4a3405de5127" containerName="dnsmasq-dns" containerID="cri-o://482c6a008a4f547371e797fa2bc28fc9f634ebb002ff1e4b3e7a5b983db3e202" gracePeriod=10 Dec 03 14:34:17 crc kubenswrapper[4751]: I1203 14:34:17.777570 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-b8cts" Dec 03 14:34:17 crc kubenswrapper[4751]: I1203 14:34:17.832760 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9658e996-cafd-4e95-8412-4a3405de5127-dns-svc\") pod \"9658e996-cafd-4e95-8412-4a3405de5127\" (UID: \"9658e996-cafd-4e95-8412-4a3405de5127\") " Dec 03 14:34:17 crc kubenswrapper[4751]: I1203 14:34:17.833022 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9658e996-cafd-4e95-8412-4a3405de5127-ovsdbserver-nb\") pod \"9658e996-cafd-4e95-8412-4a3405de5127\" (UID: \"9658e996-cafd-4e95-8412-4a3405de5127\") " Dec 03 14:34:17 crc kubenswrapper[4751]: I1203 14:34:17.833141 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q429k\" (UniqueName: \"kubernetes.io/projected/9658e996-cafd-4e95-8412-4a3405de5127-kube-api-access-q429k\") pod \"9658e996-cafd-4e95-8412-4a3405de5127\" (UID: \"9658e996-cafd-4e95-8412-4a3405de5127\") " Dec 03 14:34:17 crc kubenswrapper[4751]: I1203 14:34:17.833317 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9658e996-cafd-4e95-8412-4a3405de5127-config\") pod \"9658e996-cafd-4e95-8412-4a3405de5127\" (UID: \"9658e996-cafd-4e95-8412-4a3405de5127\") " Dec 03 14:34:17 crc kubenswrapper[4751]: I1203 14:34:17.833395 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9658e996-cafd-4e95-8412-4a3405de5127-ovsdbserver-sb\") pod \"9658e996-cafd-4e95-8412-4a3405de5127\" (UID: \"9658e996-cafd-4e95-8412-4a3405de5127\") " Dec 03 14:34:17 crc kubenswrapper[4751]: I1203 14:34:17.960560 4751 generic.go:334] "Generic (PLEG): container finished" podID="9658e996-cafd-4e95-8412-4a3405de5127" 
containerID="482c6a008a4f547371e797fa2bc28fc9f634ebb002ff1e4b3e7a5b983db3e202" exitCode=0 Dec 03 14:34:17 crc kubenswrapper[4751]: I1203 14:34:17.960633 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-b8cts" event={"ID":"9658e996-cafd-4e95-8412-4a3405de5127","Type":"ContainerDied","Data":"482c6a008a4f547371e797fa2bc28fc9f634ebb002ff1e4b3e7a5b983db3e202"} Dec 03 14:34:17 crc kubenswrapper[4751]: I1203 14:34:17.960685 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-b8cts" event={"ID":"9658e996-cafd-4e95-8412-4a3405de5127","Type":"ContainerDied","Data":"28a06f798f01c11bbac36a622e4279d07cd8aa5f237f232dc5246a8d8301c341"} Dec 03 14:34:17 crc kubenswrapper[4751]: I1203 14:34:17.960708 4751 scope.go:117] "RemoveContainer" containerID="482c6a008a4f547371e797fa2bc28fc9f634ebb002ff1e4b3e7a5b983db3e202" Dec 03 14:34:17 crc kubenswrapper[4751]: I1203 14:34:17.960653 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-b8cts" Dec 03 14:34:18 crc kubenswrapper[4751]: I1203 14:34:18.041658 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9658e996-cafd-4e95-8412-4a3405de5127-kube-api-access-q429k" (OuterVolumeSpecName: "kube-api-access-q429k") pod "9658e996-cafd-4e95-8412-4a3405de5127" (UID: "9658e996-cafd-4e95-8412-4a3405de5127"). InnerVolumeSpecName "kube-api-access-q429k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:34:18 crc kubenswrapper[4751]: I1203 14:34:18.062308 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9658e996-cafd-4e95-8412-4a3405de5127-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9658e996-cafd-4e95-8412-4a3405de5127" (UID: "9658e996-cafd-4e95-8412-4a3405de5127"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:34:18 crc kubenswrapper[4751]: I1203 14:34:18.062337 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9658e996-cafd-4e95-8412-4a3405de5127-config" (OuterVolumeSpecName: "config") pod "9658e996-cafd-4e95-8412-4a3405de5127" (UID: "9658e996-cafd-4e95-8412-4a3405de5127"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:34:18 crc kubenswrapper[4751]: I1203 14:34:18.072275 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9658e996-cafd-4e95-8412-4a3405de5127-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9658e996-cafd-4e95-8412-4a3405de5127" (UID: "9658e996-cafd-4e95-8412-4a3405de5127"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:34:18 crc kubenswrapper[4751]: I1203 14:34:18.073605 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9658e996-cafd-4e95-8412-4a3405de5127-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9658e996-cafd-4e95-8412-4a3405de5127" (UID: "9658e996-cafd-4e95-8412-4a3405de5127"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:34:18 crc kubenswrapper[4751]: I1203 14:34:18.138944 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9658e996-cafd-4e95-8412-4a3405de5127-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:18 crc kubenswrapper[4751]: I1203 14:34:18.138973 4751 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9658e996-cafd-4e95-8412-4a3405de5127-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:18 crc kubenswrapper[4751]: I1203 14:34:18.138986 4751 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9658e996-cafd-4e95-8412-4a3405de5127-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:18 crc kubenswrapper[4751]: I1203 14:34:18.138993 4751 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9658e996-cafd-4e95-8412-4a3405de5127-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:18 crc kubenswrapper[4751]: I1203 14:34:18.139002 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q429k\" (UniqueName: \"kubernetes.io/projected/9658e996-cafd-4e95-8412-4a3405de5127-kube-api-access-q429k\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:18 crc kubenswrapper[4751]: I1203 14:34:18.286960 4751 scope.go:117] "RemoveContainer" containerID="7f108f47b996a17de108176a8a8d9c781ac4a0dce762a4bb7cbe4e6edb01fa20" Dec 03 14:34:18 crc kubenswrapper[4751]: I1203 14:34:18.416892 4751 scope.go:117] "RemoveContainer" containerID="482c6a008a4f547371e797fa2bc28fc9f634ebb002ff1e4b3e7a5b983db3e202" Dec 03 14:34:18 crc kubenswrapper[4751]: E1203 14:34:18.417756 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"482c6a008a4f547371e797fa2bc28fc9f634ebb002ff1e4b3e7a5b983db3e202\": 
container with ID starting with 482c6a008a4f547371e797fa2bc28fc9f634ebb002ff1e4b3e7a5b983db3e202 not found: ID does not exist" containerID="482c6a008a4f547371e797fa2bc28fc9f634ebb002ff1e4b3e7a5b983db3e202" Dec 03 14:34:18 crc kubenswrapper[4751]: I1203 14:34:18.417784 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"482c6a008a4f547371e797fa2bc28fc9f634ebb002ff1e4b3e7a5b983db3e202"} err="failed to get container status \"482c6a008a4f547371e797fa2bc28fc9f634ebb002ff1e4b3e7a5b983db3e202\": rpc error: code = NotFound desc = could not find container \"482c6a008a4f547371e797fa2bc28fc9f634ebb002ff1e4b3e7a5b983db3e202\": container with ID starting with 482c6a008a4f547371e797fa2bc28fc9f634ebb002ff1e4b3e7a5b983db3e202 not found: ID does not exist" Dec 03 14:34:18 crc kubenswrapper[4751]: I1203 14:34:18.417803 4751 scope.go:117] "RemoveContainer" containerID="7f108f47b996a17de108176a8a8d9c781ac4a0dce762a4bb7cbe4e6edb01fa20" Dec 03 14:34:18 crc kubenswrapper[4751]: E1203 14:34:18.425515 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f108f47b996a17de108176a8a8d9c781ac4a0dce762a4bb7cbe4e6edb01fa20\": container with ID starting with 7f108f47b996a17de108176a8a8d9c781ac4a0dce762a4bb7cbe4e6edb01fa20 not found: ID does not exist" containerID="7f108f47b996a17de108176a8a8d9c781ac4a0dce762a4bb7cbe4e6edb01fa20" Dec 03 14:34:18 crc kubenswrapper[4751]: I1203 14:34:18.425569 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f108f47b996a17de108176a8a8d9c781ac4a0dce762a4bb7cbe4e6edb01fa20"} err="failed to get container status \"7f108f47b996a17de108176a8a8d9c781ac4a0dce762a4bb7cbe4e6edb01fa20\": rpc error: code = NotFound desc = could not find container \"7f108f47b996a17de108176a8a8d9c781ac4a0dce762a4bb7cbe4e6edb01fa20\": container with ID starting with 
7f108f47b996a17de108176a8a8d9c781ac4a0dce762a4bb7cbe4e6edb01fa20 not found: ID does not exist" Dec 03 14:34:18 crc kubenswrapper[4751]: I1203 14:34:18.434055 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-b8cts"] Dec 03 14:34:18 crc kubenswrapper[4751]: I1203 14:34:18.446491 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-b8cts"] Dec 03 14:34:18 crc kubenswrapper[4751]: I1203 14:34:18.768951 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-e7ba-account-create-update-nnclp"] Dec 03 14:34:18 crc kubenswrapper[4751]: I1203 14:34:18.777502 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-78w4f"] Dec 03 14:34:18 crc kubenswrapper[4751]: W1203 14:34:18.784361 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd58931ec_41c3_417e_b1b6_23d8855a0dbd.slice/crio-ba8bf71e302b902bb499ce2c7c9b7540baa3039ad0544edb1a3888af7d0d7a08 WatchSource:0}: Error finding container ba8bf71e302b902bb499ce2c7c9b7540baa3039ad0544edb1a3888af7d0d7a08: Status 404 returned error can't find the container with id ba8bf71e302b902bb499ce2c7c9b7540baa3039ad0544edb1a3888af7d0d7a08 Dec 03 14:34:18 crc kubenswrapper[4751]: W1203 14:34:18.787569 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5413bb7_fc5f_47b1_b2c1_03b66cab3b92.slice/crio-94a876559e0cc4c2554b83652d2817f9e47664b7c64e1680e9fd6d1c72abc353 WatchSource:0}: Error finding container 94a876559e0cc4c2554b83652d2817f9e47664b7c64e1680e9fd6d1c72abc353: Status 404 returned error can't find the container with id 94a876559e0cc4c2554b83652d2817f9e47664b7c64e1680e9fd6d1c72abc353 Dec 03 14:34:18 crc kubenswrapper[4751]: W1203 14:34:18.791888 4751 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27b5d97a_cd82_45be_ae36_bd97f293b7cd.slice/crio-9edcbd2668b39cad85f7dcbd842667eda18b4e5efafe3a1302dea745b280706c WatchSource:0}: Error finding container 9edcbd2668b39cad85f7dcbd842667eda18b4e5efafe3a1302dea745b280706c: Status 404 returned error can't find the container with id 9edcbd2668b39cad85f7dcbd842667eda18b4e5efafe3a1302dea745b280706c Dec 03 14:34:18 crc kubenswrapper[4751]: I1203 14:34:18.792271 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-70be-account-create-update-rq9wc"] Dec 03 14:34:18 crc kubenswrapper[4751]: I1203 14:34:18.816687 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rrw72"] Dec 03 14:34:18 crc kubenswrapper[4751]: W1203 14:34:18.826689 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod638fec0e_031b_4c73_828b_95157a9dd522.slice/crio-3b0113e9e530d9a519a1002915d185ff046b23e122d72cbb5ed334773e7f60b8 WatchSource:0}: Error finding container 3b0113e9e530d9a519a1002915d185ff046b23e122d72cbb5ed334773e7f60b8: Status 404 returned error can't find the container with id 3b0113e9e530d9a519a1002915d185ff046b23e122d72cbb5ed334773e7f60b8 Dec 03 14:34:18 crc kubenswrapper[4751]: I1203 14:34:18.970024 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-78w4f" event={"ID":"d58931ec-41c3-417e-b1b6-23d8855a0dbd","Type":"ContainerStarted","Data":"ba8bf71e302b902bb499ce2c7c9b7540baa3039ad0544edb1a3888af7d0d7a08"} Dec 03 14:34:18 crc kubenswrapper[4751]: I1203 14:34:18.971992 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9de5857f-8fe8-48e3-991b-7171fc510567","Type":"ContainerStarted","Data":"798a25e0fbe45250e4064ec82ae98343730da2aa5dc7e19451e9af976b8235f2"} Dec 03 14:34:18 crc kubenswrapper[4751]: I1203 14:34:18.973234 4751 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-e7ba-account-create-update-nnclp" event={"ID":"d5413bb7-fc5f-47b1-b2c1-03b66cab3b92","Type":"ContainerStarted","Data":"94a876559e0cc4c2554b83652d2817f9e47664b7c64e1680e9fd6d1c72abc353"} Dec 03 14:34:18 crc kubenswrapper[4751]: I1203 14:34:18.975110 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0046f111-cf94-402b-8981-659978aace04","Type":"ContainerStarted","Data":"56b0cbda322beafdd254d16272c46350b228c78b19a7c5a75137283d8a90fa8a"} Dec 03 14:34:18 crc kubenswrapper[4751]: I1203 14:34:18.976816 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rrw72" event={"ID":"638fec0e-031b-4c73-828b-95157a9dd522","Type":"ContainerStarted","Data":"3b0113e9e530d9a519a1002915d185ff046b23e122d72cbb5ed334773e7f60b8"} Dec 03 14:34:18 crc kubenswrapper[4751]: I1203 14:34:18.978141 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-68rzl" event={"ID":"0fedaa81-0c36-44fa-ab7b-b712759fc8d4","Type":"ContainerStarted","Data":"aaa8be54f51aea2b3f2622fd12cc233a7e8d74a81f3b2648602ff832bc8227cf"} Dec 03 14:34:18 crc kubenswrapper[4751]: I1203 14:34:18.980533 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-70be-account-create-update-rq9wc" event={"ID":"27b5d97a-cd82-45be-ae36-bd97f293b7cd","Type":"ContainerStarted","Data":"9edcbd2668b39cad85f7dcbd842667eda18b4e5efafe3a1302dea745b280706c"} Dec 03 14:34:18 crc kubenswrapper[4751]: I1203 14:34:18.998271 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=13.552238065 podStartE2EDuration="1m4.998253632s" podCreationTimestamp="2025-12-03 14:33:14 +0000 UTC" firstStartedPulling="2025-12-03 14:33:26.841144171 +0000 UTC m=+1213.829499388" lastFinishedPulling="2025-12-03 14:34:18.287159728 +0000 UTC m=+1265.275514955" 
observedRunningTime="2025-12-03 14:34:18.996241507 +0000 UTC m=+1265.984596734" watchObservedRunningTime="2025-12-03 14:34:18.998253632 +0000 UTC m=+1265.986608849" Dec 03 14:34:19 crc kubenswrapper[4751]: I1203 14:34:19.018851 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-68rzl" podStartSLOduration=3.258711261 podStartE2EDuration="12.018833939s" podCreationTimestamp="2025-12-03 14:34:07 +0000 UTC" firstStartedPulling="2025-12-03 14:34:08.914066538 +0000 UTC m=+1255.902421755" lastFinishedPulling="2025-12-03 14:34:17.674189216 +0000 UTC m=+1264.662544433" observedRunningTime="2025-12-03 14:34:19.014491899 +0000 UTC m=+1266.002847136" watchObservedRunningTime="2025-12-03 14:34:19.018833939 +0000 UTC m=+1266.007189156" Dec 03 14:34:19 crc kubenswrapper[4751]: I1203 14:34:19.326165 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9658e996-cafd-4e95-8412-4a3405de5127" path="/var/lib/kubelet/pods/9658e996-cafd-4e95-8412-4a3405de5127/volumes" Dec 03 14:34:19 crc kubenswrapper[4751]: I1203 14:34:19.449710 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-7q6w6"] Dec 03 14:34:19 crc kubenswrapper[4751]: E1203 14:34:19.450745 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9658e996-cafd-4e95-8412-4a3405de5127" containerName="init" Dec 03 14:34:19 crc kubenswrapper[4751]: I1203 14:34:19.450769 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="9658e996-cafd-4e95-8412-4a3405de5127" containerName="init" Dec 03 14:34:19 crc kubenswrapper[4751]: E1203 14:34:19.450808 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9658e996-cafd-4e95-8412-4a3405de5127" containerName="dnsmasq-dns" Dec 03 14:34:19 crc kubenswrapper[4751]: I1203 14:34:19.450817 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="9658e996-cafd-4e95-8412-4a3405de5127" containerName="dnsmasq-dns" Dec 03 14:34:19 crc kubenswrapper[4751]: 
I1203 14:34:19.451380 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="9658e996-cafd-4e95-8412-4a3405de5127" containerName="dnsmasq-dns" Dec 03 14:34:19 crc kubenswrapper[4751]: I1203 14:34:19.452626 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-7q6w6" Dec 03 14:34:19 crc kubenswrapper[4751]: I1203 14:34:19.474266 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-7q6w6"] Dec 03 14:34:19 crc kubenswrapper[4751]: I1203 14:34:19.526954 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-bddc-account-create-update-zv8f2"] Dec 03 14:34:19 crc kubenswrapper[4751]: I1203 14:34:19.528234 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bddc-account-create-update-zv8f2" Dec 03 14:34:19 crc kubenswrapper[4751]: I1203 14:34:19.530596 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 03 14:34:19 crc kubenswrapper[4751]: I1203 14:34:19.536534 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bddc-account-create-update-zv8f2"] Dec 03 14:34:19 crc kubenswrapper[4751]: I1203 14:34:19.582752 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1389c328-4d38-4b2a-b8ca-54e69fa59035-operator-scripts\") pod \"glance-db-create-7q6w6\" (UID: \"1389c328-4d38-4b2a-b8ca-54e69fa59035\") " pod="openstack/glance-db-create-7q6w6" Dec 03 14:34:19 crc kubenswrapper[4751]: I1203 14:34:19.582826 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjdmw\" (UniqueName: \"kubernetes.io/projected/1389c328-4d38-4b2a-b8ca-54e69fa59035-kube-api-access-gjdmw\") pod \"glance-db-create-7q6w6\" (UID: \"1389c328-4d38-4b2a-b8ca-54e69fa59035\") " 
pod="openstack/glance-db-create-7q6w6" Dec 03 14:34:19 crc kubenswrapper[4751]: I1203 14:34:19.684378 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjdmw\" (UniqueName: \"kubernetes.io/projected/1389c328-4d38-4b2a-b8ca-54e69fa59035-kube-api-access-gjdmw\") pod \"glance-db-create-7q6w6\" (UID: \"1389c328-4d38-4b2a-b8ca-54e69fa59035\") " pod="openstack/glance-db-create-7q6w6" Dec 03 14:34:19 crc kubenswrapper[4751]: I1203 14:34:19.684446 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24b70371-2bd9-44dc-a70a-c522ffb2125a-operator-scripts\") pod \"glance-bddc-account-create-update-zv8f2\" (UID: \"24b70371-2bd9-44dc-a70a-c522ffb2125a\") " pod="openstack/glance-bddc-account-create-update-zv8f2" Dec 03 14:34:19 crc kubenswrapper[4751]: I1203 14:34:19.684497 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tcwt\" (UniqueName: \"kubernetes.io/projected/24b70371-2bd9-44dc-a70a-c522ffb2125a-kube-api-access-8tcwt\") pod \"glance-bddc-account-create-update-zv8f2\" (UID: \"24b70371-2bd9-44dc-a70a-c522ffb2125a\") " pod="openstack/glance-bddc-account-create-update-zv8f2" Dec 03 14:34:19 crc kubenswrapper[4751]: I1203 14:34:19.684771 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1389c328-4d38-4b2a-b8ca-54e69fa59035-operator-scripts\") pod \"glance-db-create-7q6w6\" (UID: \"1389c328-4d38-4b2a-b8ca-54e69fa59035\") " pod="openstack/glance-db-create-7q6w6" Dec 03 14:34:19 crc kubenswrapper[4751]: I1203 14:34:19.685449 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1389c328-4d38-4b2a-b8ca-54e69fa59035-operator-scripts\") pod \"glance-db-create-7q6w6\" (UID: 
\"1389c328-4d38-4b2a-b8ca-54e69fa59035\") " pod="openstack/glance-db-create-7q6w6" Dec 03 14:34:19 crc kubenswrapper[4751]: I1203 14:34:19.702622 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjdmw\" (UniqueName: \"kubernetes.io/projected/1389c328-4d38-4b2a-b8ca-54e69fa59035-kube-api-access-gjdmw\") pod \"glance-db-create-7q6w6\" (UID: \"1389c328-4d38-4b2a-b8ca-54e69fa59035\") " pod="openstack/glance-db-create-7q6w6" Dec 03 14:34:19 crc kubenswrapper[4751]: I1203 14:34:19.781965 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-7q6w6" Dec 03 14:34:19 crc kubenswrapper[4751]: I1203 14:34:19.786190 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24b70371-2bd9-44dc-a70a-c522ffb2125a-operator-scripts\") pod \"glance-bddc-account-create-update-zv8f2\" (UID: \"24b70371-2bd9-44dc-a70a-c522ffb2125a\") " pod="openstack/glance-bddc-account-create-update-zv8f2" Dec 03 14:34:19 crc kubenswrapper[4751]: I1203 14:34:19.786257 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tcwt\" (UniqueName: \"kubernetes.io/projected/24b70371-2bd9-44dc-a70a-c522ffb2125a-kube-api-access-8tcwt\") pod \"glance-bddc-account-create-update-zv8f2\" (UID: \"24b70371-2bd9-44dc-a70a-c522ffb2125a\") " pod="openstack/glance-bddc-account-create-update-zv8f2" Dec 03 14:34:19 crc kubenswrapper[4751]: I1203 14:34:19.786872 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24b70371-2bd9-44dc-a70a-c522ffb2125a-operator-scripts\") pod \"glance-bddc-account-create-update-zv8f2\" (UID: \"24b70371-2bd9-44dc-a70a-c522ffb2125a\") " pod="openstack/glance-bddc-account-create-update-zv8f2" Dec 03 14:34:19 crc kubenswrapper[4751]: I1203 14:34:19.804899 4751 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8tcwt\" (UniqueName: \"kubernetes.io/projected/24b70371-2bd9-44dc-a70a-c522ffb2125a-kube-api-access-8tcwt\") pod \"glance-bddc-account-create-update-zv8f2\" (UID: \"24b70371-2bd9-44dc-a70a-c522ffb2125a\") " pod="openstack/glance-bddc-account-create-update-zv8f2" Dec 03 14:34:19 crc kubenswrapper[4751]: I1203 14:34:19.884517 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bddc-account-create-update-zv8f2" Dec 03 14:34:19 crc kubenswrapper[4751]: I1203 14:34:19.995517 4751 generic.go:334] "Generic (PLEG): container finished" podID="27b5d97a-cd82-45be-ae36-bd97f293b7cd" containerID="9111a215363d6793c77b2eae0f23c40c552f7da495991347f3e64e2098908928" exitCode=0 Dec 03 14:34:19 crc kubenswrapper[4751]: I1203 14:34:19.995594 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-70be-account-create-update-rq9wc" event={"ID":"27b5d97a-cd82-45be-ae36-bd97f293b7cd","Type":"ContainerDied","Data":"9111a215363d6793c77b2eae0f23c40c552f7da495991347f3e64e2098908928"} Dec 03 14:34:19 crc kubenswrapper[4751]: I1203 14:34:19.997021 4751 generic.go:334] "Generic (PLEG): container finished" podID="d5413bb7-fc5f-47b1-b2c1-03b66cab3b92" containerID="d68cad0cf59293845aa503e209787d751c7f3e42a159b872c3be012db60052e6" exitCode=0 Dec 03 14:34:19 crc kubenswrapper[4751]: I1203 14:34:19.997086 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-e7ba-account-create-update-nnclp" event={"ID":"d5413bb7-fc5f-47b1-b2c1-03b66cab3b92","Type":"ContainerDied","Data":"d68cad0cf59293845aa503e209787d751c7f3e42a159b872c3be012db60052e6"} Dec 03 14:34:19 crc kubenswrapper[4751]: I1203 14:34:19.998433 4751 generic.go:334] "Generic (PLEG): container finished" podID="638fec0e-031b-4c73-828b-95157a9dd522" containerID="280e5644e04e23f95863b6ce318429e83b3341651ad057c40c3a32e7d3dcb109" exitCode=0 Dec 03 14:34:19 crc kubenswrapper[4751]: I1203 14:34:19.998485 
4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rrw72" event={"ID":"638fec0e-031b-4c73-828b-95157a9dd522","Type":"ContainerDied","Data":"280e5644e04e23f95863b6ce318429e83b3341651ad057c40c3a32e7d3dcb109"} Dec 03 14:34:20 crc kubenswrapper[4751]: I1203 14:34:20.000351 4751 generic.go:334] "Generic (PLEG): container finished" podID="d58931ec-41c3-417e-b1b6-23d8855a0dbd" containerID="b49be2cd041e8658f56e42545f4ba8069c92543eb86398558c0acbcac2011e7b" exitCode=0 Dec 03 14:34:20 crc kubenswrapper[4751]: I1203 14:34:20.000606 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-78w4f" event={"ID":"d58931ec-41c3-417e-b1b6-23d8855a0dbd","Type":"ContainerDied","Data":"b49be2cd041e8658f56e42545f4ba8069c92543eb86398558c0acbcac2011e7b"} Dec 03 14:34:20 crc kubenswrapper[4751]: I1203 14:34:20.229232 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-7q6w6"] Dec 03 14:34:20 crc kubenswrapper[4751]: W1203 14:34:20.234992 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1389c328_4d38_4b2a_b8ca_54e69fa59035.slice/crio-431d56c624c9ce7cb7242bb60460bc6264f5cbf2a9d2ec0677ce3f05dd7727ce WatchSource:0}: Error finding container 431d56c624c9ce7cb7242bb60460bc6264f5cbf2a9d2ec0677ce3f05dd7727ce: Status 404 returned error can't find the container with id 431d56c624c9ce7cb7242bb60460bc6264f5cbf2a9d2ec0677ce3f05dd7727ce Dec 03 14:34:20 crc kubenswrapper[4751]: I1203 14:34:20.365522 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bddc-account-create-update-zv8f2"] Dec 03 14:34:20 crc kubenswrapper[4751]: W1203 14:34:20.373696 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24b70371_2bd9_44dc_a70a_c522ffb2125a.slice/crio-df1d6a14f83cf02e89a252d4779e5e6079c3af8163c7e73322080194f7040c2e 
WatchSource:0}: Error finding container df1d6a14f83cf02e89a252d4779e5e6079c3af8163c7e73322080194f7040c2e: Status 404 returned error can't find the container with id df1d6a14f83cf02e89a252d4779e5e6079c3af8163c7e73322080194f7040c2e Dec 03 14:34:20 crc kubenswrapper[4751]: I1203 14:34:20.482497 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 03 14:34:21 crc kubenswrapper[4751]: I1203 14:34:21.013673 4751 generic.go:334] "Generic (PLEG): container finished" podID="24b70371-2bd9-44dc-a70a-c522ffb2125a" containerID="3b38dca0d16b04e991c81cc6650ca0295fc09d83725385feb137aaaebcaece60" exitCode=0 Dec 03 14:34:21 crc kubenswrapper[4751]: I1203 14:34:21.013818 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bddc-account-create-update-zv8f2" event={"ID":"24b70371-2bd9-44dc-a70a-c522ffb2125a","Type":"ContainerDied","Data":"3b38dca0d16b04e991c81cc6650ca0295fc09d83725385feb137aaaebcaece60"} Dec 03 14:34:21 crc kubenswrapper[4751]: I1203 14:34:21.014109 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bddc-account-create-update-zv8f2" event={"ID":"24b70371-2bd9-44dc-a70a-c522ffb2125a","Type":"ContainerStarted","Data":"df1d6a14f83cf02e89a252d4779e5e6079c3af8163c7e73322080194f7040c2e"} Dec 03 14:34:21 crc kubenswrapper[4751]: I1203 14:34:21.017068 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7q6w6" event={"ID":"1389c328-4d38-4b2a-b8ca-54e69fa59035","Type":"ContainerStarted","Data":"13360b1a7e8e5c7cf3f680c1b3d28fa68e7da42d8b3488ab7040dfc6ff71dffb"} Dec 03 14:34:21 crc kubenswrapper[4751]: I1203 14:34:21.017128 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7q6w6" event={"ID":"1389c328-4d38-4b2a-b8ca-54e69fa59035","Type":"ContainerStarted","Data":"431d56c624c9ce7cb7242bb60460bc6264f5cbf2a9d2ec0677ce3f05dd7727ce"} Dec 03 14:34:21 crc kubenswrapper[4751]: I1203 14:34:21.052054 4751 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-7q6w6" podStartSLOduration=2.052034267 podStartE2EDuration="2.052034267s" podCreationTimestamp="2025-12-03 14:34:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:34:21.047757969 +0000 UTC m=+1268.036113186" watchObservedRunningTime="2025-12-03 14:34:21.052034267 +0000 UTC m=+1268.040389484" Dec 03 14:34:21 crc kubenswrapper[4751]: I1203 14:34:21.491065 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 03 14:34:21 crc kubenswrapper[4751]: I1203 14:34:21.513014 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-70be-account-create-update-rq9wc" Dec 03 14:34:21 crc kubenswrapper[4751]: I1203 14:34:21.560396 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 03 14:34:21 crc kubenswrapper[4751]: I1203 14:34:21.733926 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27b5d97a-cd82-45be-ae36-bd97f293b7cd-operator-scripts\") pod \"27b5d97a-cd82-45be-ae36-bd97f293b7cd\" (UID: \"27b5d97a-cd82-45be-ae36-bd97f293b7cd\") " Dec 03 14:34:21 crc kubenswrapper[4751]: I1203 14:34:21.734037 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trrsc\" (UniqueName: \"kubernetes.io/projected/27b5d97a-cd82-45be-ae36-bd97f293b7cd-kube-api-access-trrsc\") pod \"27b5d97a-cd82-45be-ae36-bd97f293b7cd\" (UID: \"27b5d97a-cd82-45be-ae36-bd97f293b7cd\") " Dec 03 14:34:21 crc kubenswrapper[4751]: I1203 14:34:21.734534 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27b5d97a-cd82-45be-ae36-bd97f293b7cd-operator-scripts" 
(OuterVolumeSpecName: "operator-scripts") pod "27b5d97a-cd82-45be-ae36-bd97f293b7cd" (UID: "27b5d97a-cd82-45be-ae36-bd97f293b7cd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:34:21 crc kubenswrapper[4751]: I1203 14:34:21.734884 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27b5d97a-cd82-45be-ae36-bd97f293b7cd-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:21 crc kubenswrapper[4751]: I1203 14:34:21.741176 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27b5d97a-cd82-45be-ae36-bd97f293b7cd-kube-api-access-trrsc" (OuterVolumeSpecName: "kube-api-access-trrsc") pod "27b5d97a-cd82-45be-ae36-bd97f293b7cd" (UID: "27b5d97a-cd82-45be-ae36-bd97f293b7cd"). InnerVolumeSpecName "kube-api-access-trrsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:34:21 crc kubenswrapper[4751]: I1203 14:34:21.767606 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-e7ba-account-create-update-nnclp" Dec 03 14:34:21 crc kubenswrapper[4751]: I1203 14:34:21.785026 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-78w4f" Dec 03 14:34:21 crc kubenswrapper[4751]: I1203 14:34:21.799379 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-rrw72" Dec 03 14:34:21 crc kubenswrapper[4751]: I1203 14:34:21.837578 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trrsc\" (UniqueName: \"kubernetes.io/projected/27b5d97a-cd82-45be-ae36-bd97f293b7cd-kube-api-access-trrsc\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:21 crc kubenswrapper[4751]: I1203 14:34:21.939409 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5413bb7-fc5f-47b1-b2c1-03b66cab3b92-operator-scripts\") pod \"d5413bb7-fc5f-47b1-b2c1-03b66cab3b92\" (UID: \"d5413bb7-fc5f-47b1-b2c1-03b66cab3b92\") " Dec 03 14:34:21 crc kubenswrapper[4751]: I1203 14:34:21.939472 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4brr7\" (UniqueName: \"kubernetes.io/projected/d58931ec-41c3-417e-b1b6-23d8855a0dbd-kube-api-access-4brr7\") pod \"d58931ec-41c3-417e-b1b6-23d8855a0dbd\" (UID: \"d58931ec-41c3-417e-b1b6-23d8855a0dbd\") " Dec 03 14:34:21 crc kubenswrapper[4751]: I1203 14:34:21.939555 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sp6gb\" (UniqueName: \"kubernetes.io/projected/d5413bb7-fc5f-47b1-b2c1-03b66cab3b92-kube-api-access-sp6gb\") pod \"d5413bb7-fc5f-47b1-b2c1-03b66cab3b92\" (UID: \"d5413bb7-fc5f-47b1-b2c1-03b66cab3b92\") " Dec 03 14:34:21 crc kubenswrapper[4751]: I1203 14:34:21.939573 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d58931ec-41c3-417e-b1b6-23d8855a0dbd-operator-scripts\") pod \"d58931ec-41c3-417e-b1b6-23d8855a0dbd\" (UID: \"d58931ec-41c3-417e-b1b6-23d8855a0dbd\") " Dec 03 14:34:21 crc kubenswrapper[4751]: I1203 14:34:21.939592 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xn4w\" (UniqueName: 
\"kubernetes.io/projected/638fec0e-031b-4c73-828b-95157a9dd522-kube-api-access-7xn4w\") pod \"638fec0e-031b-4c73-828b-95157a9dd522\" (UID: \"638fec0e-031b-4c73-828b-95157a9dd522\") " Dec 03 14:34:21 crc kubenswrapper[4751]: I1203 14:34:21.939699 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/638fec0e-031b-4c73-828b-95157a9dd522-operator-scripts\") pod \"638fec0e-031b-4c73-828b-95157a9dd522\" (UID: \"638fec0e-031b-4c73-828b-95157a9dd522\") " Dec 03 14:34:21 crc kubenswrapper[4751]: I1203 14:34:21.940417 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d58931ec-41c3-417e-b1b6-23d8855a0dbd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d58931ec-41c3-417e-b1b6-23d8855a0dbd" (UID: "d58931ec-41c3-417e-b1b6-23d8855a0dbd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:34:21 crc kubenswrapper[4751]: I1203 14:34:21.940424 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/638fec0e-031b-4c73-828b-95157a9dd522-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "638fec0e-031b-4c73-828b-95157a9dd522" (UID: "638fec0e-031b-4c73-828b-95157a9dd522"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:34:21 crc kubenswrapper[4751]: I1203 14:34:21.940654 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5413bb7-fc5f-47b1-b2c1-03b66cab3b92-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d5413bb7-fc5f-47b1-b2c1-03b66cab3b92" (UID: "d5413bb7-fc5f-47b1-b2c1-03b66cab3b92"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:34:21 crc kubenswrapper[4751]: I1203 14:34:21.945006 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d58931ec-41c3-417e-b1b6-23d8855a0dbd-kube-api-access-4brr7" (OuterVolumeSpecName: "kube-api-access-4brr7") pod "d58931ec-41c3-417e-b1b6-23d8855a0dbd" (UID: "d58931ec-41c3-417e-b1b6-23d8855a0dbd"). InnerVolumeSpecName "kube-api-access-4brr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:34:21 crc kubenswrapper[4751]: I1203 14:34:21.945465 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/638fec0e-031b-4c73-828b-95157a9dd522-kube-api-access-7xn4w" (OuterVolumeSpecName: "kube-api-access-7xn4w") pod "638fec0e-031b-4c73-828b-95157a9dd522" (UID: "638fec0e-031b-4c73-828b-95157a9dd522"). InnerVolumeSpecName "kube-api-access-7xn4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:34:21 crc kubenswrapper[4751]: I1203 14:34:21.948819 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5413bb7-fc5f-47b1-b2c1-03b66cab3b92-kube-api-access-sp6gb" (OuterVolumeSpecName: "kube-api-access-sp6gb") pod "d5413bb7-fc5f-47b1-b2c1-03b66cab3b92" (UID: "d5413bb7-fc5f-47b1-b2c1-03b66cab3b92"). InnerVolumeSpecName "kube-api-access-sp6gb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:34:22 crc kubenswrapper[4751]: I1203 14:34:22.027058 4751 generic.go:334] "Generic (PLEG): container finished" podID="1389c328-4d38-4b2a-b8ca-54e69fa59035" containerID="13360b1a7e8e5c7cf3f680c1b3d28fa68e7da42d8b3488ab7040dfc6ff71dffb" exitCode=0 Dec 03 14:34:22 crc kubenswrapper[4751]: I1203 14:34:22.027159 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7q6w6" event={"ID":"1389c328-4d38-4b2a-b8ca-54e69fa59035","Type":"ContainerDied","Data":"13360b1a7e8e5c7cf3f680c1b3d28fa68e7da42d8b3488ab7040dfc6ff71dffb"} Dec 03 14:34:22 crc kubenswrapper[4751]: I1203 14:34:22.029013 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-70be-account-create-update-rq9wc" event={"ID":"27b5d97a-cd82-45be-ae36-bd97f293b7cd","Type":"ContainerDied","Data":"9edcbd2668b39cad85f7dcbd842667eda18b4e5efafe3a1302dea745b280706c"} Dec 03 14:34:22 crc kubenswrapper[4751]: I1203 14:34:22.029044 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9edcbd2668b39cad85f7dcbd842667eda18b4e5efafe3a1302dea745b280706c" Dec 03 14:34:22 crc kubenswrapper[4751]: I1203 14:34:22.029052 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-70be-account-create-update-rq9wc" Dec 03 14:34:22 crc kubenswrapper[4751]: I1203 14:34:22.038099 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-e7ba-account-create-update-nnclp" Dec 03 14:34:22 crc kubenswrapper[4751]: I1203 14:34:22.038105 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-e7ba-account-create-update-nnclp" event={"ID":"d5413bb7-fc5f-47b1-b2c1-03b66cab3b92","Type":"ContainerDied","Data":"94a876559e0cc4c2554b83652d2817f9e47664b7c64e1680e9fd6d1c72abc353"} Dec 03 14:34:22 crc kubenswrapper[4751]: I1203 14:34:22.038263 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94a876559e0cc4c2554b83652d2817f9e47664b7c64e1680e9fd6d1c72abc353" Dec 03 14:34:22 crc kubenswrapper[4751]: I1203 14:34:22.042423 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4brr7\" (UniqueName: \"kubernetes.io/projected/d58931ec-41c3-417e-b1b6-23d8855a0dbd-kube-api-access-4brr7\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:22 crc kubenswrapper[4751]: I1203 14:34:22.042457 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sp6gb\" (UniqueName: \"kubernetes.io/projected/d5413bb7-fc5f-47b1-b2c1-03b66cab3b92-kube-api-access-sp6gb\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:22 crc kubenswrapper[4751]: I1203 14:34:22.042468 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xn4w\" (UniqueName: \"kubernetes.io/projected/638fec0e-031b-4c73-828b-95157a9dd522-kube-api-access-7xn4w\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:22 crc kubenswrapper[4751]: I1203 14:34:22.042478 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d58931ec-41c3-417e-b1b6-23d8855a0dbd-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:22 crc kubenswrapper[4751]: I1203 14:34:22.042487 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/638fec0e-031b-4c73-828b-95157a9dd522-operator-scripts\") on node \"crc\" 
DevicePath \"\"" Dec 03 14:34:22 crc kubenswrapper[4751]: I1203 14:34:22.042496 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5413bb7-fc5f-47b1-b2c1-03b66cab3b92-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:22 crc kubenswrapper[4751]: I1203 14:34:22.042671 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rrw72" event={"ID":"638fec0e-031b-4c73-828b-95157a9dd522","Type":"ContainerDied","Data":"3b0113e9e530d9a519a1002915d185ff046b23e122d72cbb5ed334773e7f60b8"} Dec 03 14:34:22 crc kubenswrapper[4751]: I1203 14:34:22.042698 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b0113e9e530d9a519a1002915d185ff046b23e122d72cbb5ed334773e7f60b8" Dec 03 14:34:22 crc kubenswrapper[4751]: I1203 14:34:22.042759 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rrw72" Dec 03 14:34:22 crc kubenswrapper[4751]: I1203 14:34:22.046217 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-78w4f" Dec 03 14:34:22 crc kubenswrapper[4751]: I1203 14:34:22.046211 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-78w4f" event={"ID":"d58931ec-41c3-417e-b1b6-23d8855a0dbd","Type":"ContainerDied","Data":"ba8bf71e302b902bb499ce2c7c9b7540baa3039ad0544edb1a3888af7d0d7a08"} Dec 03 14:34:22 crc kubenswrapper[4751]: I1203 14:34:22.046334 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba8bf71e302b902bb499ce2c7c9b7540baa3039ad0544edb1a3888af7d0d7a08" Dec 03 14:34:22 crc kubenswrapper[4751]: I1203 14:34:22.048514 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9de5857f-8fe8-48e3-991b-7171fc510567","Type":"ContainerStarted","Data":"91d1936becde3283c4719c7813914c6efc7a7a65473e740d04ffad967792d0bd"} Dec 03 14:34:22 crc kubenswrapper[4751]: I1203 14:34:22.335230 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-bddc-account-create-update-zv8f2" Dec 03 14:34:22 crc kubenswrapper[4751]: I1203 14:34:22.449664 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tcwt\" (UniqueName: \"kubernetes.io/projected/24b70371-2bd9-44dc-a70a-c522ffb2125a-kube-api-access-8tcwt\") pod \"24b70371-2bd9-44dc-a70a-c522ffb2125a\" (UID: \"24b70371-2bd9-44dc-a70a-c522ffb2125a\") " Dec 03 14:34:22 crc kubenswrapper[4751]: I1203 14:34:22.449852 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24b70371-2bd9-44dc-a70a-c522ffb2125a-operator-scripts\") pod \"24b70371-2bd9-44dc-a70a-c522ffb2125a\" (UID: \"24b70371-2bd9-44dc-a70a-c522ffb2125a\") " Dec 03 14:34:22 crc kubenswrapper[4751]: I1203 14:34:22.450407 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24b70371-2bd9-44dc-a70a-c522ffb2125a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "24b70371-2bd9-44dc-a70a-c522ffb2125a" (UID: "24b70371-2bd9-44dc-a70a-c522ffb2125a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:34:22 crc kubenswrapper[4751]: I1203 14:34:22.454357 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24b70371-2bd9-44dc-a70a-c522ffb2125a-kube-api-access-8tcwt" (OuterVolumeSpecName: "kube-api-access-8tcwt") pod "24b70371-2bd9-44dc-a70a-c522ffb2125a" (UID: "24b70371-2bd9-44dc-a70a-c522ffb2125a"). InnerVolumeSpecName "kube-api-access-8tcwt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:34:22 crc kubenswrapper[4751]: I1203 14:34:22.455237 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24b70371-2bd9-44dc-a70a-c522ffb2125a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:22 crc kubenswrapper[4751]: I1203 14:34:22.455275 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tcwt\" (UniqueName: \"kubernetes.io/projected/24b70371-2bd9-44dc-a70a-c522ffb2125a-kube-api-access-8tcwt\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:23 crc kubenswrapper[4751]: I1203 14:34:23.058353 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bddc-account-create-update-zv8f2" event={"ID":"24b70371-2bd9-44dc-a70a-c522ffb2125a","Type":"ContainerDied","Data":"df1d6a14f83cf02e89a252d4779e5e6079c3af8163c7e73322080194f7040c2e"} Dec 03 14:34:23 crc kubenswrapper[4751]: I1203 14:34:23.058713 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df1d6a14f83cf02e89a252d4779e5e6079c3af8163c7e73322080194f7040c2e" Dec 03 14:34:23 crc kubenswrapper[4751]: I1203 14:34:23.058452 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-bddc-account-create-update-zv8f2" Dec 03 14:34:23 crc kubenswrapper[4751]: I1203 14:34:23.371938 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2-etc-swift\") pod \"swift-storage-0\" (UID: \"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2\") " pod="openstack/swift-storage-0" Dec 03 14:34:23 crc kubenswrapper[4751]: E1203 14:34:23.372150 4751 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 14:34:23 crc kubenswrapper[4751]: E1203 14:34:23.372168 4751 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 14:34:23 crc kubenswrapper[4751]: E1203 14:34:23.372215 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2-etc-swift podName:a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2 nodeName:}" failed. No retries permitted until 2025-12-03 14:34:39.372197332 +0000 UTC m=+1286.360552549 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2-etc-swift") pod "swift-storage-0" (UID: "a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2") : configmap "swift-ring-files" not found Dec 03 14:34:23 crc kubenswrapper[4751]: I1203 14:34:23.442868 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-7q6w6" Dec 03 14:34:23 crc kubenswrapper[4751]: I1203 14:34:23.574826 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1389c328-4d38-4b2a-b8ca-54e69fa59035-operator-scripts\") pod \"1389c328-4d38-4b2a-b8ca-54e69fa59035\" (UID: \"1389c328-4d38-4b2a-b8ca-54e69fa59035\") " Dec 03 14:34:23 crc kubenswrapper[4751]: I1203 14:34:23.575161 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjdmw\" (UniqueName: \"kubernetes.io/projected/1389c328-4d38-4b2a-b8ca-54e69fa59035-kube-api-access-gjdmw\") pod \"1389c328-4d38-4b2a-b8ca-54e69fa59035\" (UID: \"1389c328-4d38-4b2a-b8ca-54e69fa59035\") " Dec 03 14:34:23 crc kubenswrapper[4751]: I1203 14:34:23.576045 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1389c328-4d38-4b2a-b8ca-54e69fa59035-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1389c328-4d38-4b2a-b8ca-54e69fa59035" (UID: "1389c328-4d38-4b2a-b8ca-54e69fa59035"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:34:23 crc kubenswrapper[4751]: I1203 14:34:23.581100 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1389c328-4d38-4b2a-b8ca-54e69fa59035-kube-api-access-gjdmw" (OuterVolumeSpecName: "kube-api-access-gjdmw") pod "1389c328-4d38-4b2a-b8ca-54e69fa59035" (UID: "1389c328-4d38-4b2a-b8ca-54e69fa59035"). InnerVolumeSpecName "kube-api-access-gjdmw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:34:23 crc kubenswrapper[4751]: I1203 14:34:23.677254 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjdmw\" (UniqueName: \"kubernetes.io/projected/1389c328-4d38-4b2a-b8ca-54e69fa59035-kube-api-access-gjdmw\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:23 crc kubenswrapper[4751]: I1203 14:34:23.677293 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1389c328-4d38-4b2a-b8ca-54e69fa59035-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:24 crc kubenswrapper[4751]: I1203 14:34:24.068040 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7q6w6" event={"ID":"1389c328-4d38-4b2a-b8ca-54e69fa59035","Type":"ContainerDied","Data":"431d56c624c9ce7cb7242bb60460bc6264f5cbf2a9d2ec0677ce3f05dd7727ce"} Dec 03 14:34:24 crc kubenswrapper[4751]: I1203 14:34:24.068079 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="431d56c624c9ce7cb7242bb60460bc6264f5cbf2a9d2ec0677ce3f05dd7727ce" Dec 03 14:34:24 crc kubenswrapper[4751]: I1203 14:34:24.068100 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-7q6w6" Dec 03 14:34:24 crc kubenswrapper[4751]: I1203 14:34:24.581002 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-xg9ml" Dec 03 14:34:24 crc kubenswrapper[4751]: I1203 14:34:24.763298 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-querier-548665d79b-8226l" Dec 03 14:34:24 crc kubenswrapper[4751]: I1203 14:34:24.858099 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-r2j44" Dec 03 14:34:25 crc kubenswrapper[4751]: I1203 14:34:25.076815 4751 generic.go:334] "Generic (PLEG): container finished" podID="e1da8e9b-0799-4327-9e24-216c4a51fde2" containerID="5f35e9d06325ea3615762c01e04cf56308efaa687e0631b5a77396e8c64782f7" exitCode=0 Dec 03 14:34:25 crc kubenswrapper[4751]: I1203 14:34:25.076853 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e1da8e9b-0799-4327-9e24-216c4a51fde2","Type":"ContainerDied","Data":"5f35e9d06325ea3615762c01e04cf56308efaa687e0631b5a77396e8c64782f7"} Dec 03 14:34:25 crc kubenswrapper[4751]: I1203 14:34:25.522810 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 03 14:34:25 crc kubenswrapper[4751]: I1203 14:34:25.737747 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="04053d51-dddf-43e3-a230-9ac729dec435" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 03 14:34:25 crc kubenswrapper[4751]: I1203 14:34:25.789767 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 03 14:34:25 crc kubenswrapper[4751]: E1203 14:34:25.790100 4751 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="27b5d97a-cd82-45be-ae36-bd97f293b7cd" containerName="mariadb-account-create-update" Dec 03 14:34:25 crc kubenswrapper[4751]: I1203 14:34:25.790116 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="27b5d97a-cd82-45be-ae36-bd97f293b7cd" containerName="mariadb-account-create-update" Dec 03 14:34:25 crc kubenswrapper[4751]: E1203 14:34:25.790130 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b70371-2bd9-44dc-a70a-c522ffb2125a" containerName="mariadb-account-create-update" Dec 03 14:34:25 crc kubenswrapper[4751]: I1203 14:34:25.790138 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b70371-2bd9-44dc-a70a-c522ffb2125a" containerName="mariadb-account-create-update" Dec 03 14:34:25 crc kubenswrapper[4751]: E1203 14:34:25.790155 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d58931ec-41c3-417e-b1b6-23d8855a0dbd" containerName="mariadb-database-create" Dec 03 14:34:25 crc kubenswrapper[4751]: I1203 14:34:25.790161 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="d58931ec-41c3-417e-b1b6-23d8855a0dbd" containerName="mariadb-database-create" Dec 03 14:34:25 crc kubenswrapper[4751]: E1203 14:34:25.790170 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="638fec0e-031b-4c73-828b-95157a9dd522" containerName="mariadb-database-create" Dec 03 14:34:25 crc kubenswrapper[4751]: I1203 14:34:25.790177 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="638fec0e-031b-4c73-828b-95157a9dd522" containerName="mariadb-database-create" Dec 03 14:34:25 crc kubenswrapper[4751]: E1203 14:34:25.790187 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1389c328-4d38-4b2a-b8ca-54e69fa59035" containerName="mariadb-database-create" Dec 03 14:34:25 crc kubenswrapper[4751]: I1203 14:34:25.790194 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="1389c328-4d38-4b2a-b8ca-54e69fa59035" containerName="mariadb-database-create" Dec 03 14:34:25 crc 
kubenswrapper[4751]: E1203 14:34:25.790205 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5413bb7-fc5f-47b1-b2c1-03b66cab3b92" containerName="mariadb-account-create-update" Dec 03 14:34:25 crc kubenswrapper[4751]: I1203 14:34:25.790211 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5413bb7-fc5f-47b1-b2c1-03b66cab3b92" containerName="mariadb-account-create-update" Dec 03 14:34:25 crc kubenswrapper[4751]: I1203 14:34:25.790403 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b70371-2bd9-44dc-a70a-c522ffb2125a" containerName="mariadb-account-create-update" Dec 03 14:34:25 crc kubenswrapper[4751]: I1203 14:34:25.790421 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="638fec0e-031b-4c73-828b-95157a9dd522" containerName="mariadb-database-create" Dec 03 14:34:25 crc kubenswrapper[4751]: I1203 14:34:25.790430 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="27b5d97a-cd82-45be-ae36-bd97f293b7cd" containerName="mariadb-account-create-update" Dec 03 14:34:25 crc kubenswrapper[4751]: I1203 14:34:25.790440 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="d58931ec-41c3-417e-b1b6-23d8855a0dbd" containerName="mariadb-database-create" Dec 03 14:34:25 crc kubenswrapper[4751]: I1203 14:34:25.790447 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5413bb7-fc5f-47b1-b2c1-03b66cab3b92" containerName="mariadb-account-create-update" Dec 03 14:34:25 crc kubenswrapper[4751]: I1203 14:34:25.790457 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="1389c328-4d38-4b2a-b8ca-54e69fa59035" containerName="mariadb-database-create" Dec 03 14:34:25 crc kubenswrapper[4751]: I1203 14:34:25.791377 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 03 14:34:25 crc kubenswrapper[4751]: I1203 14:34:25.796718 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 03 14:34:25 crc kubenswrapper[4751]: I1203 14:34:25.796896 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 03 14:34:25 crc kubenswrapper[4751]: I1203 14:34:25.797005 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-d59gt" Dec 03 14:34:25 crc kubenswrapper[4751]: I1203 14:34:25.797117 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 03 14:34:25 crc kubenswrapper[4751]: I1203 14:34:25.815732 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 03 14:34:25 crc kubenswrapper[4751]: I1203 14:34:25.846549 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 14:34:25 crc kubenswrapper[4751]: I1203 14:34:25.931883 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7ffebbc-a033-4a04-a133-d90456a57881-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e7ffebbc-a033-4a04-a133-d90456a57881\") " pod="openstack/ovn-northd-0" Dec 03 14:34:25 crc kubenswrapper[4751]: I1203 14:34:25.931928 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7ffebbc-a033-4a04-a133-d90456a57881-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e7ffebbc-a033-4a04-a133-d90456a57881\") " pod="openstack/ovn-northd-0" Dec 03 14:34:25 crc kubenswrapper[4751]: I1203 14:34:25.931982 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/e7ffebbc-a033-4a04-a133-d90456a57881-config\") pod \"ovn-northd-0\" (UID: \"e7ffebbc-a033-4a04-a133-d90456a57881\") " pod="openstack/ovn-northd-0" Dec 03 14:34:25 crc kubenswrapper[4751]: I1203 14:34:25.932002 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7ffebbc-a033-4a04-a133-d90456a57881-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e7ffebbc-a033-4a04-a133-d90456a57881\") " pod="openstack/ovn-northd-0" Dec 03 14:34:25 crc kubenswrapper[4751]: I1203 14:34:25.932023 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e7ffebbc-a033-4a04-a133-d90456a57881-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"e7ffebbc-a033-4a04-a133-d90456a57881\") " pod="openstack/ovn-northd-0" Dec 03 14:34:25 crc kubenswrapper[4751]: I1203 14:34:25.932054 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7gg2\" (UniqueName: \"kubernetes.io/projected/e7ffebbc-a033-4a04-a133-d90456a57881-kube-api-access-w7gg2\") pod \"ovn-northd-0\" (UID: \"e7ffebbc-a033-4a04-a133-d90456a57881\") " pod="openstack/ovn-northd-0" Dec 03 14:34:25 crc kubenswrapper[4751]: I1203 14:34:25.932076 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7ffebbc-a033-4a04-a133-d90456a57881-scripts\") pod \"ovn-northd-0\" (UID: \"e7ffebbc-a033-4a04-a133-d90456a57881\") " pod="openstack/ovn-northd-0" Dec 03 14:34:26 crc kubenswrapper[4751]: I1203 14:34:26.034391 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7ffebbc-a033-4a04-a133-d90456a57881-metrics-certs-tls-certs\") pod \"ovn-northd-0\" 
(UID: \"e7ffebbc-a033-4a04-a133-d90456a57881\") " pod="openstack/ovn-northd-0" Dec 03 14:34:26 crc kubenswrapper[4751]: I1203 14:34:26.034440 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7ffebbc-a033-4a04-a133-d90456a57881-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e7ffebbc-a033-4a04-a133-d90456a57881\") " pod="openstack/ovn-northd-0" Dec 03 14:34:26 crc kubenswrapper[4751]: I1203 14:34:26.034495 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7ffebbc-a033-4a04-a133-d90456a57881-config\") pod \"ovn-northd-0\" (UID: \"e7ffebbc-a033-4a04-a133-d90456a57881\") " pod="openstack/ovn-northd-0" Dec 03 14:34:26 crc kubenswrapper[4751]: I1203 14:34:26.034516 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7ffebbc-a033-4a04-a133-d90456a57881-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e7ffebbc-a033-4a04-a133-d90456a57881\") " pod="openstack/ovn-northd-0" Dec 03 14:34:26 crc kubenswrapper[4751]: I1203 14:34:26.034543 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e7ffebbc-a033-4a04-a133-d90456a57881-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"e7ffebbc-a033-4a04-a133-d90456a57881\") " pod="openstack/ovn-northd-0" Dec 03 14:34:26 crc kubenswrapper[4751]: I1203 14:34:26.034578 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7gg2\" (UniqueName: \"kubernetes.io/projected/e7ffebbc-a033-4a04-a133-d90456a57881-kube-api-access-w7gg2\") pod \"ovn-northd-0\" (UID: \"e7ffebbc-a033-4a04-a133-d90456a57881\") " pod="openstack/ovn-northd-0" Dec 03 14:34:26 crc kubenswrapper[4751]: I1203 14:34:26.034608 4751 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7ffebbc-a033-4a04-a133-d90456a57881-scripts\") pod \"ovn-northd-0\" (UID: \"e7ffebbc-a033-4a04-a133-d90456a57881\") " pod="openstack/ovn-northd-0" Dec 03 14:34:26 crc kubenswrapper[4751]: I1203 14:34:26.038433 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7ffebbc-a033-4a04-a133-d90456a57881-config\") pod \"ovn-northd-0\" (UID: \"e7ffebbc-a033-4a04-a133-d90456a57881\") " pod="openstack/ovn-northd-0" Dec 03 14:34:26 crc kubenswrapper[4751]: I1203 14:34:26.038898 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e7ffebbc-a033-4a04-a133-d90456a57881-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"e7ffebbc-a033-4a04-a133-d90456a57881\") " pod="openstack/ovn-northd-0" Dec 03 14:34:26 crc kubenswrapper[4751]: I1203 14:34:26.039699 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7ffebbc-a033-4a04-a133-d90456a57881-scripts\") pod \"ovn-northd-0\" (UID: \"e7ffebbc-a033-4a04-a133-d90456a57881\") " pod="openstack/ovn-northd-0" Dec 03 14:34:26 crc kubenswrapper[4751]: I1203 14:34:26.060072 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7ffebbc-a033-4a04-a133-d90456a57881-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e7ffebbc-a033-4a04-a133-d90456a57881\") " pod="openstack/ovn-northd-0" Dec 03 14:34:26 crc kubenswrapper[4751]: I1203 14:34:26.063884 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7ffebbc-a033-4a04-a133-d90456a57881-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e7ffebbc-a033-4a04-a133-d90456a57881\") " pod="openstack/ovn-northd-0" Dec 03 14:34:26 crc kubenswrapper[4751]: I1203 
14:34:26.067120 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7ffebbc-a033-4a04-a133-d90456a57881-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e7ffebbc-a033-4a04-a133-d90456a57881\") " pod="openstack/ovn-northd-0" Dec 03 14:34:26 crc kubenswrapper[4751]: I1203 14:34:26.083052 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7gg2\" (UniqueName: \"kubernetes.io/projected/e7ffebbc-a033-4a04-a133-d90456a57881-kube-api-access-w7gg2\") pod \"ovn-northd-0\" (UID: \"e7ffebbc-a033-4a04-a133-d90456a57881\") " pod="openstack/ovn-northd-0" Dec 03 14:34:26 crc kubenswrapper[4751]: I1203 14:34:26.118411 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e1da8e9b-0799-4327-9e24-216c4a51fde2","Type":"ContainerStarted","Data":"a8c83b1d5f1a85ae4bfd25ddb57357a3145271a6c2fb53b90694087de6ba2f1d"} Dec 03 14:34:26 crc kubenswrapper[4751]: I1203 14:34:26.118732 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:34:26 crc kubenswrapper[4751]: I1203 14:34:26.120819 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 03 14:34:26 crc kubenswrapper[4751]: I1203 14:34:26.168377 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=63.987675186 podStartE2EDuration="1m27.168355943s" podCreationTimestamp="2025-12-03 14:32:59 +0000 UTC" firstStartedPulling="2025-12-03 14:33:26.763068559 +0000 UTC m=+1213.751423776" lastFinishedPulling="2025-12-03 14:33:49.943749316 +0000 UTC m=+1236.932104533" observedRunningTime="2025-12-03 14:34:26.161427103 +0000 UTC m=+1273.149782320" watchObservedRunningTime="2025-12-03 14:34:26.168355943 +0000 UTC m=+1273.156711160" Dec 03 14:34:26 crc kubenswrapper[4751]: I1203 14:34:26.626804 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 03 14:34:28 crc kubenswrapper[4751]: I1203 14:34:28.135126 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9de5857f-8fe8-48e3-991b-7171fc510567","Type":"ContainerStarted","Data":"c9113d83ad6a943678f3e8746b6c9d9d26a4651e2c15e3aaa6751ba3662cde95"} Dec 03 14:34:28 crc kubenswrapper[4751]: I1203 14:34:28.137645 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e7ffebbc-a033-4a04-a133-d90456a57881","Type":"ContainerStarted","Data":"af358715945665418669c94e7ac3a77dd317650273cb6f8bdae8b830279a3c6d"} Dec 03 14:34:28 crc kubenswrapper[4751]: I1203 14:34:28.170042 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=21.276307972 podStartE2EDuration="1m22.170019112s" podCreationTimestamp="2025-12-03 14:33:06 +0000 UTC" firstStartedPulling="2025-12-03 14:33:26.789994821 +0000 UTC m=+1213.778350038" lastFinishedPulling="2025-12-03 14:34:27.683705961 +0000 UTC m=+1274.672061178" observedRunningTime="2025-12-03 14:34:28.16195883 +0000 UTC m=+1275.150314057" 
watchObservedRunningTime="2025-12-03 14:34:28.170019112 +0000 UTC m=+1275.158374329" Dec 03 14:34:29 crc kubenswrapper[4751]: I1203 14:34:29.146995 4751 generic.go:334] "Generic (PLEG): container finished" podID="0fedaa81-0c36-44fa-ab7b-b712759fc8d4" containerID="aaa8be54f51aea2b3f2622fd12cc233a7e8d74a81f3b2648602ff832bc8227cf" exitCode=0 Dec 03 14:34:29 crc kubenswrapper[4751]: I1203 14:34:29.147095 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-68rzl" event={"ID":"0fedaa81-0c36-44fa-ab7b-b712759fc8d4","Type":"ContainerDied","Data":"aaa8be54f51aea2b3f2622fd12cc233a7e8d74a81f3b2648602ff832bc8227cf"} Dec 03 14:34:29 crc kubenswrapper[4751]: I1203 14:34:29.150009 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e7ffebbc-a033-4a04-a133-d90456a57881","Type":"ContainerStarted","Data":"9f7b19917b8fe0a9e2b985e41dc220bca8ffbe979487e08032985d003add22ca"} Dec 03 14:34:29 crc kubenswrapper[4751]: I1203 14:34:29.150051 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e7ffebbc-a033-4a04-a133-d90456a57881","Type":"ContainerStarted","Data":"aa20c46b71ec9e720c0759b1a60a88c315977cd9b0d69cf96e9474189fc66d82"} Dec 03 14:34:29 crc kubenswrapper[4751]: I1203 14:34:29.150418 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 03 14:34:29 crc kubenswrapper[4751]: I1203 14:34:29.188406 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.206744714 podStartE2EDuration="4.188386565s" podCreationTimestamp="2025-12-03 14:34:25 +0000 UTC" firstStartedPulling="2025-12-03 14:34:27.603928783 +0000 UTC m=+1274.592284000" lastFinishedPulling="2025-12-03 14:34:28.585570634 +0000 UTC m=+1275.573925851" observedRunningTime="2025-12-03 14:34:29.184119448 +0000 UTC m=+1276.172474665" watchObservedRunningTime="2025-12-03 14:34:29.188386565 +0000 
UTC m=+1276.176741782" Dec 03 14:34:29 crc kubenswrapper[4751]: I1203 14:34:29.577901 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-hmq8c"] Dec 03 14:34:29 crc kubenswrapper[4751]: I1203 14:34:29.579268 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hmq8c" Dec 03 14:34:29 crc kubenswrapper[4751]: I1203 14:34:29.583139 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 03 14:34:29 crc kubenswrapper[4751]: I1203 14:34:29.583797 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-zbls2" Dec 03 14:34:29 crc kubenswrapper[4751]: I1203 14:34:29.595232 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-hmq8c"] Dec 03 14:34:29 crc kubenswrapper[4751]: I1203 14:34:29.703090 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ea81b69-95de-4772-b7bf-d48f52c298b1-config-data\") pod \"glance-db-sync-hmq8c\" (UID: \"5ea81b69-95de-4772-b7bf-d48f52c298b1\") " pod="openstack/glance-db-sync-hmq8c" Dec 03 14:34:29 crc kubenswrapper[4751]: I1203 14:34:29.703187 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mbrx\" (UniqueName: \"kubernetes.io/projected/5ea81b69-95de-4772-b7bf-d48f52c298b1-kube-api-access-7mbrx\") pod \"glance-db-sync-hmq8c\" (UID: \"5ea81b69-95de-4772-b7bf-d48f52c298b1\") " pod="openstack/glance-db-sync-hmq8c" Dec 03 14:34:29 crc kubenswrapper[4751]: I1203 14:34:29.703222 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5ea81b69-95de-4772-b7bf-d48f52c298b1-db-sync-config-data\") pod \"glance-db-sync-hmq8c\" (UID: \"5ea81b69-95de-4772-b7bf-d48f52c298b1\") " 
pod="openstack/glance-db-sync-hmq8c" Dec 03 14:34:29 crc kubenswrapper[4751]: I1203 14:34:29.703259 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ea81b69-95de-4772-b7bf-d48f52c298b1-combined-ca-bundle\") pod \"glance-db-sync-hmq8c\" (UID: \"5ea81b69-95de-4772-b7bf-d48f52c298b1\") " pod="openstack/glance-db-sync-hmq8c" Dec 03 14:34:29 crc kubenswrapper[4751]: I1203 14:34:29.771638 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-lqzrd" podUID="7ab1fa90-b8eb-405d-803d-b9fd84939289" containerName="ovn-controller" probeResult="failure" output=< Dec 03 14:34:29 crc kubenswrapper[4751]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 03 14:34:29 crc kubenswrapper[4751]: > Dec 03 14:34:29 crc kubenswrapper[4751]: I1203 14:34:29.805554 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ea81b69-95de-4772-b7bf-d48f52c298b1-config-data\") pod \"glance-db-sync-hmq8c\" (UID: \"5ea81b69-95de-4772-b7bf-d48f52c298b1\") " pod="openstack/glance-db-sync-hmq8c" Dec 03 14:34:29 crc kubenswrapper[4751]: I1203 14:34:29.805687 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mbrx\" (UniqueName: \"kubernetes.io/projected/5ea81b69-95de-4772-b7bf-d48f52c298b1-kube-api-access-7mbrx\") pod \"glance-db-sync-hmq8c\" (UID: \"5ea81b69-95de-4772-b7bf-d48f52c298b1\") " pod="openstack/glance-db-sync-hmq8c" Dec 03 14:34:29 crc kubenswrapper[4751]: I1203 14:34:29.805727 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5ea81b69-95de-4772-b7bf-d48f52c298b1-db-sync-config-data\") pod \"glance-db-sync-hmq8c\" (UID: \"5ea81b69-95de-4772-b7bf-d48f52c298b1\") " 
pod="openstack/glance-db-sync-hmq8c" Dec 03 14:34:29 crc kubenswrapper[4751]: I1203 14:34:29.805778 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ea81b69-95de-4772-b7bf-d48f52c298b1-combined-ca-bundle\") pod \"glance-db-sync-hmq8c\" (UID: \"5ea81b69-95de-4772-b7bf-d48f52c298b1\") " pod="openstack/glance-db-sync-hmq8c" Dec 03 14:34:29 crc kubenswrapper[4751]: I1203 14:34:29.810254 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ea81b69-95de-4772-b7bf-d48f52c298b1-config-data\") pod \"glance-db-sync-hmq8c\" (UID: \"5ea81b69-95de-4772-b7bf-d48f52c298b1\") " pod="openstack/glance-db-sync-hmq8c" Dec 03 14:34:29 crc kubenswrapper[4751]: I1203 14:34:29.811067 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ea81b69-95de-4772-b7bf-d48f52c298b1-combined-ca-bundle\") pod \"glance-db-sync-hmq8c\" (UID: \"5ea81b69-95de-4772-b7bf-d48f52c298b1\") " pod="openstack/glance-db-sync-hmq8c" Dec 03 14:34:29 crc kubenswrapper[4751]: I1203 14:34:29.823791 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mbrx\" (UniqueName: \"kubernetes.io/projected/5ea81b69-95de-4772-b7bf-d48f52c298b1-kube-api-access-7mbrx\") pod \"glance-db-sync-hmq8c\" (UID: \"5ea81b69-95de-4772-b7bf-d48f52c298b1\") " pod="openstack/glance-db-sync-hmq8c" Dec 03 14:34:29 crc kubenswrapper[4751]: I1203 14:34:29.832056 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5ea81b69-95de-4772-b7bf-d48f52c298b1-db-sync-config-data\") pod \"glance-db-sync-hmq8c\" (UID: \"5ea81b69-95de-4772-b7bf-d48f52c298b1\") " pod="openstack/glance-db-sync-hmq8c" Dec 03 14:34:29 crc kubenswrapper[4751]: I1203 14:34:29.869153 4751 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-dzz9c" Dec 03 14:34:29 crc kubenswrapper[4751]: I1203 14:34:29.875533 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-dzz9c" Dec 03 14:34:29 crc kubenswrapper[4751]: I1203 14:34:29.902287 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hmq8c" Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.133453 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-lqzrd-config-28ztl"] Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.142288 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lqzrd-config-28ztl" Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.145731 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.154573 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lqzrd-config-28ztl"] Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.228409 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/52bd7e3e-0190-4154-96fc-4ca336745afd-var-log-ovn\") pod \"ovn-controller-lqzrd-config-28ztl\" (UID: \"52bd7e3e-0190-4154-96fc-4ca336745afd\") " pod="openstack/ovn-controller-lqzrd-config-28ztl" Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.228524 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/52bd7e3e-0190-4154-96fc-4ca336745afd-additional-scripts\") pod \"ovn-controller-lqzrd-config-28ztl\" (UID: \"52bd7e3e-0190-4154-96fc-4ca336745afd\") " pod="openstack/ovn-controller-lqzrd-config-28ztl" Dec 03 14:34:30 crc 
kubenswrapper[4751]: I1203 14:34:30.228563 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/52bd7e3e-0190-4154-96fc-4ca336745afd-var-run\") pod \"ovn-controller-lqzrd-config-28ztl\" (UID: \"52bd7e3e-0190-4154-96fc-4ca336745afd\") " pod="openstack/ovn-controller-lqzrd-config-28ztl" Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.228643 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdzj2\" (UniqueName: \"kubernetes.io/projected/52bd7e3e-0190-4154-96fc-4ca336745afd-kube-api-access-sdzj2\") pod \"ovn-controller-lqzrd-config-28ztl\" (UID: \"52bd7e3e-0190-4154-96fc-4ca336745afd\") " pod="openstack/ovn-controller-lqzrd-config-28ztl" Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.228819 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52bd7e3e-0190-4154-96fc-4ca336745afd-scripts\") pod \"ovn-controller-lqzrd-config-28ztl\" (UID: \"52bd7e3e-0190-4154-96fc-4ca336745afd\") " pod="openstack/ovn-controller-lqzrd-config-28ztl" Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.228848 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/52bd7e3e-0190-4154-96fc-4ca336745afd-var-run-ovn\") pod \"ovn-controller-lqzrd-config-28ztl\" (UID: \"52bd7e3e-0190-4154-96fc-4ca336745afd\") " pod="openstack/ovn-controller-lqzrd-config-28ztl" Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.330256 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/52bd7e3e-0190-4154-96fc-4ca336745afd-additional-scripts\") pod \"ovn-controller-lqzrd-config-28ztl\" (UID: \"52bd7e3e-0190-4154-96fc-4ca336745afd\") " 
pod="openstack/ovn-controller-lqzrd-config-28ztl" Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.330305 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/52bd7e3e-0190-4154-96fc-4ca336745afd-var-run\") pod \"ovn-controller-lqzrd-config-28ztl\" (UID: \"52bd7e3e-0190-4154-96fc-4ca336745afd\") " pod="openstack/ovn-controller-lqzrd-config-28ztl" Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.330354 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdzj2\" (UniqueName: \"kubernetes.io/projected/52bd7e3e-0190-4154-96fc-4ca336745afd-kube-api-access-sdzj2\") pod \"ovn-controller-lqzrd-config-28ztl\" (UID: \"52bd7e3e-0190-4154-96fc-4ca336745afd\") " pod="openstack/ovn-controller-lqzrd-config-28ztl" Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.330425 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52bd7e3e-0190-4154-96fc-4ca336745afd-scripts\") pod \"ovn-controller-lqzrd-config-28ztl\" (UID: \"52bd7e3e-0190-4154-96fc-4ca336745afd\") " pod="openstack/ovn-controller-lqzrd-config-28ztl" Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.330443 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/52bd7e3e-0190-4154-96fc-4ca336745afd-var-run-ovn\") pod \"ovn-controller-lqzrd-config-28ztl\" (UID: \"52bd7e3e-0190-4154-96fc-4ca336745afd\") " pod="openstack/ovn-controller-lqzrd-config-28ztl" Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.330502 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/52bd7e3e-0190-4154-96fc-4ca336745afd-var-log-ovn\") pod \"ovn-controller-lqzrd-config-28ztl\" (UID: \"52bd7e3e-0190-4154-96fc-4ca336745afd\") " 
pod="openstack/ovn-controller-lqzrd-config-28ztl" Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.330738 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/52bd7e3e-0190-4154-96fc-4ca336745afd-var-log-ovn\") pod \"ovn-controller-lqzrd-config-28ztl\" (UID: \"52bd7e3e-0190-4154-96fc-4ca336745afd\") " pod="openstack/ovn-controller-lqzrd-config-28ztl" Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.331396 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/52bd7e3e-0190-4154-96fc-4ca336745afd-additional-scripts\") pod \"ovn-controller-lqzrd-config-28ztl\" (UID: \"52bd7e3e-0190-4154-96fc-4ca336745afd\") " pod="openstack/ovn-controller-lqzrd-config-28ztl" Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.331445 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/52bd7e3e-0190-4154-96fc-4ca336745afd-var-run\") pod \"ovn-controller-lqzrd-config-28ztl\" (UID: \"52bd7e3e-0190-4154-96fc-4ca336745afd\") " pod="openstack/ovn-controller-lqzrd-config-28ztl" Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.333246 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52bd7e3e-0190-4154-96fc-4ca336745afd-scripts\") pod \"ovn-controller-lqzrd-config-28ztl\" (UID: \"52bd7e3e-0190-4154-96fc-4ca336745afd\") " pod="openstack/ovn-controller-lqzrd-config-28ztl" Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.333305 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/52bd7e3e-0190-4154-96fc-4ca336745afd-var-run-ovn\") pod \"ovn-controller-lqzrd-config-28ztl\" (UID: \"52bd7e3e-0190-4154-96fc-4ca336745afd\") " pod="openstack/ovn-controller-lqzrd-config-28ztl" Dec 03 14:34:30 crc 
kubenswrapper[4751]: I1203 14:34:30.369016 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdzj2\" (UniqueName: \"kubernetes.io/projected/52bd7e3e-0190-4154-96fc-4ca336745afd-kube-api-access-sdzj2\") pod \"ovn-controller-lqzrd-config-28ztl\" (UID: \"52bd7e3e-0190-4154-96fc-4ca336745afd\") " pod="openstack/ovn-controller-lqzrd-config-28ztl" Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.480598 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lqzrd-config-28ztl" Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.487283 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-68rzl" Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.533380 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlf99\" (UniqueName: \"kubernetes.io/projected/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-kube-api-access-dlf99\") pod \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\" (UID: \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\") " Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.533524 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-swiftconf\") pod \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\" (UID: \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\") " Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.533582 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-ring-data-devices\") pod \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\" (UID: \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\") " Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.533636 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" 
(UniqueName: \"kubernetes.io/secret/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-dispersionconf\") pod \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\" (UID: \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\") " Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.533712 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-etc-swift\") pod \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\" (UID: \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\") " Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.533743 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-scripts\") pod \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\" (UID: \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\") " Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.533823 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-combined-ca-bundle\") pod \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\" (UID: \"0fedaa81-0c36-44fa-ab7b-b712759fc8d4\") " Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.537511 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-hmq8c"] Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.537595 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-kube-api-access-dlf99" (OuterVolumeSpecName: "kube-api-access-dlf99") pod "0fedaa81-0c36-44fa-ab7b-b712759fc8d4" (UID: "0fedaa81-0c36-44fa-ab7b-b712759fc8d4"). InnerVolumeSpecName "kube-api-access-dlf99". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.538537 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "0fedaa81-0c36-44fa-ab7b-b712759fc8d4" (UID: "0fedaa81-0c36-44fa-ab7b-b712759fc8d4"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.539913 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0fedaa81-0c36-44fa-ab7b-b712759fc8d4" (UID: "0fedaa81-0c36-44fa-ab7b-b712759fc8d4"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.546548 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "0fedaa81-0c36-44fa-ab7b-b712759fc8d4" (UID: "0fedaa81-0c36-44fa-ab7b-b712759fc8d4"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:34:30 crc kubenswrapper[4751]: W1203 14:34:30.555630 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ea81b69_95de_4772_b7bf_d48f52c298b1.slice/crio-663c56efe52dcafd7e7c5c86c7359b14d7a213396e7c36f94fe5e5256e49e805 WatchSource:0}: Error finding container 663c56efe52dcafd7e7c5c86c7359b14d7a213396e7c36f94fe5e5256e49e805: Status 404 returned error can't find the container with id 663c56efe52dcafd7e7c5c86c7359b14d7a213396e7c36f94fe5e5256e49e805 Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.566851 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-scripts" (OuterVolumeSpecName: "scripts") pod "0fedaa81-0c36-44fa-ab7b-b712759fc8d4" (UID: "0fedaa81-0c36-44fa-ab7b-b712759fc8d4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.572356 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "0fedaa81-0c36-44fa-ab7b-b712759fc8d4" (UID: "0fedaa81-0c36-44fa-ab7b-b712759fc8d4"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.577106 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0fedaa81-0c36-44fa-ab7b-b712759fc8d4" (UID: "0fedaa81-0c36-44fa-ab7b-b712759fc8d4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.635722 4751 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.635752 4751 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.635765 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.635777 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.635789 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlf99\" (UniqueName: \"kubernetes.io/projected/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-kube-api-access-dlf99\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.635802 4751 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.635813 4751 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0fedaa81-0c36-44fa-ab7b-b712759fc8d4-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:30 crc kubenswrapper[4751]: W1203 14:34:30.964317 4751 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52bd7e3e_0190_4154_96fc_4ca336745afd.slice/crio-bd7ffb0b51976fbb20b288e8640a34cd4892e2775346af9c3d98c44e6e6d23eb WatchSource:0}: Error finding container bd7ffb0b51976fbb20b288e8640a34cd4892e2775346af9c3d98c44e6e6d23eb: Status 404 returned error can't find the container with id bd7ffb0b51976fbb20b288e8640a34cd4892e2775346af9c3d98c44e6e6d23eb Dec 03 14:34:30 crc kubenswrapper[4751]: I1203 14:34:30.971502 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lqzrd-config-28ztl"] Dec 03 14:34:31 crc kubenswrapper[4751]: I1203 14:34:31.171354 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hmq8c" event={"ID":"5ea81b69-95de-4772-b7bf-d48f52c298b1","Type":"ContainerStarted","Data":"663c56efe52dcafd7e7c5c86c7359b14d7a213396e7c36f94fe5e5256e49e805"} Dec 03 14:34:31 crc kubenswrapper[4751]: I1203 14:34:31.173484 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-68rzl" Dec 03 14:34:31 crc kubenswrapper[4751]: I1203 14:34:31.173487 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-68rzl" event={"ID":"0fedaa81-0c36-44fa-ab7b-b712759fc8d4","Type":"ContainerDied","Data":"bba03347b16c2e6e27147069d38f83722fff844b6b9bfff65ccc74fb5257979e"} Dec 03 14:34:31 crc kubenswrapper[4751]: I1203 14:34:31.173549 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bba03347b16c2e6e27147069d38f83722fff844b6b9bfff65ccc74fb5257979e" Dec 03 14:34:31 crc kubenswrapper[4751]: I1203 14:34:31.174967 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lqzrd-config-28ztl" event={"ID":"52bd7e3e-0190-4154-96fc-4ca336745afd","Type":"ContainerStarted","Data":"bd7ffb0b51976fbb20b288e8640a34cd4892e2775346af9c3d98c44e6e6d23eb"} Dec 03 14:34:32 crc kubenswrapper[4751]: I1203 14:34:32.185366 4751 generic.go:334] "Generic (PLEG): container finished" podID="52bd7e3e-0190-4154-96fc-4ca336745afd" containerID="249a102ad07502d29f0d959a39bdbf77b8cd09f8449eba408c5c841af432f912" exitCode=0 Dec 03 14:34:32 crc kubenswrapper[4751]: I1203 14:34:32.185721 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lqzrd-config-28ztl" event={"ID":"52bd7e3e-0190-4154-96fc-4ca336745afd","Type":"ContainerDied","Data":"249a102ad07502d29f0d959a39bdbf77b8cd09f8449eba408c5c841af432f912"} Dec 03 14:34:32 crc kubenswrapper[4751]: I1203 14:34:32.491432 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:33 crc kubenswrapper[4751]: I1203 14:34:33.195257 4751 generic.go:334] "Generic (PLEG): container finished" podID="47b63367-ad69-4428-9c79-8eee86b817ac" containerID="dc5a97df8731908a544fde7ee0074065ca6236a194a83bc37dc69e7162e2641e" exitCode=0 Dec 03 14:34:33 crc kubenswrapper[4751]: I1203 14:34:33.195369 4751 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"47b63367-ad69-4428-9c79-8eee86b817ac","Type":"ContainerDied","Data":"dc5a97df8731908a544fde7ee0074065ca6236a194a83bc37dc69e7162e2641e"} Dec 03 14:34:33 crc kubenswrapper[4751]: I1203 14:34:33.632205 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lqzrd-config-28ztl" Dec 03 14:34:33 crc kubenswrapper[4751]: I1203 14:34:33.690916 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/52bd7e3e-0190-4154-96fc-4ca336745afd-var-run\") pod \"52bd7e3e-0190-4154-96fc-4ca336745afd\" (UID: \"52bd7e3e-0190-4154-96fc-4ca336745afd\") " Dec 03 14:34:33 crc kubenswrapper[4751]: I1203 14:34:33.691251 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52bd7e3e-0190-4154-96fc-4ca336745afd-scripts\") pod \"52bd7e3e-0190-4154-96fc-4ca336745afd\" (UID: \"52bd7e3e-0190-4154-96fc-4ca336745afd\") " Dec 03 14:34:33 crc kubenswrapper[4751]: I1203 14:34:33.691110 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52bd7e3e-0190-4154-96fc-4ca336745afd-var-run" (OuterVolumeSpecName: "var-run") pod "52bd7e3e-0190-4154-96fc-4ca336745afd" (UID: "52bd7e3e-0190-4154-96fc-4ca336745afd"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:34:33 crc kubenswrapper[4751]: I1203 14:34:33.691316 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdzj2\" (UniqueName: \"kubernetes.io/projected/52bd7e3e-0190-4154-96fc-4ca336745afd-kube-api-access-sdzj2\") pod \"52bd7e3e-0190-4154-96fc-4ca336745afd\" (UID: \"52bd7e3e-0190-4154-96fc-4ca336745afd\") " Dec 03 14:34:33 crc kubenswrapper[4751]: I1203 14:34:33.691488 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/52bd7e3e-0190-4154-96fc-4ca336745afd-var-log-ovn\") pod \"52bd7e3e-0190-4154-96fc-4ca336745afd\" (UID: \"52bd7e3e-0190-4154-96fc-4ca336745afd\") " Dec 03 14:34:33 crc kubenswrapper[4751]: I1203 14:34:33.691509 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/52bd7e3e-0190-4154-96fc-4ca336745afd-var-run-ovn\") pod \"52bd7e3e-0190-4154-96fc-4ca336745afd\" (UID: \"52bd7e3e-0190-4154-96fc-4ca336745afd\") " Dec 03 14:34:33 crc kubenswrapper[4751]: I1203 14:34:33.691568 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/52bd7e3e-0190-4154-96fc-4ca336745afd-additional-scripts\") pod \"52bd7e3e-0190-4154-96fc-4ca336745afd\" (UID: \"52bd7e3e-0190-4154-96fc-4ca336745afd\") " Dec 03 14:34:33 crc kubenswrapper[4751]: I1203 14:34:33.691562 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52bd7e3e-0190-4154-96fc-4ca336745afd-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "52bd7e3e-0190-4154-96fc-4ca336745afd" (UID: "52bd7e3e-0190-4154-96fc-4ca336745afd"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:34:33 crc kubenswrapper[4751]: I1203 14:34:33.691625 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52bd7e3e-0190-4154-96fc-4ca336745afd-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "52bd7e3e-0190-4154-96fc-4ca336745afd" (UID: "52bd7e3e-0190-4154-96fc-4ca336745afd"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:34:33 crc kubenswrapper[4751]: I1203 14:34:33.691999 4751 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/52bd7e3e-0190-4154-96fc-4ca336745afd-var-run\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:33 crc kubenswrapper[4751]: I1203 14:34:33.692016 4751 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/52bd7e3e-0190-4154-96fc-4ca336745afd-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:33 crc kubenswrapper[4751]: I1203 14:34:33.692031 4751 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/52bd7e3e-0190-4154-96fc-4ca336745afd-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:33 crc kubenswrapper[4751]: I1203 14:34:33.692283 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52bd7e3e-0190-4154-96fc-4ca336745afd-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "52bd7e3e-0190-4154-96fc-4ca336745afd" (UID: "52bd7e3e-0190-4154-96fc-4ca336745afd"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:34:33 crc kubenswrapper[4751]: I1203 14:34:33.692539 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52bd7e3e-0190-4154-96fc-4ca336745afd-scripts" (OuterVolumeSpecName: "scripts") pod "52bd7e3e-0190-4154-96fc-4ca336745afd" (UID: "52bd7e3e-0190-4154-96fc-4ca336745afd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:34:33 crc kubenswrapper[4751]: I1203 14:34:33.696862 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52bd7e3e-0190-4154-96fc-4ca336745afd-kube-api-access-sdzj2" (OuterVolumeSpecName: "kube-api-access-sdzj2") pod "52bd7e3e-0190-4154-96fc-4ca336745afd" (UID: "52bd7e3e-0190-4154-96fc-4ca336745afd"). InnerVolumeSpecName "kube-api-access-sdzj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:34:33 crc kubenswrapper[4751]: I1203 14:34:33.794110 4751 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/52bd7e3e-0190-4154-96fc-4ca336745afd-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:33 crc kubenswrapper[4751]: I1203 14:34:33.794156 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52bd7e3e-0190-4154-96fc-4ca336745afd-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:33 crc kubenswrapper[4751]: I1203 14:34:33.794166 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdzj2\" (UniqueName: \"kubernetes.io/projected/52bd7e3e-0190-4154-96fc-4ca336745afd-kube-api-access-sdzj2\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:34 crc kubenswrapper[4751]: I1203 14:34:34.212499 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"47b63367-ad69-4428-9c79-8eee86b817ac","Type":"ContainerStarted","Data":"e31623ebc89c82e42314f91e2333489512ed54bb1026dab75b7fff94b80c1f8a"} Dec 03 14:34:34 crc kubenswrapper[4751]: I1203 14:34:34.212743 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 03 14:34:34 crc kubenswrapper[4751]: I1203 14:34:34.214848 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lqzrd-config-28ztl" event={"ID":"52bd7e3e-0190-4154-96fc-4ca336745afd","Type":"ContainerDied","Data":"bd7ffb0b51976fbb20b288e8640a34cd4892e2775346af9c3d98c44e6e6d23eb"} Dec 03 14:34:34 crc kubenswrapper[4751]: I1203 14:34:34.214897 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd7ffb0b51976fbb20b288e8640a34cd4892e2775346af9c3d98c44e6e6d23eb" Dec 03 14:34:34 crc kubenswrapper[4751]: I1203 14:34:34.214960 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lqzrd-config-28ztl" Dec 03 14:34:34 crc kubenswrapper[4751]: I1203 14:34:34.239964 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371941.614845 podStartE2EDuration="1m35.239930917s" podCreationTimestamp="2025-12-03 14:32:59 +0000 UTC" firstStartedPulling="2025-12-03 14:33:27.839978245 +0000 UTC m=+1214.828333462" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:34:34.237899741 +0000 UTC m=+1281.226254968" watchObservedRunningTime="2025-12-03 14:34:34.239930917 +0000 UTC m=+1281.228286134" Dec 03 14:34:34 crc kubenswrapper[4751]: I1203 14:34:34.745458 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-lqzrd-config-28ztl"] Dec 03 14:34:34 crc kubenswrapper[4751]: I1203 14:34:34.758351 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-lqzrd-config-28ztl"] Dec 03 14:34:34 
crc kubenswrapper[4751]: I1203 14:34:34.787934 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-lqzrd" Dec 03 14:34:35 crc kubenswrapper[4751]: I1203 14:34:35.332548 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52bd7e3e-0190-4154-96fc-4ca336745afd" path="/var/lib/kubelet/pods/52bd7e3e-0190-4154-96fc-4ca336745afd/volumes" Dec 03 14:34:35 crc kubenswrapper[4751]: I1203 14:34:35.738634 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="04053d51-dddf-43e3-a230-9ac729dec435" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 03 14:34:35 crc kubenswrapper[4751]: I1203 14:34:35.819494 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:34:35 crc kubenswrapper[4751]: I1203 14:34:35.819561 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:34:35 crc kubenswrapper[4751]: I1203 14:34:35.819603 4751 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-djf67" Dec 03 14:34:35 crc kubenswrapper[4751]: I1203 14:34:35.820307 4751 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"013b499465da11b00f7b510304fcaff215703026384eae17787f3651933e4e4f"} 
pod="openshift-machine-config-operator/machine-config-daemon-djf67" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 14:34:35 crc kubenswrapper[4751]: I1203 14:34:35.820387 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" containerID="cri-o://013b499465da11b00f7b510304fcaff215703026384eae17787f3651933e4e4f" gracePeriod=600 Dec 03 14:34:36 crc kubenswrapper[4751]: I1203 14:34:36.233149 4751 generic.go:334] "Generic (PLEG): container finished" podID="385620eb-744d-423e-b02b-1274f3075689" containerID="013b499465da11b00f7b510304fcaff215703026384eae17787f3651933e4e4f" exitCode=0 Dec 03 14:34:36 crc kubenswrapper[4751]: I1203 14:34:36.233388 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerDied","Data":"013b499465da11b00f7b510304fcaff215703026384eae17787f3651933e4e4f"} Dec 03 14:34:36 crc kubenswrapper[4751]: I1203 14:34:36.233494 4751 scope.go:117] "RemoveContainer" containerID="6554aa9d5e7898bf5e07fe04c4800b61a41046f1b56d94f87ff1d09b45063fa3" Dec 03 14:34:37 crc kubenswrapper[4751]: I1203 14:34:37.490764 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:37 crc kubenswrapper[4751]: I1203 14:34:37.493033 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:38 crc kubenswrapper[4751]: I1203 14:34:38.253285 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:39 crc kubenswrapper[4751]: I1203 14:34:39.398282 4751 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2-etc-swift\") pod \"swift-storage-0\" (UID: \"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2\") " pod="openstack/swift-storage-0" Dec 03 14:34:39 crc kubenswrapper[4751]: I1203 14:34:39.405073 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2-etc-swift\") pod \"swift-storage-0\" (UID: \"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2\") " pod="openstack/swift-storage-0" Dec 03 14:34:39 crc kubenswrapper[4751]: I1203 14:34:39.602765 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 03 14:34:41 crc kubenswrapper[4751]: I1203 14:34:41.145728 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 14:34:41 crc kubenswrapper[4751]: I1203 14:34:41.147570 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="9de5857f-8fe8-48e3-991b-7171fc510567" containerName="thanos-sidecar" containerID="cri-o://c9113d83ad6a943678f3e8746b6c9d9d26a4651e2c15e3aaa6751ba3662cde95" gracePeriod=600 Dec 03 14:34:41 crc kubenswrapper[4751]: I1203 14:34:41.147719 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="9de5857f-8fe8-48e3-991b-7171fc510567" containerName="prometheus" containerID="cri-o://798a25e0fbe45250e4064ec82ae98343730da2aa5dc7e19451e9af976b8235f2" gracePeriod=600 Dec 03 14:34:41 crc kubenswrapper[4751]: I1203 14:34:41.147734 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="9de5857f-8fe8-48e3-991b-7171fc510567" containerName="config-reloader" containerID="cri-o://91d1936becde3283c4719c7813914c6efc7a7a65473e740d04ffad967792d0bd" gracePeriod=600 Dec 03 
14:34:41 crc kubenswrapper[4751]: I1203 14:34:41.204538 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 03 14:34:41 crc kubenswrapper[4751]: I1203 14:34:41.217973 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:34:42 crc kubenswrapper[4751]: I1203 14:34:42.491569 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="9de5857f-8fe8-48e3-991b-7171fc510567" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.115:9090/-/ready\": dial tcp 10.217.0.115:9090: connect: connection refused" Dec 03 14:34:43 crc kubenswrapper[4751]: I1203 14:34:43.310008 4751 generic.go:334] "Generic (PLEG): container finished" podID="9de5857f-8fe8-48e3-991b-7171fc510567" containerID="c9113d83ad6a943678f3e8746b6c9d9d26a4651e2c15e3aaa6751ba3662cde95" exitCode=0 Dec 03 14:34:43 crc kubenswrapper[4751]: I1203 14:34:43.310034 4751 generic.go:334] "Generic (PLEG): container finished" podID="9de5857f-8fe8-48e3-991b-7171fc510567" containerID="91d1936becde3283c4719c7813914c6efc7a7a65473e740d04ffad967792d0bd" exitCode=0 Dec 03 14:34:43 crc kubenswrapper[4751]: I1203 14:34:43.310041 4751 generic.go:334] "Generic (PLEG): container finished" podID="9de5857f-8fe8-48e3-991b-7171fc510567" containerID="798a25e0fbe45250e4064ec82ae98343730da2aa5dc7e19451e9af976b8235f2" exitCode=0 Dec 03 14:34:43 crc kubenswrapper[4751]: I1203 14:34:43.310059 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9de5857f-8fe8-48e3-991b-7171fc510567","Type":"ContainerDied","Data":"c9113d83ad6a943678f3e8746b6c9d9d26a4651e2c15e3aaa6751ba3662cde95"} Dec 03 14:34:43 crc kubenswrapper[4751]: I1203 14:34:43.310082 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"9de5857f-8fe8-48e3-991b-7171fc510567","Type":"ContainerDied","Data":"91d1936becde3283c4719c7813914c6efc7a7a65473e740d04ffad967792d0bd"} Dec 03 14:34:43 crc kubenswrapper[4751]: I1203 14:34:43.310093 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9de5857f-8fe8-48e3-991b-7171fc510567","Type":"ContainerDied","Data":"798a25e0fbe45250e4064ec82ae98343730da2aa5dc7e19451e9af976b8235f2"} Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.108149 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.223656 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9jnj\" (UniqueName: \"kubernetes.io/projected/9de5857f-8fe8-48e3-991b-7171fc510567-kube-api-access-d9jnj\") pod \"9de5857f-8fe8-48e3-991b-7171fc510567\" (UID: \"9de5857f-8fe8-48e3-991b-7171fc510567\") " Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.223887 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15f06bf8-4444-4992-ae0b-8d3229a2a6c3\") pod \"9de5857f-8fe8-48e3-991b-7171fc510567\" (UID: \"9de5857f-8fe8-48e3-991b-7171fc510567\") " Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.224110 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9de5857f-8fe8-48e3-991b-7171fc510567-thanos-prometheus-http-client-file\") pod \"9de5857f-8fe8-48e3-991b-7171fc510567\" (UID: \"9de5857f-8fe8-48e3-991b-7171fc510567\") " Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.224171 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/9de5857f-8fe8-48e3-991b-7171fc510567-tls-assets\") pod \"9de5857f-8fe8-48e3-991b-7171fc510567\" (UID: \"9de5857f-8fe8-48e3-991b-7171fc510567\") " Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.224469 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9de5857f-8fe8-48e3-991b-7171fc510567-config\") pod \"9de5857f-8fe8-48e3-991b-7171fc510567\" (UID: \"9de5857f-8fe8-48e3-991b-7171fc510567\") " Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.224516 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9de5857f-8fe8-48e3-991b-7171fc510567-prometheus-metric-storage-rulefiles-0\") pod \"9de5857f-8fe8-48e3-991b-7171fc510567\" (UID: \"9de5857f-8fe8-48e3-991b-7171fc510567\") " Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.224544 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9de5857f-8fe8-48e3-991b-7171fc510567-web-config\") pod \"9de5857f-8fe8-48e3-991b-7171fc510567\" (UID: \"9de5857f-8fe8-48e3-991b-7171fc510567\") " Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.224586 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9de5857f-8fe8-48e3-991b-7171fc510567-config-out\") pod \"9de5857f-8fe8-48e3-991b-7171fc510567\" (UID: \"9de5857f-8fe8-48e3-991b-7171fc510567\") " Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.229764 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9de5857f-8fe8-48e3-991b-7171fc510567-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "9de5857f-8fe8-48e3-991b-7171fc510567" (UID: "9de5857f-8fe8-48e3-991b-7171fc510567"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.232028 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9de5857f-8fe8-48e3-991b-7171fc510567-kube-api-access-d9jnj" (OuterVolumeSpecName: "kube-api-access-d9jnj") pod "9de5857f-8fe8-48e3-991b-7171fc510567" (UID: "9de5857f-8fe8-48e3-991b-7171fc510567"). InnerVolumeSpecName "kube-api-access-d9jnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.235174 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9de5857f-8fe8-48e3-991b-7171fc510567-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "9de5857f-8fe8-48e3-991b-7171fc510567" (UID: "9de5857f-8fe8-48e3-991b-7171fc510567"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.235770 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9de5857f-8fe8-48e3-991b-7171fc510567-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "9de5857f-8fe8-48e3-991b-7171fc510567" (UID: "9de5857f-8fe8-48e3-991b-7171fc510567"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.237275 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9de5857f-8fe8-48e3-991b-7171fc510567-config-out" (OuterVolumeSpecName: "config-out") pod "9de5857f-8fe8-48e3-991b-7171fc510567" (UID: "9de5857f-8fe8-48e3-991b-7171fc510567"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.240589 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9de5857f-8fe8-48e3-991b-7171fc510567-config" (OuterVolumeSpecName: "config") pod "9de5857f-8fe8-48e3-991b-7171fc510567" (UID: "9de5857f-8fe8-48e3-991b-7171fc510567"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.264550 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9de5857f-8fe8-48e3-991b-7171fc510567-web-config" (OuterVolumeSpecName: "web-config") pod "9de5857f-8fe8-48e3-991b-7171fc510567" (UID: "9de5857f-8fe8-48e3-991b-7171fc510567"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.283003 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15f06bf8-4444-4992-ae0b-8d3229a2a6c3" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "9de5857f-8fe8-48e3-991b-7171fc510567" (UID: "9de5857f-8fe8-48e3-991b-7171fc510567"). InnerVolumeSpecName "pvc-15f06bf8-4444-4992-ae0b-8d3229a2a6c3". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.327169 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9de5857f-8fe8-48e3-991b-7171fc510567-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.327207 4751 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9de5857f-8fe8-48e3-991b-7171fc510567-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.327221 4751 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9de5857f-8fe8-48e3-991b-7171fc510567-web-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.327233 4751 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9de5857f-8fe8-48e3-991b-7171fc510567-config-out\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.327246 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9jnj\" (UniqueName: \"kubernetes.io/projected/9de5857f-8fe8-48e3-991b-7171fc510567-kube-api-access-d9jnj\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.327283 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-15f06bf8-4444-4992-ae0b-8d3229a2a6c3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15f06bf8-4444-4992-ae0b-8d3229a2a6c3\") on node \"crc\" " Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.327298 4751 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/9de5857f-8fe8-48e3-991b-7171fc510567-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.327309 4751 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9de5857f-8fe8-48e3-991b-7171fc510567-tls-assets\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.335199 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9de5857f-8fe8-48e3-991b-7171fc510567","Type":"ContainerDied","Data":"335fa198823f6346aec8e3791f5248ac12a0eafe2f8d7c0122d9dc8c51cf8839"} Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.335220 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.335384 4751 scope.go:117] "RemoveContainer" containerID="c9113d83ad6a943678f3e8746b6c9d9d26a4651e2c15e3aaa6751ba3662cde95" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.339964 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerStarted","Data":"8513ef227e39ef06a8d05cad17c9635fc3ec8cf5ec5acd20288a621754b77ca6"} Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.363321 4751 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.364369 4751 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-15f06bf8-4444-4992-ae0b-8d3229a2a6c3" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15f06bf8-4444-4992-ae0b-8d3229a2a6c3") on node "crc" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.385141 4751 scope.go:117] "RemoveContainer" containerID="91d1936becde3283c4719c7813914c6efc7a7a65473e740d04ffad967792d0bd" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.394377 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.416919 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.431609 4751 reconciler_common.go:293] "Volume detached for volume \"pvc-15f06bf8-4444-4992-ae0b-8d3229a2a6c3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15f06bf8-4444-4992-ae0b-8d3229a2a6c3\") on node \"crc\" DevicePath \"\"" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.451698 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 14:34:45 crc kubenswrapper[4751]: E1203 14:34:45.452062 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9de5857f-8fe8-48e3-991b-7171fc510567" containerName="init-config-reloader" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.452080 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="9de5857f-8fe8-48e3-991b-7171fc510567" containerName="init-config-reloader" Dec 03 14:34:45 crc kubenswrapper[4751]: E1203 14:34:45.452091 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52bd7e3e-0190-4154-96fc-4ca336745afd" containerName="ovn-config" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.452097 4751 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="52bd7e3e-0190-4154-96fc-4ca336745afd" containerName="ovn-config" Dec 03 14:34:45 crc kubenswrapper[4751]: E1203 14:34:45.452108 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9de5857f-8fe8-48e3-991b-7171fc510567" containerName="prometheus" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.452114 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="9de5857f-8fe8-48e3-991b-7171fc510567" containerName="prometheus" Dec 03 14:34:45 crc kubenswrapper[4751]: E1203 14:34:45.452128 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9de5857f-8fe8-48e3-991b-7171fc510567" containerName="config-reloader" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.452133 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="9de5857f-8fe8-48e3-991b-7171fc510567" containerName="config-reloader" Dec 03 14:34:45 crc kubenswrapper[4751]: E1203 14:34:45.452154 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9de5857f-8fe8-48e3-991b-7171fc510567" containerName="thanos-sidecar" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.452160 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="9de5857f-8fe8-48e3-991b-7171fc510567" containerName="thanos-sidecar" Dec 03 14:34:45 crc kubenswrapper[4751]: E1203 14:34:45.452170 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fedaa81-0c36-44fa-ab7b-b712759fc8d4" containerName="swift-ring-rebalance" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.452176 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fedaa81-0c36-44fa-ab7b-b712759fc8d4" containerName="swift-ring-rebalance" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.456421 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fedaa81-0c36-44fa-ab7b-b712759fc8d4" containerName="swift-ring-rebalance" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.456474 4751 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="52bd7e3e-0190-4154-96fc-4ca336745afd" containerName="ovn-config" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.456498 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="9de5857f-8fe8-48e3-991b-7171fc510567" containerName="thanos-sidecar" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.456506 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="9de5857f-8fe8-48e3-991b-7171fc510567" containerName="config-reloader" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.456521 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="9de5857f-8fe8-48e3-991b-7171fc510567" containerName="prometheus" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.458406 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.463340 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-8skrs" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.463551 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.463710 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.463849 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.463992 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.464119 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 03 14:34:45 crc 
kubenswrapper[4751]: I1203 14:34:45.483597 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.497914 4751 scope.go:117] "RemoveContainer" containerID="798a25e0fbe45250e4064ec82ae98343730da2aa5dc7e19451e9af976b8235f2" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.536203 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b40b7285-42c6-4278-8d86-69847e549907-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b40b7285-42c6-4278-8d86-69847e549907\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.536266 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b40b7285-42c6-4278-8d86-69847e549907-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"b40b7285-42c6-4278-8d86-69847e549907\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.536314 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b40b7285-42c6-4278-8d86-69847e549907-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"b40b7285-42c6-4278-8d86-69847e549907\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.536387 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b40b7285-42c6-4278-8d86-69847e549907-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: 
\"b40b7285-42c6-4278-8d86-69847e549907\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.536419 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b40b7285-42c6-4278-8d86-69847e549907-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"b40b7285-42c6-4278-8d86-69847e549907\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.536437 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b40b7285-42c6-4278-8d86-69847e549907-config\") pod \"prometheus-metric-storage-0\" (UID: \"b40b7285-42c6-4278-8d86-69847e549907\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.536461 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b40b7285-42c6-4278-8d86-69847e549907-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"b40b7285-42c6-4278-8d86-69847e549907\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.536487 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gc95\" (UniqueName: \"kubernetes.io/projected/b40b7285-42c6-4278-8d86-69847e549907-kube-api-access-2gc95\") pod \"prometheus-metric-storage-0\" (UID: \"b40b7285-42c6-4278-8d86-69847e549907\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.536510 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/b40b7285-42c6-4278-8d86-69847e549907-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b40b7285-42c6-4278-8d86-69847e549907\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.536550 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-15f06bf8-4444-4992-ae0b-8d3229a2a6c3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15f06bf8-4444-4992-ae0b-8d3229a2a6c3\") pod \"prometheus-metric-storage-0\" (UID: \"b40b7285-42c6-4278-8d86-69847e549907\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.536621 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b40b7285-42c6-4278-8d86-69847e549907-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"b40b7285-42c6-4278-8d86-69847e549907\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.569499 4751 scope.go:117] "RemoveContainer" containerID="a8f3c53b4b5ec7dd11766cf6c91eb7017eb00024f50e0bb96b916819e86aea84" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.594135 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.637861 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gc95\" (UniqueName: \"kubernetes.io/projected/b40b7285-42c6-4278-8d86-69847e549907-kube-api-access-2gc95\") pod \"prometheus-metric-storage-0\" (UID: \"b40b7285-42c6-4278-8d86-69847e549907\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.637930 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b40b7285-42c6-4278-8d86-69847e549907-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b40b7285-42c6-4278-8d86-69847e549907\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.637989 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-15f06bf8-4444-4992-ae0b-8d3229a2a6c3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15f06bf8-4444-4992-ae0b-8d3229a2a6c3\") pod \"prometheus-metric-storage-0\" (UID: \"b40b7285-42c6-4278-8d86-69847e549907\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.638084 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b40b7285-42c6-4278-8d86-69847e549907-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"b40b7285-42c6-4278-8d86-69847e549907\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.638184 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b40b7285-42c6-4278-8d86-69847e549907-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b40b7285-42c6-4278-8d86-69847e549907\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.638215 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b40b7285-42c6-4278-8d86-69847e549907-config-out\") pod 
\"prometheus-metric-storage-0\" (UID: \"b40b7285-42c6-4278-8d86-69847e549907\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.638248 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b40b7285-42c6-4278-8d86-69847e549907-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"b40b7285-42c6-4278-8d86-69847e549907\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.638306 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b40b7285-42c6-4278-8d86-69847e549907-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"b40b7285-42c6-4278-8d86-69847e549907\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.638363 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b40b7285-42c6-4278-8d86-69847e549907-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"b40b7285-42c6-4278-8d86-69847e549907\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.638393 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b40b7285-42c6-4278-8d86-69847e549907-config\") pod \"prometheus-metric-storage-0\" (UID: \"b40b7285-42c6-4278-8d86-69847e549907\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.638422 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b40b7285-42c6-4278-8d86-69847e549907-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: 
\"b40b7285-42c6-4278-8d86-69847e549907\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.639489 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b40b7285-42c6-4278-8d86-69847e549907-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"b40b7285-42c6-4278-8d86-69847e549907\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.658819 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b40b7285-42c6-4278-8d86-69847e549907-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"b40b7285-42c6-4278-8d86-69847e549907\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.661751 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b40b7285-42c6-4278-8d86-69847e549907-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"b40b7285-42c6-4278-8d86-69847e549907\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.662652 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b40b7285-42c6-4278-8d86-69847e549907-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b40b7285-42c6-4278-8d86-69847e549907\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.663013 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b40b7285-42c6-4278-8d86-69847e549907-config\") pod 
\"prometheus-metric-storage-0\" (UID: \"b40b7285-42c6-4278-8d86-69847e549907\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.668462 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b40b7285-42c6-4278-8d86-69847e549907-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b40b7285-42c6-4278-8d86-69847e549907\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.669032 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b40b7285-42c6-4278-8d86-69847e549907-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"b40b7285-42c6-4278-8d86-69847e549907\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.669558 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b40b7285-42c6-4278-8d86-69847e549907-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"b40b7285-42c6-4278-8d86-69847e549907\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.679968 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b40b7285-42c6-4278-8d86-69847e549907-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"b40b7285-42c6-4278-8d86-69847e549907\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.689950 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gc95\" (UniqueName: 
\"kubernetes.io/projected/b40b7285-42c6-4278-8d86-69847e549907-kube-api-access-2gc95\") pod \"prometheus-metric-storage-0\" (UID: \"b40b7285-42c6-4278-8d86-69847e549907\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.712029 4751 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.712077 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-15f06bf8-4444-4992-ae0b-8d3229a2a6c3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15f06bf8-4444-4992-ae0b-8d3229a2a6c3\") pod \"prometheus-metric-storage-0\" (UID: \"b40b7285-42c6-4278-8d86-69847e549907\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cb164b736cb7b8b7ab8aad6339ced870201e734e0ebc8bfeab076a5d319160df/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.743096 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.813671 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 03 14:34:45 crc kubenswrapper[4751]: I1203 14:34:45.839107 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-15f06bf8-4444-4992-ae0b-8d3229a2a6c3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15f06bf8-4444-4992-ae0b-8d3229a2a6c3\") pod \"prometheus-metric-storage-0\" (UID: \"b40b7285-42c6-4278-8d86-69847e549907\") " pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:46 crc kubenswrapper[4751]: I1203 14:34:46.083681 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 14:34:46 crc kubenswrapper[4751]: I1203 14:34:46.349544 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2","Type":"ContainerStarted","Data":"0f95089ce17e563e3c9b6cbfcd0a50c3a30cdde34a0767957ddd50185c6123b5"} Dec 03 14:34:46 crc kubenswrapper[4751]: I1203 14:34:46.361471 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hmq8c" event={"ID":"5ea81b69-95de-4772-b7bf-d48f52c298b1","Type":"ContainerStarted","Data":"e1746dd96f8c08d0da13592803ced15a8ff13a0de0f9381bab3eb3d05be54272"} Dec 03 14:34:46 crc kubenswrapper[4751]: I1203 14:34:46.392993 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-hmq8c" podStartSLOduration=2.879756519 podStartE2EDuration="17.39297111s" podCreationTimestamp="2025-12-03 14:34:29 +0000 UTC" firstStartedPulling="2025-12-03 14:34:30.576916537 +0000 UTC m=+1277.565271764" lastFinishedPulling="2025-12-03 14:34:45.090131138 +0000 UTC m=+1292.078486355" observedRunningTime="2025-12-03 14:34:46.380882957 +0000 UTC m=+1293.369238174" watchObservedRunningTime="2025-12-03 14:34:46.39297111 +0000 UTC m=+1293.381326337" Dec 03 14:34:46 crc kubenswrapper[4751]: I1203 14:34:46.568430 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 14:34:47 crc kubenswrapper[4751]: I1203 14:34:47.354218 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9de5857f-8fe8-48e3-991b-7171fc510567" path="/var/lib/kubelet/pods/9de5857f-8fe8-48e3-991b-7171fc510567/volumes" Dec 03 14:34:47 crc kubenswrapper[4751]: I1203 14:34:47.373636 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"b40b7285-42c6-4278-8d86-69847e549907","Type":"ContainerStarted","Data":"b7ecda69dc9d2da7fef30c264391a548a212851f4cc63c3b985daa5361a38b18"} Dec 03 14:34:47 crc kubenswrapper[4751]: I1203 14:34:47.381280 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2","Type":"ContainerStarted","Data":"d18fcc096c6ec4e9bf8d195de5406cc08c479f0ab936982ed3707ddf6d42e2bc"} Dec 03 14:34:47 crc kubenswrapper[4751]: I1203 14:34:47.381335 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2","Type":"ContainerStarted","Data":"6a7f5bf97745fa79e4c3c62a307ca97bc806ad5b3bcb0a737ca9b9a56f80cd3c"} Dec 03 14:34:48 crc kubenswrapper[4751]: I1203 14:34:48.390032 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2","Type":"ContainerStarted","Data":"677deb5b8225bee2440fc6037e1a2d7db5e784bd208256487c7ec0190802fb4b"} Dec 03 14:34:48 crc kubenswrapper[4751]: I1203 14:34:48.390539 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2","Type":"ContainerStarted","Data":"615809f191bf6b427ff3df989417b1ca949056f1932a8650c2e9247b602af694"} Dec 03 14:34:49 crc kubenswrapper[4751]: I1203 14:34:49.398544 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b40b7285-42c6-4278-8d86-69847e549907","Type":"ContainerStarted","Data":"91f9c3839b6719e4426187f8eabef95088cde5c9602e6dd77586ef9ff9630a56"} Dec 03 14:34:49 crc kubenswrapper[4751]: I1203 14:34:49.403344 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2","Type":"ContainerStarted","Data":"1ee610b4a48d37a525f92d19134703b32436358d40ad4259aca8415584975584"} Dec 03 14:34:50 crc 
kubenswrapper[4751]: I1203 14:34:50.415477 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2","Type":"ContainerStarted","Data":"bb35d96dd390841269d33e1f3ef7a8104050e04b89d4854f287f6ec4b8e6f694"} Dec 03 14:34:50 crc kubenswrapper[4751]: I1203 14:34:50.415809 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2","Type":"ContainerStarted","Data":"988213b752a66bae29f81b2dc909ddf19a48a4037667712e4a4217b3c165591b"} Dec 03 14:34:50 crc kubenswrapper[4751]: I1203 14:34:50.415831 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2","Type":"ContainerStarted","Data":"f2cc0159fd4cff6fa468ab73a8f1412cccf63ed17c65569e0d7955ab0bce1137"} Dec 03 14:34:50 crc kubenswrapper[4751]: I1203 14:34:50.855534 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.151111 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-xtvzh"] Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.152753 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-xtvzh" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.171985 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-xtvzh"] Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.244927 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spd9l\" (UniqueName: \"kubernetes.io/projected/c838d53c-d56e-4cfc-a15a-fb91dccb5dbd-kube-api-access-spd9l\") pod \"barbican-db-create-xtvzh\" (UID: \"c838d53c-d56e-4cfc-a15a-fb91dccb5dbd\") " pod="openstack/barbican-db-create-xtvzh" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.244996 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c838d53c-d56e-4cfc-a15a-fb91dccb5dbd-operator-scripts\") pod \"barbican-db-create-xtvzh\" (UID: \"c838d53c-d56e-4cfc-a15a-fb91dccb5dbd\") " pod="openstack/barbican-db-create-xtvzh" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.346148 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spd9l\" (UniqueName: \"kubernetes.io/projected/c838d53c-d56e-4cfc-a15a-fb91dccb5dbd-kube-api-access-spd9l\") pod \"barbican-db-create-xtvzh\" (UID: \"c838d53c-d56e-4cfc-a15a-fb91dccb5dbd\") " pod="openstack/barbican-db-create-xtvzh" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.346217 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c838d53c-d56e-4cfc-a15a-fb91dccb5dbd-operator-scripts\") pod \"barbican-db-create-xtvzh\" (UID: \"c838d53c-d56e-4cfc-a15a-fb91dccb5dbd\") " pod="openstack/barbican-db-create-xtvzh" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.349211 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c838d53c-d56e-4cfc-a15a-fb91dccb5dbd-operator-scripts\") pod \"barbican-db-create-xtvzh\" (UID: \"c838d53c-d56e-4cfc-a15a-fb91dccb5dbd\") " pod="openstack/barbican-db-create-xtvzh" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.374815 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-ea6b-account-create-update-rslqt"] Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.376166 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ea6b-account-create-update-rslqt" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.378445 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.386350 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spd9l\" (UniqueName: \"kubernetes.io/projected/c838d53c-d56e-4cfc-a15a-fb91dccb5dbd-kube-api-access-spd9l\") pod \"barbican-db-create-xtvzh\" (UID: \"c838d53c-d56e-4cfc-a15a-fb91dccb5dbd\") " pod="openstack/barbican-db-create-xtvzh" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.388917 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-ea6b-account-create-update-rslqt"] Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.454018 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-5j7m7"] Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.456026 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-5j7m7" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.479206 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-5j7m7"] Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.496753 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-xtvzh" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.542474 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-cgd8h"] Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.543813 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-cgd8h" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.546341 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2bmt6" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.546600 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.546959 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.547232 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.554820 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0-operator-scripts\") pod \"cinder-db-create-5j7m7\" (UID: \"a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0\") " pod="openstack/cinder-db-create-5j7m7" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.554871 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5d2t\" (UniqueName: \"kubernetes.io/projected/dc533900-0d24-4473-83fe-7653f335a1a9-kube-api-access-j5d2t\") pod \"barbican-ea6b-account-create-update-rslqt\" (UID: \"dc533900-0d24-4473-83fe-7653f335a1a9\") " pod="openstack/barbican-ea6b-account-create-update-rslqt" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.554902 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc533900-0d24-4473-83fe-7653f335a1a9-operator-scripts\") pod \"barbican-ea6b-account-create-update-rslqt\" (UID: \"dc533900-0d24-4473-83fe-7653f335a1a9\") " pod="openstack/barbican-ea6b-account-create-update-rslqt" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.554939 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sdpj\" (UniqueName: \"kubernetes.io/projected/a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0-kube-api-access-8sdpj\") pod \"cinder-db-create-5j7m7\" (UID: \"a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0\") " pod="openstack/cinder-db-create-5j7m7" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.566715 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-cgd8h"] Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.644721 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-create-6n49f"] Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.645980 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-create-6n49f" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.662124 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/124c2c16-935c-4bfc-9b53-11ffa90ed441-config-data\") pod \"keystone-db-sync-cgd8h\" (UID: \"124c2c16-935c-4bfc-9b53-11ffa90ed441\") " pod="openstack/keystone-db-sync-cgd8h" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.662165 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-96fc-account-create-update-4nd9x"] Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.662180 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86kms\" (UniqueName: \"kubernetes.io/projected/124c2c16-935c-4bfc-9b53-11ffa90ed441-kube-api-access-86kms\") pod \"keystone-db-sync-cgd8h\" (UID: \"124c2c16-935c-4bfc-9b53-11ffa90ed441\") " pod="openstack/keystone-db-sync-cgd8h" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.662451 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0-operator-scripts\") pod \"cinder-db-create-5j7m7\" (UID: \"a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0\") " pod="openstack/cinder-db-create-5j7m7" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.662512 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5d2t\" (UniqueName: \"kubernetes.io/projected/dc533900-0d24-4473-83fe-7653f335a1a9-kube-api-access-j5d2t\") pod \"barbican-ea6b-account-create-update-rslqt\" (UID: \"dc533900-0d24-4473-83fe-7653f335a1a9\") " pod="openstack/barbican-ea6b-account-create-update-rslqt" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.662579 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc533900-0d24-4473-83fe-7653f335a1a9-operator-scripts\") pod \"barbican-ea6b-account-create-update-rslqt\" (UID: \"dc533900-0d24-4473-83fe-7653f335a1a9\") " pod="openstack/barbican-ea6b-account-create-update-rslqt" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.662626 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/124c2c16-935c-4bfc-9b53-11ffa90ed441-combined-ca-bundle\") pod \"keystone-db-sync-cgd8h\" (UID: \"124c2c16-935c-4bfc-9b53-11ffa90ed441\") " pod="openstack/keystone-db-sync-cgd8h" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.662667 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sdpj\" (UniqueName: \"kubernetes.io/projected/a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0-kube-api-access-8sdpj\") pod \"cinder-db-create-5j7m7\" (UID: \"a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0\") " pod="openstack/cinder-db-create-5j7m7" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.663461 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-96fc-account-create-update-4nd9x" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.663692 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0-operator-scripts\") pod \"cinder-db-create-5j7m7\" (UID: \"a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0\") " pod="openstack/cinder-db-create-5j7m7" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.664140 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc533900-0d24-4473-83fe-7653f335a1a9-operator-scripts\") pod \"barbican-ea6b-account-create-update-rslqt\" (UID: \"dc533900-0d24-4473-83fe-7653f335a1a9\") " pod="openstack/barbican-ea6b-account-create-update-rslqt" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.672620 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-db-secret" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.679923 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-6n49f"] Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.695473 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-96fc-account-create-update-4nd9x"] Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.704225 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5d2t\" (UniqueName: \"kubernetes.io/projected/dc533900-0d24-4473-83fe-7653f335a1a9-kube-api-access-j5d2t\") pod \"barbican-ea6b-account-create-update-rslqt\" (UID: \"dc533900-0d24-4473-83fe-7653f335a1a9\") " pod="openstack/barbican-ea6b-account-create-update-rslqt" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.710140 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sdpj\" (UniqueName: 
\"kubernetes.io/projected/a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0-kube-api-access-8sdpj\") pod \"cinder-db-create-5j7m7\" (UID: \"a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0\") " pod="openstack/cinder-db-create-5j7m7" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.759126 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ea6b-account-create-update-rslqt" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.763929 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2fa3dba-8033-4b02-b75b-6052face2364-operator-scripts\") pod \"cloudkitty-db-create-6n49f\" (UID: \"a2fa3dba-8033-4b02-b75b-6052face2364\") " pod="openstack/cloudkitty-db-create-6n49f" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.764011 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/124c2c16-935c-4bfc-9b53-11ffa90ed441-combined-ca-bundle\") pod \"keystone-db-sync-cgd8h\" (UID: \"124c2c16-935c-4bfc-9b53-11ffa90ed441\") " pod="openstack/keystone-db-sync-cgd8h" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.764019 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-f6af-account-create-update-zqlks"] Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.764041 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cea54721-c31e-48c1-93fd-ba0b9efb5189-operator-scripts\") pod \"cloudkitty-96fc-account-create-update-4nd9x\" (UID: \"cea54721-c31e-48c1-93fd-ba0b9efb5189\") " pod="openstack/cloudkitty-96fc-account-create-update-4nd9x" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.764100 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swfnd\" 
(UniqueName: \"kubernetes.io/projected/a2fa3dba-8033-4b02-b75b-6052face2364-kube-api-access-swfnd\") pod \"cloudkitty-db-create-6n49f\" (UID: \"a2fa3dba-8033-4b02-b75b-6052face2364\") " pod="openstack/cloudkitty-db-create-6n49f" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.764159 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/124c2c16-935c-4bfc-9b53-11ffa90ed441-config-data\") pod \"keystone-db-sync-cgd8h\" (UID: \"124c2c16-935c-4bfc-9b53-11ffa90ed441\") " pod="openstack/keystone-db-sync-cgd8h" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.764194 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86kms\" (UniqueName: \"kubernetes.io/projected/124c2c16-935c-4bfc-9b53-11ffa90ed441-kube-api-access-86kms\") pod \"keystone-db-sync-cgd8h\" (UID: \"124c2c16-935c-4bfc-9b53-11ffa90ed441\") " pod="openstack/keystone-db-sync-cgd8h" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.764214 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85cpl\" (UniqueName: \"kubernetes.io/projected/cea54721-c31e-48c1-93fd-ba0b9efb5189-kube-api-access-85cpl\") pod \"cloudkitty-96fc-account-create-update-4nd9x\" (UID: \"cea54721-c31e-48c1-93fd-ba0b9efb5189\") " pod="openstack/cloudkitty-96fc-account-create-update-4nd9x" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.765294 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-f6af-account-create-update-zqlks" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.769014 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/124c2c16-935c-4bfc-9b53-11ffa90ed441-combined-ca-bundle\") pod \"keystone-db-sync-cgd8h\" (UID: \"124c2c16-935c-4bfc-9b53-11ffa90ed441\") " pod="openstack/keystone-db-sync-cgd8h" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.770377 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.770844 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/124c2c16-935c-4bfc-9b53-11ffa90ed441-config-data\") pod \"keystone-db-sync-cgd8h\" (UID: \"124c2c16-935c-4bfc-9b53-11ffa90ed441\") " pod="openstack/keystone-db-sync-cgd8h" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.775458 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-5j7m7" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.785770 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f6af-account-create-update-zqlks"] Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.795470 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86kms\" (UniqueName: \"kubernetes.io/projected/124c2c16-935c-4bfc-9b53-11ffa90ed441-kube-api-access-86kms\") pod \"keystone-db-sync-cgd8h\" (UID: \"124c2c16-935c-4bfc-9b53-11ffa90ed441\") " pod="openstack/keystone-db-sync-cgd8h" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.866238 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-cgd8h" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.867017 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85cpl\" (UniqueName: \"kubernetes.io/projected/cea54721-c31e-48c1-93fd-ba0b9efb5189-kube-api-access-85cpl\") pod \"cloudkitty-96fc-account-create-update-4nd9x\" (UID: \"cea54721-c31e-48c1-93fd-ba0b9efb5189\") " pod="openstack/cloudkitty-96fc-account-create-update-4nd9x" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.867090 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2fa3dba-8033-4b02-b75b-6052face2364-operator-scripts\") pod \"cloudkitty-db-create-6n49f\" (UID: \"a2fa3dba-8033-4b02-b75b-6052face2364\") " pod="openstack/cloudkitty-db-create-6n49f" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.867156 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cea54721-c31e-48c1-93fd-ba0b9efb5189-operator-scripts\") pod \"cloudkitty-96fc-account-create-update-4nd9x\" (UID: \"cea54721-c31e-48c1-93fd-ba0b9efb5189\") " pod="openstack/cloudkitty-96fc-account-create-update-4nd9x" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.867225 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swfnd\" (UniqueName: \"kubernetes.io/projected/a2fa3dba-8033-4b02-b75b-6052face2364-kube-api-access-swfnd\") pod \"cloudkitty-db-create-6n49f\" (UID: \"a2fa3dba-8033-4b02-b75b-6052face2364\") " pod="openstack/cloudkitty-db-create-6n49f" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.867258 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a65bb3b2-f470-47b3-9031-35a6f9dbc930-operator-scripts\") pod 
\"cinder-f6af-account-create-update-zqlks\" (UID: \"a65bb3b2-f470-47b3-9031-35a6f9dbc930\") " pod="openstack/cinder-f6af-account-create-update-zqlks" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.867358 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlzsm\" (UniqueName: \"kubernetes.io/projected/a65bb3b2-f470-47b3-9031-35a6f9dbc930-kube-api-access-dlzsm\") pod \"cinder-f6af-account-create-update-zqlks\" (UID: \"a65bb3b2-f470-47b3-9031-35a6f9dbc930\") " pod="openstack/cinder-f6af-account-create-update-zqlks" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.872731 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cea54721-c31e-48c1-93fd-ba0b9efb5189-operator-scripts\") pod \"cloudkitty-96fc-account-create-update-4nd9x\" (UID: \"cea54721-c31e-48c1-93fd-ba0b9efb5189\") " pod="openstack/cloudkitty-96fc-account-create-update-4nd9x" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.881898 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2fa3dba-8033-4b02-b75b-6052face2364-operator-scripts\") pod \"cloudkitty-db-create-6n49f\" (UID: \"a2fa3dba-8033-4b02-b75b-6052face2364\") " pod="openstack/cloudkitty-db-create-6n49f" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.889072 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-wbt2f"] Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.890576 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-wbt2f" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.897624 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swfnd\" (UniqueName: \"kubernetes.io/projected/a2fa3dba-8033-4b02-b75b-6052face2364-kube-api-access-swfnd\") pod \"cloudkitty-db-create-6n49f\" (UID: \"a2fa3dba-8033-4b02-b75b-6052face2364\") " pod="openstack/cloudkitty-db-create-6n49f" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.903501 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-bd1d-account-create-update-b5krp"] Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.903854 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85cpl\" (UniqueName: \"kubernetes.io/projected/cea54721-c31e-48c1-93fd-ba0b9efb5189-kube-api-access-85cpl\") pod \"cloudkitty-96fc-account-create-update-4nd9x\" (UID: \"cea54721-c31e-48c1-93fd-ba0b9efb5189\") " pod="openstack/cloudkitty-96fc-account-create-update-4nd9x" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.904652 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-bd1d-account-create-update-b5krp" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.913625 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.963807 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-bd1d-account-create-update-b5krp"] Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.968792 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a65bb3b2-f470-47b3-9031-35a6f9dbc930-operator-scripts\") pod \"cinder-f6af-account-create-update-zqlks\" (UID: \"a65bb3b2-f470-47b3-9031-35a6f9dbc930\") " pod="openstack/cinder-f6af-account-create-update-zqlks" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.969116 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlzsm\" (UniqueName: \"kubernetes.io/projected/a65bb3b2-f470-47b3-9031-35a6f9dbc930-kube-api-access-dlzsm\") pod \"cinder-f6af-account-create-update-zqlks\" (UID: \"a65bb3b2-f470-47b3-9031-35a6f9dbc930\") " pod="openstack/cinder-f6af-account-create-update-zqlks" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.969829 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a65bb3b2-f470-47b3-9031-35a6f9dbc930-operator-scripts\") pod \"cinder-f6af-account-create-update-zqlks\" (UID: \"a65bb3b2-f470-47b3-9031-35a6f9dbc930\") " pod="openstack/cinder-f6af-account-create-update-zqlks" Dec 03 14:34:51 crc kubenswrapper[4751]: I1203 14:34:51.987253 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-wbt2f"] Dec 03 14:34:52 crc kubenswrapper[4751]: I1203 14:34:52.056569 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlzsm\" 
(UniqueName: \"kubernetes.io/projected/a65bb3b2-f470-47b3-9031-35a6f9dbc930-kube-api-access-dlzsm\") pod \"cinder-f6af-account-create-update-zqlks\" (UID: \"a65bb3b2-f470-47b3-9031-35a6f9dbc930\") " pod="openstack/cinder-f6af-account-create-update-zqlks" Dec 03 14:34:52 crc kubenswrapper[4751]: I1203 14:34:52.080923 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ce7be68-ea3a-4d2f-b875-9db739a50a8e-operator-scripts\") pod \"neutron-db-create-wbt2f\" (UID: \"0ce7be68-ea3a-4d2f-b875-9db739a50a8e\") " pod="openstack/neutron-db-create-wbt2f" Dec 03 14:34:52 crc kubenswrapper[4751]: I1203 14:34:52.081039 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efcae2e5-8f91-40b7-842a-990dd6b13c66-operator-scripts\") pod \"neutron-bd1d-account-create-update-b5krp\" (UID: \"efcae2e5-8f91-40b7-842a-990dd6b13c66\") " pod="openstack/neutron-bd1d-account-create-update-b5krp" Dec 03 14:34:52 crc kubenswrapper[4751]: I1203 14:34:52.081065 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64vhm\" (UniqueName: \"kubernetes.io/projected/0ce7be68-ea3a-4d2f-b875-9db739a50a8e-kube-api-access-64vhm\") pod \"neutron-db-create-wbt2f\" (UID: \"0ce7be68-ea3a-4d2f-b875-9db739a50a8e\") " pod="openstack/neutron-db-create-wbt2f" Dec 03 14:34:52 crc kubenswrapper[4751]: I1203 14:34:52.081144 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5q5t\" (UniqueName: \"kubernetes.io/projected/efcae2e5-8f91-40b7-842a-990dd6b13c66-kube-api-access-d5q5t\") pod \"neutron-bd1d-account-create-update-b5krp\" (UID: \"efcae2e5-8f91-40b7-842a-990dd6b13c66\") " pod="openstack/neutron-bd1d-account-create-update-b5krp" Dec 03 14:34:52 crc kubenswrapper[4751]: I1203 
14:34:52.138564 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-6n49f" Dec 03 14:34:52 crc kubenswrapper[4751]: I1203 14:34:52.164010 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-96fc-account-create-update-4nd9x" Dec 03 14:34:52 crc kubenswrapper[4751]: I1203 14:34:52.183084 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efcae2e5-8f91-40b7-842a-990dd6b13c66-operator-scripts\") pod \"neutron-bd1d-account-create-update-b5krp\" (UID: \"efcae2e5-8f91-40b7-842a-990dd6b13c66\") " pod="openstack/neutron-bd1d-account-create-update-b5krp" Dec 03 14:34:52 crc kubenswrapper[4751]: I1203 14:34:52.183773 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64vhm\" (UniqueName: \"kubernetes.io/projected/0ce7be68-ea3a-4d2f-b875-9db739a50a8e-kube-api-access-64vhm\") pod \"neutron-db-create-wbt2f\" (UID: \"0ce7be68-ea3a-4d2f-b875-9db739a50a8e\") " pod="openstack/neutron-db-create-wbt2f" Dec 03 14:34:52 crc kubenswrapper[4751]: I1203 14:34:52.183844 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5q5t\" (UniqueName: \"kubernetes.io/projected/efcae2e5-8f91-40b7-842a-990dd6b13c66-kube-api-access-d5q5t\") pod \"neutron-bd1d-account-create-update-b5krp\" (UID: \"efcae2e5-8f91-40b7-842a-990dd6b13c66\") " pod="openstack/neutron-bd1d-account-create-update-b5krp" Dec 03 14:34:52 crc kubenswrapper[4751]: I1203 14:34:52.183901 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ce7be68-ea3a-4d2f-b875-9db739a50a8e-operator-scripts\") pod \"neutron-db-create-wbt2f\" (UID: \"0ce7be68-ea3a-4d2f-b875-9db739a50a8e\") " pod="openstack/neutron-db-create-wbt2f" Dec 03 14:34:52 crc kubenswrapper[4751]: 
I1203 14:34:52.183740 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efcae2e5-8f91-40b7-842a-990dd6b13c66-operator-scripts\") pod \"neutron-bd1d-account-create-update-b5krp\" (UID: \"efcae2e5-8f91-40b7-842a-990dd6b13c66\") " pod="openstack/neutron-bd1d-account-create-update-b5krp" Dec 03 14:34:52 crc kubenswrapper[4751]: I1203 14:34:52.184450 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ce7be68-ea3a-4d2f-b875-9db739a50a8e-operator-scripts\") pod \"neutron-db-create-wbt2f\" (UID: \"0ce7be68-ea3a-4d2f-b875-9db739a50a8e\") " pod="openstack/neutron-db-create-wbt2f" Dec 03 14:34:52 crc kubenswrapper[4751]: I1203 14:34:52.192304 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f6af-account-create-update-zqlks" Dec 03 14:34:52 crc kubenswrapper[4751]: I1203 14:34:52.204667 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64vhm\" (UniqueName: \"kubernetes.io/projected/0ce7be68-ea3a-4d2f-b875-9db739a50a8e-kube-api-access-64vhm\") pod \"neutron-db-create-wbt2f\" (UID: \"0ce7be68-ea3a-4d2f-b875-9db739a50a8e\") " pod="openstack/neutron-db-create-wbt2f" Dec 03 14:34:52 crc kubenswrapper[4751]: I1203 14:34:52.204875 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5q5t\" (UniqueName: \"kubernetes.io/projected/efcae2e5-8f91-40b7-842a-990dd6b13c66-kube-api-access-d5q5t\") pod \"neutron-bd1d-account-create-update-b5krp\" (UID: \"efcae2e5-8f91-40b7-842a-990dd6b13c66\") " pod="openstack/neutron-bd1d-account-create-update-b5krp" Dec 03 14:34:52 crc kubenswrapper[4751]: I1203 14:34:52.231482 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-wbt2f" Dec 03 14:34:52 crc kubenswrapper[4751]: I1203 14:34:52.246934 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-bd1d-account-create-update-b5krp" Dec 03 14:34:52 crc kubenswrapper[4751]: I1203 14:34:52.363649 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-xtvzh"] Dec 03 14:34:53 crc kubenswrapper[4751]: I1203 14:34:52.549390 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-ea6b-account-create-update-rslqt"] Dec 03 14:34:53 crc kubenswrapper[4751]: I1203 14:34:52.584019 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-xtvzh" event={"ID":"c838d53c-d56e-4cfc-a15a-fb91dccb5dbd","Type":"ContainerStarted","Data":"deaa70c97f204f6c2cf1d13fb83e6e7f2084f6109121058bd5afbb59025f140e"} Dec 03 14:34:53 crc kubenswrapper[4751]: I1203 14:34:52.597160 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2","Type":"ContainerStarted","Data":"ebbb73f5480dc6d5dc8ffe6c0ba0bd3153fd24eeb24c8ac03f501c2ddbe0aa94"} Dec 03 14:34:53 crc kubenswrapper[4751]: I1203 14:34:52.610469 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-5j7m7"] Dec 03 14:34:53 crc kubenswrapper[4751]: W1203 14:34:52.623114 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc533900_0d24_4473_83fe_7653f335a1a9.slice/crio-e8e09aeae28087bcca040b4d9a53b887f43982ea77db27ad98ff70e587d9f4a8 WatchSource:0}: Error finding container e8e09aeae28087bcca040b4d9a53b887f43982ea77db27ad98ff70e587d9f4a8: Status 404 returned error can't find the container with id e8e09aeae28087bcca040b4d9a53b887f43982ea77db27ad98ff70e587d9f4a8 Dec 03 14:34:53 crc kubenswrapper[4751]: W1203 14:34:52.641886 4751 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3a39d7c_ae64_4afa_8426_5cc5d9fc78e0.slice/crio-0694a022c0429bfbbb0f23dd9de24ae943e12273760b90b44a2319342887cc4d WatchSource:0}: Error finding container 0694a022c0429bfbbb0f23dd9de24ae943e12273760b90b44a2319342887cc4d: Status 404 returned error can't find the container with id 0694a022c0429bfbbb0f23dd9de24ae943e12273760b90b44a2319342887cc4d Dec 03 14:34:53 crc kubenswrapper[4751]: I1203 14:34:53.376420 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-96fc-account-create-update-4nd9x"] Dec 03 14:34:53 crc kubenswrapper[4751]: I1203 14:34:53.708230 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2","Type":"ContainerStarted","Data":"3ebde721b76fcbaf8410ef6e4cebc2669cb82a2a5df1f9583de3c44d092cc83b"} Dec 03 14:34:53 crc kubenswrapper[4751]: I1203 14:34:53.708282 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2","Type":"ContainerStarted","Data":"58c368569649bd38d0c0391f5650fc98a5b305353f1f4bc70b23e977a269e110"} Dec 03 14:34:53 crc kubenswrapper[4751]: I1203 14:34:53.708294 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2","Type":"ContainerStarted","Data":"7ca00d443ee00c64d69645abb87fe984d10ad4b9651c541efa85b40a2db480b2"} Dec 03 14:34:53 crc kubenswrapper[4751]: I1203 14:34:53.709878 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5j7m7" event={"ID":"a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0","Type":"ContainerStarted","Data":"dcf86151c91f3377fdb3af738f8ad26c4ce523b3540c2fd36e1ac37462e19631"} Dec 03 14:34:53 crc kubenswrapper[4751]: I1203 14:34:53.709912 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-db-create-5j7m7" event={"ID":"a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0","Type":"ContainerStarted","Data":"0694a022c0429bfbbb0f23dd9de24ae943e12273760b90b44a2319342887cc4d"} Dec 03 14:34:53 crc kubenswrapper[4751]: I1203 14:34:53.722623 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-96fc-account-create-update-4nd9x" event={"ID":"cea54721-c31e-48c1-93fd-ba0b9efb5189","Type":"ContainerStarted","Data":"8a2c0e3dd5ba3d266c7d3b917e2e35d1275ed7a62d913f25c59428938b721512"} Dec 03 14:34:53 crc kubenswrapper[4751]: I1203 14:34:53.739635 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ea6b-account-create-update-rslqt" event={"ID":"dc533900-0d24-4473-83fe-7653f335a1a9","Type":"ContainerStarted","Data":"943207f8bee76a544dfcefd1215a3a958b2fcf485c82c9fc006f4e2c19b307f3"} Dec 03 14:34:53 crc kubenswrapper[4751]: I1203 14:34:53.739674 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ea6b-account-create-update-rslqt" event={"ID":"dc533900-0d24-4473-83fe-7653f335a1a9","Type":"ContainerStarted","Data":"e8e09aeae28087bcca040b4d9a53b887f43982ea77db27ad98ff70e587d9f4a8"} Dec 03 14:34:53 crc kubenswrapper[4751]: I1203 14:34:53.748780 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-xtvzh" event={"ID":"c838d53c-d56e-4cfc-a15a-fb91dccb5dbd","Type":"ContainerStarted","Data":"c78f2a2a892dfa058d90620129ac2102adcab0af7f9a041586dd6dd61088d9cf"} Dec 03 14:34:53 crc kubenswrapper[4751]: I1203 14:34:53.758644 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-5j7m7" podStartSLOduration=2.758624648 podStartE2EDuration="2.758624648s" podCreationTimestamp="2025-12-03 14:34:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:34:53.740967457 +0000 UTC m=+1300.729322674" 
watchObservedRunningTime="2025-12-03 14:34:53.758624648 +0000 UTC m=+1300.746979865" Dec 03 14:34:53 crc kubenswrapper[4751]: I1203 14:34:53.780604 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-ea6b-account-create-update-rslqt" podStartSLOduration=2.780586887 podStartE2EDuration="2.780586887s" podCreationTimestamp="2025-12-03 14:34:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:34:53.776077014 +0000 UTC m=+1300.764432241" watchObservedRunningTime="2025-12-03 14:34:53.780586887 +0000 UTC m=+1300.768942094" Dec 03 14:34:53 crc kubenswrapper[4751]: I1203 14:34:53.805168 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-xtvzh" podStartSLOduration=2.805148157 podStartE2EDuration="2.805148157s" podCreationTimestamp="2025-12-03 14:34:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:34:53.79793327 +0000 UTC m=+1300.786288497" watchObservedRunningTime="2025-12-03 14:34:53.805148157 +0000 UTC m=+1300.793503374" Dec 03 14:34:53 crc kubenswrapper[4751]: I1203 14:34:53.941245 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-6n49f"] Dec 03 14:34:53 crc kubenswrapper[4751]: I1203 14:34:53.980750 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f6af-account-create-update-zqlks"] Dec 03 14:34:53 crc kubenswrapper[4751]: I1203 14:34:53.988749 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-bd1d-account-create-update-b5krp"] Dec 03 14:34:53 crc kubenswrapper[4751]: I1203 14:34:53.997063 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-cgd8h"] Dec 03 14:34:54 crc kubenswrapper[4751]: I1203 14:34:54.005320 4751 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/neutron-db-create-wbt2f"] Dec 03 14:34:54 crc kubenswrapper[4751]: W1203 14:34:54.023908 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod124c2c16_935c_4bfc_9b53_11ffa90ed441.slice/crio-bbfb1d71b56439532a615fdf59e0894753a41b88606721b65d839a12d6782de1 WatchSource:0}: Error finding container bbfb1d71b56439532a615fdf59e0894753a41b88606721b65d839a12d6782de1: Status 404 returned error can't find the container with id bbfb1d71b56439532a615fdf59e0894753a41b88606721b65d839a12d6782de1 Dec 03 14:34:54 crc kubenswrapper[4751]: I1203 14:34:54.762930 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-6n49f" event={"ID":"a2fa3dba-8033-4b02-b75b-6052face2364","Type":"ContainerStarted","Data":"626687ab689ee9adb3dd48274f4bca4796520f94b82e6c6e3f58fd9f861a0f2f"} Dec 03 14:34:54 crc kubenswrapper[4751]: I1203 14:34:54.764650 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-wbt2f" event={"ID":"0ce7be68-ea3a-4d2f-b875-9db739a50a8e","Type":"ContainerStarted","Data":"cf34788ed39383d962796a59a8be7152c12e3ef880d73ab6cc9453a7e3a8c1f1"} Dec 03 14:34:54 crc kubenswrapper[4751]: I1203 14:34:54.765871 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f6af-account-create-update-zqlks" event={"ID":"a65bb3b2-f470-47b3-9031-35a6f9dbc930","Type":"ContainerStarted","Data":"e112bc7241acb7964c4db561f966ea63fe5768bedb6d08a0019e52dcb41681d3"} Dec 03 14:34:54 crc kubenswrapper[4751]: I1203 14:34:54.766889 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bd1d-account-create-update-b5krp" event={"ID":"efcae2e5-8f91-40b7-842a-990dd6b13c66","Type":"ContainerStarted","Data":"ee76597249467c972b8962d03da8deea99c5620f2bb64d9ec80a362712f800bb"} Dec 03 14:34:54 crc kubenswrapper[4751]: I1203 14:34:54.767933 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-sync-cgd8h" event={"ID":"124c2c16-935c-4bfc-9b53-11ffa90ed441","Type":"ContainerStarted","Data":"bbfb1d71b56439532a615fdf59e0894753a41b88606721b65d839a12d6782de1"} Dec 03 14:34:55 crc kubenswrapper[4751]: I1203 14:34:55.782774 4751 generic.go:334] "Generic (PLEG): container finished" podID="c838d53c-d56e-4cfc-a15a-fb91dccb5dbd" containerID="c78f2a2a892dfa058d90620129ac2102adcab0af7f9a041586dd6dd61088d9cf" exitCode=0 Dec 03 14:34:55 crc kubenswrapper[4751]: I1203 14:34:55.783652 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-xtvzh" event={"ID":"c838d53c-d56e-4cfc-a15a-fb91dccb5dbd","Type":"ContainerDied","Data":"c78f2a2a892dfa058d90620129ac2102adcab0af7f9a041586dd6dd61088d9cf"} Dec 03 14:34:55 crc kubenswrapper[4751]: I1203 14:34:55.790136 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-wbt2f" event={"ID":"0ce7be68-ea3a-4d2f-b875-9db739a50a8e","Type":"ContainerStarted","Data":"a21e3be16085a36b9d730c18dbbb117c4a4510686de75a90f1c97f8617afea25"} Dec 03 14:34:55 crc kubenswrapper[4751]: I1203 14:34:55.823951 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bd1d-account-create-update-b5krp" event={"ID":"efcae2e5-8f91-40b7-842a-990dd6b13c66","Type":"ContainerStarted","Data":"71909259f7b71b1781f3d3ddb0389c5602b4634d4958fbae473cb1605e015204"} Dec 03 14:34:55 crc kubenswrapper[4751]: I1203 14:34:55.837717 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-wbt2f" podStartSLOduration=4.837684557 podStartE2EDuration="4.837684557s" podCreationTimestamp="2025-12-03 14:34:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:34:55.8333963 +0000 UTC m=+1302.821751527" watchObservedRunningTime="2025-12-03 14:34:55.837684557 +0000 UTC m=+1302.826039784" Dec 03 14:34:55 crc kubenswrapper[4751]: 
I1203 14:34:55.849999 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2","Type":"ContainerStarted","Data":"73c069f6ca86688ab290c9d967e329bee0a7ac33885b18892e9a06d0e5b09ddb"} Dec 03 14:34:55 crc kubenswrapper[4751]: I1203 14:34:55.852531 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-6n49f" event={"ID":"a2fa3dba-8033-4b02-b75b-6052face2364","Type":"ContainerStarted","Data":"a611f162d339b901e11e0e03437effbbbe9ed2dce020b7a47ad35124ab7aa5b7"} Dec 03 14:34:55 crc kubenswrapper[4751]: I1203 14:34:55.854620 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f6af-account-create-update-zqlks" event={"ID":"a65bb3b2-f470-47b3-9031-35a6f9dbc930","Type":"ContainerStarted","Data":"78157396ad3252c0c758b997916424c7bd61d4dfa3fd1d188ba326ec5a4965ec"} Dec 03 14:34:55 crc kubenswrapper[4751]: I1203 14:34:55.865844 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-bd1d-account-create-update-b5krp" podStartSLOduration=4.865815853 podStartE2EDuration="4.865815853s" podCreationTimestamp="2025-12-03 14:34:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:34:55.850454505 +0000 UTC m=+1302.838809722" watchObservedRunningTime="2025-12-03 14:34:55.865815853 +0000 UTC m=+1302.854171070" Dec 03 14:34:55 crc kubenswrapper[4751]: I1203 14:34:55.872283 4751 generic.go:334] "Generic (PLEG): container finished" podID="a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0" containerID="dcf86151c91f3377fdb3af738f8ad26c4ce523b3540c2fd36e1ac37462e19631" exitCode=0 Dec 03 14:34:55 crc kubenswrapper[4751]: I1203 14:34:55.872612 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5j7m7" 
event={"ID":"a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0","Type":"ContainerDied","Data":"dcf86151c91f3377fdb3af738f8ad26c4ce523b3540c2fd36e1ac37462e19631"} Dec 03 14:34:55 crc kubenswrapper[4751]: I1203 14:34:55.874704 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-create-6n49f" podStartSLOduration=4.874682335 podStartE2EDuration="4.874682335s" podCreationTimestamp="2025-12-03 14:34:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:34:55.86789101 +0000 UTC m=+1302.856246227" watchObservedRunningTime="2025-12-03 14:34:55.874682335 +0000 UTC m=+1302.863037552" Dec 03 14:34:55 crc kubenswrapper[4751]: I1203 14:34:55.876176 4751 generic.go:334] "Generic (PLEG): container finished" podID="b40b7285-42c6-4278-8d86-69847e549907" containerID="91f9c3839b6719e4426187f8eabef95088cde5c9602e6dd77586ef9ff9630a56" exitCode=0 Dec 03 14:34:55 crc kubenswrapper[4751]: I1203 14:34:55.876231 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b40b7285-42c6-4278-8d86-69847e549907","Type":"ContainerDied","Data":"91f9c3839b6719e4426187f8eabef95088cde5c9602e6dd77586ef9ff9630a56"} Dec 03 14:34:55 crc kubenswrapper[4751]: I1203 14:34:55.889214 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-96fc-account-create-update-4nd9x" event={"ID":"cea54721-c31e-48c1-93fd-ba0b9efb5189","Type":"ContainerStarted","Data":"f7bc41aba9ec034f2c6b6385ae9eeb823d3d608952ed3bab1df3abce42b7c97a"} Dec 03 14:34:55 crc kubenswrapper[4751]: I1203 14:34:55.891572 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-f6af-account-create-update-zqlks" podStartSLOduration=4.891551635 podStartE2EDuration="4.891551635s" podCreationTimestamp="2025-12-03 14:34:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:34:55.884893154 +0000 UTC m=+1302.873248371" watchObservedRunningTime="2025-12-03 14:34:55.891551635 +0000 UTC m=+1302.879906862" Dec 03 14:34:56 crc kubenswrapper[4751]: I1203 14:34:56.899659 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b40b7285-42c6-4278-8d86-69847e549907","Type":"ContainerStarted","Data":"f38f78270fe29b277531b28b4ba8d29de794b6b97f2810a733e3e23cc9177ec2"} Dec 03 14:34:56 crc kubenswrapper[4751]: I1203 14:34:56.901540 4751 generic.go:334] "Generic (PLEG): container finished" podID="cea54721-c31e-48c1-93fd-ba0b9efb5189" containerID="f7bc41aba9ec034f2c6b6385ae9eeb823d3d608952ed3bab1df3abce42b7c97a" exitCode=0 Dec 03 14:34:56 crc kubenswrapper[4751]: I1203 14:34:56.901611 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-96fc-account-create-update-4nd9x" event={"ID":"cea54721-c31e-48c1-93fd-ba0b9efb5189","Type":"ContainerDied","Data":"f7bc41aba9ec034f2c6b6385ae9eeb823d3d608952ed3bab1df3abce42b7c97a"} Dec 03 14:34:56 crc kubenswrapper[4751]: I1203 14:34:56.903830 4751 generic.go:334] "Generic (PLEG): container finished" podID="dc533900-0d24-4473-83fe-7653f335a1a9" containerID="943207f8bee76a544dfcefd1215a3a958b2fcf485c82c9fc006f4e2c19b307f3" exitCode=0 Dec 03 14:34:56 crc kubenswrapper[4751]: I1203 14:34:56.903873 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ea6b-account-create-update-rslqt" event={"ID":"dc533900-0d24-4473-83fe-7653f335a1a9","Type":"ContainerDied","Data":"943207f8bee76a544dfcefd1215a3a958b2fcf485c82c9fc006f4e2c19b307f3"} Dec 03 14:34:56 crc kubenswrapper[4751]: I1203 14:34:56.906943 4751 generic.go:334] "Generic (PLEG): container finished" podID="a2fa3dba-8033-4b02-b75b-6052face2364" containerID="a611f162d339b901e11e0e03437effbbbe9ed2dce020b7a47ad35124ab7aa5b7" exitCode=0 Dec 03 14:34:56 crc kubenswrapper[4751]: I1203 
14:34:56.906974 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-6n49f" event={"ID":"a2fa3dba-8033-4b02-b75b-6052face2364","Type":"ContainerDied","Data":"a611f162d339b901e11e0e03437effbbbe9ed2dce020b7a47ad35124ab7aa5b7"} Dec 03 14:34:56 crc kubenswrapper[4751]: I1203 14:34:56.908871 4751 generic.go:334] "Generic (PLEG): container finished" podID="0ce7be68-ea3a-4d2f-b875-9db739a50a8e" containerID="a21e3be16085a36b9d730c18dbbb117c4a4510686de75a90f1c97f8617afea25" exitCode=0 Dec 03 14:34:56 crc kubenswrapper[4751]: I1203 14:34:56.908920 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-wbt2f" event={"ID":"0ce7be68-ea3a-4d2f-b875-9db739a50a8e","Type":"ContainerDied","Data":"a21e3be16085a36b9d730c18dbbb117c4a4510686de75a90f1c97f8617afea25"} Dec 03 14:34:56 crc kubenswrapper[4751]: I1203 14:34:56.910210 4751 generic.go:334] "Generic (PLEG): container finished" podID="a65bb3b2-f470-47b3-9031-35a6f9dbc930" containerID="78157396ad3252c0c758b997916424c7bd61d4dfa3fd1d188ba326ec5a4965ec" exitCode=0 Dec 03 14:34:56 crc kubenswrapper[4751]: I1203 14:34:56.910258 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f6af-account-create-update-zqlks" event={"ID":"a65bb3b2-f470-47b3-9031-35a6f9dbc930","Type":"ContainerDied","Data":"78157396ad3252c0c758b997916424c7bd61d4dfa3fd1d188ba326ec5a4965ec"} Dec 03 14:34:56 crc kubenswrapper[4751]: I1203 14:34:56.911963 4751 generic.go:334] "Generic (PLEG): container finished" podID="efcae2e5-8f91-40b7-842a-990dd6b13c66" containerID="71909259f7b71b1781f3d3ddb0389c5602b4634d4958fbae473cb1605e015204" exitCode=0 Dec 03 14:34:56 crc kubenswrapper[4751]: I1203 14:34:56.912003 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bd1d-account-create-update-b5krp" event={"ID":"efcae2e5-8f91-40b7-842a-990dd6b13c66","Type":"ContainerDied","Data":"71909259f7b71b1781f3d3ddb0389c5602b4634d4958fbae473cb1605e015204"} Dec 03 
14:34:56 crc kubenswrapper[4751]: I1203 14:34:56.923289 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2","Type":"ContainerStarted","Data":"38a2e270de4c6a469a3ee90b1ba5a6168d8f430053d56b911e1cb3199be6effa"} Dec 03 14:34:56 crc kubenswrapper[4751]: I1203 14:34:56.923339 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2","Type":"ContainerStarted","Data":"3570d60bcfca784997d2a28f049bdd8e68f94f1fc3f66516c3cf6d6364319cea"} Dec 03 14:34:57 crc kubenswrapper[4751]: I1203 14:34:57.054816 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=45.2476862 podStartE2EDuration="51.054788871s" podCreationTimestamp="2025-12-03 14:34:06 +0000 UTC" firstStartedPulling="2025-12-03 14:34:45.879530811 +0000 UTC m=+1292.867886028" lastFinishedPulling="2025-12-03 14:34:51.686633482 +0000 UTC m=+1298.674988699" observedRunningTime="2025-12-03 14:34:57.042874607 +0000 UTC m=+1304.031229824" watchObservedRunningTime="2025-12-03 14:34:57.054788871 +0000 UTC m=+1304.043144088" Dec 03 14:34:57 crc kubenswrapper[4751]: I1203 14:34:57.422038 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-xk8jg"] Dec 03 14:34:57 crc kubenswrapper[4751]: I1203 14:34:57.424102 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-xk8jg" Dec 03 14:34:57 crc kubenswrapper[4751]: I1203 14:34:57.430375 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 03 14:34:57 crc kubenswrapper[4751]: I1203 14:34:57.444930 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-xk8jg"] Dec 03 14:34:57 crc kubenswrapper[4751]: I1203 14:34:57.560114 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j869f\" (UniqueName: \"kubernetes.io/projected/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-kube-api-access-j869f\") pod \"dnsmasq-dns-5c79d794d7-xk8jg\" (UID: \"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30\") " pod="openstack/dnsmasq-dns-5c79d794d7-xk8jg" Dec 03 14:34:57 crc kubenswrapper[4751]: I1203 14:34:57.560177 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-xk8jg\" (UID: \"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30\") " pod="openstack/dnsmasq-dns-5c79d794d7-xk8jg" Dec 03 14:34:57 crc kubenswrapper[4751]: I1203 14:34:57.560219 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-xk8jg\" (UID: \"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30\") " pod="openstack/dnsmasq-dns-5c79d794d7-xk8jg" Dec 03 14:34:57 crc kubenswrapper[4751]: I1203 14:34:57.560239 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-xk8jg\" (UID: \"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30\") " 
pod="openstack/dnsmasq-dns-5c79d794d7-xk8jg" Dec 03 14:34:57 crc kubenswrapper[4751]: I1203 14:34:57.560277 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-config\") pod \"dnsmasq-dns-5c79d794d7-xk8jg\" (UID: \"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30\") " pod="openstack/dnsmasq-dns-5c79d794d7-xk8jg" Dec 03 14:34:57 crc kubenswrapper[4751]: I1203 14:34:57.560367 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-xk8jg\" (UID: \"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30\") " pod="openstack/dnsmasq-dns-5c79d794d7-xk8jg" Dec 03 14:34:57 crc kubenswrapper[4751]: I1203 14:34:57.662613 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-config\") pod \"dnsmasq-dns-5c79d794d7-xk8jg\" (UID: \"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30\") " pod="openstack/dnsmasq-dns-5c79d794d7-xk8jg" Dec 03 14:34:57 crc kubenswrapper[4751]: I1203 14:34:57.662754 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-xk8jg\" (UID: \"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30\") " pod="openstack/dnsmasq-dns-5c79d794d7-xk8jg" Dec 03 14:34:57 crc kubenswrapper[4751]: I1203 14:34:57.662810 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j869f\" (UniqueName: \"kubernetes.io/projected/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-kube-api-access-j869f\") pod \"dnsmasq-dns-5c79d794d7-xk8jg\" (UID: \"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30\") " 
pod="openstack/dnsmasq-dns-5c79d794d7-xk8jg" Dec 03 14:34:57 crc kubenswrapper[4751]: I1203 14:34:57.662862 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-xk8jg\" (UID: \"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30\") " pod="openstack/dnsmasq-dns-5c79d794d7-xk8jg" Dec 03 14:34:57 crc kubenswrapper[4751]: I1203 14:34:57.662906 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-xk8jg\" (UID: \"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30\") " pod="openstack/dnsmasq-dns-5c79d794d7-xk8jg" Dec 03 14:34:57 crc kubenswrapper[4751]: I1203 14:34:57.662931 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-xk8jg\" (UID: \"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30\") " pod="openstack/dnsmasq-dns-5c79d794d7-xk8jg" Dec 03 14:34:57 crc kubenswrapper[4751]: I1203 14:34:57.664564 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-config\") pod \"dnsmasq-dns-5c79d794d7-xk8jg\" (UID: \"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30\") " pod="openstack/dnsmasq-dns-5c79d794d7-xk8jg" Dec 03 14:34:57 crc kubenswrapper[4751]: I1203 14:34:57.664689 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-xk8jg\" (UID: \"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30\") " pod="openstack/dnsmasq-dns-5c79d794d7-xk8jg" Dec 03 14:34:57 crc kubenswrapper[4751]: 
I1203 14:34:57.665667 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-xk8jg\" (UID: \"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30\") " pod="openstack/dnsmasq-dns-5c79d794d7-xk8jg" Dec 03 14:34:57 crc kubenswrapper[4751]: I1203 14:34:57.665764 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-xk8jg\" (UID: \"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30\") " pod="openstack/dnsmasq-dns-5c79d794d7-xk8jg" Dec 03 14:34:57 crc kubenswrapper[4751]: I1203 14:34:57.665799 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-xk8jg\" (UID: \"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30\") " pod="openstack/dnsmasq-dns-5c79d794d7-xk8jg" Dec 03 14:34:57 crc kubenswrapper[4751]: I1203 14:34:57.685715 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j869f\" (UniqueName: \"kubernetes.io/projected/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-kube-api-access-j869f\") pod \"dnsmasq-dns-5c79d794d7-xk8jg\" (UID: \"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30\") " pod="openstack/dnsmasq-dns-5c79d794d7-xk8jg" Dec 03 14:34:57 crc kubenswrapper[4751]: I1203 14:34:57.759408 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-xk8jg" Dec 03 14:34:57 crc kubenswrapper[4751]: I1203 14:34:57.935206 4751 generic.go:334] "Generic (PLEG): container finished" podID="5ea81b69-95de-4772-b7bf-d48f52c298b1" containerID="e1746dd96f8c08d0da13592803ced15a8ff13a0de0f9381bab3eb3d05be54272" exitCode=0 Dec 03 14:34:57 crc kubenswrapper[4751]: I1203 14:34:57.935316 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hmq8c" event={"ID":"5ea81b69-95de-4772-b7bf-d48f52c298b1","Type":"ContainerDied","Data":"e1746dd96f8c08d0da13592803ced15a8ff13a0de0f9381bab3eb3d05be54272"} Dec 03 14:34:59 crc kubenswrapper[4751]: I1203 14:34:59.956024 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b40b7285-42c6-4278-8d86-69847e549907","Type":"ContainerStarted","Data":"5627e2dcbbe09b05761cb28bd2d1751e8d23f0c4dfe6554b9a339f87b5935d6c"} Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.382452 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ea6b-account-create-update-rslqt" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.408546 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-96fc-account-create-update-4nd9x" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.430089 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-xtvzh" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.444835 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-5j7m7" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.452355 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-wbt2f" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.463408 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f6af-account-create-update-zqlks" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.464047 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5d2t\" (UniqueName: \"kubernetes.io/projected/dc533900-0d24-4473-83fe-7653f335a1a9-kube-api-access-j5d2t\") pod \"dc533900-0d24-4473-83fe-7653f335a1a9\" (UID: \"dc533900-0d24-4473-83fe-7653f335a1a9\") " Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.464200 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85cpl\" (UniqueName: \"kubernetes.io/projected/cea54721-c31e-48c1-93fd-ba0b9efb5189-kube-api-access-85cpl\") pod \"cea54721-c31e-48c1-93fd-ba0b9efb5189\" (UID: \"cea54721-c31e-48c1-93fd-ba0b9efb5189\") " Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.464251 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cea54721-c31e-48c1-93fd-ba0b9efb5189-operator-scripts\") pod \"cea54721-c31e-48c1-93fd-ba0b9efb5189\" (UID: \"cea54721-c31e-48c1-93fd-ba0b9efb5189\") " Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.464338 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc533900-0d24-4473-83fe-7653f335a1a9-operator-scripts\") pod \"dc533900-0d24-4473-83fe-7653f335a1a9\" (UID: \"dc533900-0d24-4473-83fe-7653f335a1a9\") " Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.465348 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc533900-0d24-4473-83fe-7653f335a1a9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"dc533900-0d24-4473-83fe-7653f335a1a9" (UID: "dc533900-0d24-4473-83fe-7653f335a1a9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.465694 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cea54721-c31e-48c1-93fd-ba0b9efb5189-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cea54721-c31e-48c1-93fd-ba0b9efb5189" (UID: "cea54721-c31e-48c1-93fd-ba0b9efb5189"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.476206 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc533900-0d24-4473-83fe-7653f335a1a9-kube-api-access-j5d2t" (OuterVolumeSpecName: "kube-api-access-j5d2t") pod "dc533900-0d24-4473-83fe-7653f335a1a9" (UID: "dc533900-0d24-4473-83fe-7653f335a1a9"). InnerVolumeSpecName "kube-api-access-j5d2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.476621 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cea54721-c31e-48c1-93fd-ba0b9efb5189-kube-api-access-85cpl" (OuterVolumeSpecName: "kube-api-access-85cpl") pod "cea54721-c31e-48c1-93fd-ba0b9efb5189" (UID: "cea54721-c31e-48c1-93fd-ba0b9efb5189"). InnerVolumeSpecName "kube-api-access-85cpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.501740 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-6n49f" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.562718 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-hmq8c" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.563369 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-bd1d-account-create-update-b5krp" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.567024 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sdpj\" (UniqueName: \"kubernetes.io/projected/a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0-kube-api-access-8sdpj\") pod \"a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0\" (UID: \"a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0\") " Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.567129 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0-operator-scripts\") pod \"a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0\" (UID: \"a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0\") " Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.567164 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlzsm\" (UniqueName: \"kubernetes.io/projected/a65bb3b2-f470-47b3-9031-35a6f9dbc930-kube-api-access-dlzsm\") pod \"a65bb3b2-f470-47b3-9031-35a6f9dbc930\" (UID: \"a65bb3b2-f470-47b3-9031-35a6f9dbc930\") " Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.567260 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ce7be68-ea3a-4d2f-b875-9db739a50a8e-operator-scripts\") pod \"0ce7be68-ea3a-4d2f-b875-9db739a50a8e\" (UID: \"0ce7be68-ea3a-4d2f-b875-9db739a50a8e\") " Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.567286 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swfnd\" (UniqueName: \"kubernetes.io/projected/a2fa3dba-8033-4b02-b75b-6052face2364-kube-api-access-swfnd\") pod 
\"a2fa3dba-8033-4b02-b75b-6052face2364\" (UID: \"a2fa3dba-8033-4b02-b75b-6052face2364\") " Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.567607 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2fa3dba-8033-4b02-b75b-6052face2364-operator-scripts\") pod \"a2fa3dba-8033-4b02-b75b-6052face2364\" (UID: \"a2fa3dba-8033-4b02-b75b-6052face2364\") " Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.567682 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c838d53c-d56e-4cfc-a15a-fb91dccb5dbd-operator-scripts\") pod \"c838d53c-d56e-4cfc-a15a-fb91dccb5dbd\" (UID: \"c838d53c-d56e-4cfc-a15a-fb91dccb5dbd\") " Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.567684 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0" (UID: "a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.567731 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64vhm\" (UniqueName: \"kubernetes.io/projected/0ce7be68-ea3a-4d2f-b875-9db739a50a8e-kube-api-access-64vhm\") pod \"0ce7be68-ea3a-4d2f-b875-9db739a50a8e\" (UID: \"0ce7be68-ea3a-4d2f-b875-9db739a50a8e\") " Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.567798 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spd9l\" (UniqueName: \"kubernetes.io/projected/c838d53c-d56e-4cfc-a15a-fb91dccb5dbd-kube-api-access-spd9l\") pod \"c838d53c-d56e-4cfc-a15a-fb91dccb5dbd\" (UID: \"c838d53c-d56e-4cfc-a15a-fb91dccb5dbd\") " Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.567835 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a65bb3b2-f470-47b3-9031-35a6f9dbc930-operator-scripts\") pod \"a65bb3b2-f470-47b3-9031-35a6f9dbc930\" (UID: \"a65bb3b2-f470-47b3-9031-35a6f9dbc930\") " Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.568004 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2fa3dba-8033-4b02-b75b-6052face2364-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a2fa3dba-8033-4b02-b75b-6052face2364" (UID: "a2fa3dba-8033-4b02-b75b-6052face2364"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.568119 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ce7be68-ea3a-4d2f-b875-9db739a50a8e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0ce7be68-ea3a-4d2f-b875-9db739a50a8e" (UID: "0ce7be68-ea3a-4d2f-b875-9db739a50a8e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.568688 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5d2t\" (UniqueName: \"kubernetes.io/projected/dc533900-0d24-4473-83fe-7653f335a1a9-kube-api-access-j5d2t\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.568707 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ce7be68-ea3a-4d2f-b875-9db739a50a8e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.568720 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2fa3dba-8033-4b02-b75b-6052face2364-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.568729 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85cpl\" (UniqueName: \"kubernetes.io/projected/cea54721-c31e-48c1-93fd-ba0b9efb5189-kube-api-access-85cpl\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.568738 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cea54721-c31e-48c1-93fd-ba0b9efb5189-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.568747 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc533900-0d24-4473-83fe-7653f335a1a9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.568758 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:00 crc 
kubenswrapper[4751]: I1203 14:35:00.569074 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c838d53c-d56e-4cfc-a15a-fb91dccb5dbd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c838d53c-d56e-4cfc-a15a-fb91dccb5dbd" (UID: "c838d53c-d56e-4cfc-a15a-fb91dccb5dbd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.569352 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a65bb3b2-f470-47b3-9031-35a6f9dbc930-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a65bb3b2-f470-47b3-9031-35a6f9dbc930" (UID: "a65bb3b2-f470-47b3-9031-35a6f9dbc930"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.571256 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c838d53c-d56e-4cfc-a15a-fb91dccb5dbd-kube-api-access-spd9l" (OuterVolumeSpecName: "kube-api-access-spd9l") pod "c838d53c-d56e-4cfc-a15a-fb91dccb5dbd" (UID: "c838d53c-d56e-4cfc-a15a-fb91dccb5dbd"). InnerVolumeSpecName "kube-api-access-spd9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.574150 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0-kube-api-access-8sdpj" (OuterVolumeSpecName: "kube-api-access-8sdpj") pod "a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0" (UID: "a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0"). InnerVolumeSpecName "kube-api-access-8sdpj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.574205 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ce7be68-ea3a-4d2f-b875-9db739a50a8e-kube-api-access-64vhm" (OuterVolumeSpecName: "kube-api-access-64vhm") pod "0ce7be68-ea3a-4d2f-b875-9db739a50a8e" (UID: "0ce7be68-ea3a-4d2f-b875-9db739a50a8e"). InnerVolumeSpecName "kube-api-access-64vhm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.587509 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a65bb3b2-f470-47b3-9031-35a6f9dbc930-kube-api-access-dlzsm" (OuterVolumeSpecName: "kube-api-access-dlzsm") pod "a65bb3b2-f470-47b3-9031-35a6f9dbc930" (UID: "a65bb3b2-f470-47b3-9031-35a6f9dbc930"). InnerVolumeSpecName "kube-api-access-dlzsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.622530 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2fa3dba-8033-4b02-b75b-6052face2364-kube-api-access-swfnd" (OuterVolumeSpecName: "kube-api-access-swfnd") pod "a2fa3dba-8033-4b02-b75b-6052face2364" (UID: "a2fa3dba-8033-4b02-b75b-6052face2364"). InnerVolumeSpecName "kube-api-access-swfnd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.669513 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ea81b69-95de-4772-b7bf-d48f52c298b1-combined-ca-bundle\") pod \"5ea81b69-95de-4772-b7bf-d48f52c298b1\" (UID: \"5ea81b69-95de-4772-b7bf-d48f52c298b1\") " Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.669595 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mbrx\" (UniqueName: \"kubernetes.io/projected/5ea81b69-95de-4772-b7bf-d48f52c298b1-kube-api-access-7mbrx\") pod \"5ea81b69-95de-4772-b7bf-d48f52c298b1\" (UID: \"5ea81b69-95de-4772-b7bf-d48f52c298b1\") " Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.669681 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5ea81b69-95de-4772-b7bf-d48f52c298b1-db-sync-config-data\") pod \"5ea81b69-95de-4772-b7bf-d48f52c298b1\" (UID: \"5ea81b69-95de-4772-b7bf-d48f52c298b1\") " Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.669721 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5q5t\" (UniqueName: \"kubernetes.io/projected/efcae2e5-8f91-40b7-842a-990dd6b13c66-kube-api-access-d5q5t\") pod \"efcae2e5-8f91-40b7-842a-990dd6b13c66\" (UID: \"efcae2e5-8f91-40b7-842a-990dd6b13c66\") " Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.669776 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ea81b69-95de-4772-b7bf-d48f52c298b1-config-data\") pod \"5ea81b69-95de-4772-b7bf-d48f52c298b1\" (UID: \"5ea81b69-95de-4772-b7bf-d48f52c298b1\") " Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.669810 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efcae2e5-8f91-40b7-842a-990dd6b13c66-operator-scripts\") pod \"efcae2e5-8f91-40b7-842a-990dd6b13c66\" (UID: \"efcae2e5-8f91-40b7-842a-990dd6b13c66\") " Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.670256 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swfnd\" (UniqueName: \"kubernetes.io/projected/a2fa3dba-8033-4b02-b75b-6052face2364-kube-api-access-swfnd\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.670291 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c838d53c-d56e-4cfc-a15a-fb91dccb5dbd-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.670300 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64vhm\" (UniqueName: \"kubernetes.io/projected/0ce7be68-ea3a-4d2f-b875-9db739a50a8e-kube-api-access-64vhm\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.670309 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spd9l\" (UniqueName: \"kubernetes.io/projected/c838d53c-d56e-4cfc-a15a-fb91dccb5dbd-kube-api-access-spd9l\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.670318 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a65bb3b2-f470-47b3-9031-35a6f9dbc930-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.670340 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sdpj\" (UniqueName: \"kubernetes.io/projected/a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0-kube-api-access-8sdpj\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.670349 4751 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-dlzsm\" (UniqueName: \"kubernetes.io/projected/a65bb3b2-f470-47b3-9031-35a6f9dbc930-kube-api-access-dlzsm\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.670697 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efcae2e5-8f91-40b7-842a-990dd6b13c66-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "efcae2e5-8f91-40b7-842a-990dd6b13c66" (UID: "efcae2e5-8f91-40b7-842a-990dd6b13c66"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.675092 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efcae2e5-8f91-40b7-842a-990dd6b13c66-kube-api-access-d5q5t" (OuterVolumeSpecName: "kube-api-access-d5q5t") pod "efcae2e5-8f91-40b7-842a-990dd6b13c66" (UID: "efcae2e5-8f91-40b7-842a-990dd6b13c66"). InnerVolumeSpecName "kube-api-access-d5q5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.675090 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ea81b69-95de-4772-b7bf-d48f52c298b1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5ea81b69-95de-4772-b7bf-d48f52c298b1" (UID: "5ea81b69-95de-4772-b7bf-d48f52c298b1"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.675700 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-xk8jg"] Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.679936 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ea81b69-95de-4772-b7bf-d48f52c298b1-kube-api-access-7mbrx" (OuterVolumeSpecName: "kube-api-access-7mbrx") pod "5ea81b69-95de-4772-b7bf-d48f52c298b1" (UID: "5ea81b69-95de-4772-b7bf-d48f52c298b1"). InnerVolumeSpecName "kube-api-access-7mbrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.700794 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ea81b69-95de-4772-b7bf-d48f52c298b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ea81b69-95de-4772-b7bf-d48f52c298b1" (UID: "5ea81b69-95de-4772-b7bf-d48f52c298b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.735706 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ea81b69-95de-4772-b7bf-d48f52c298b1-config-data" (OuterVolumeSpecName: "config-data") pod "5ea81b69-95de-4772-b7bf-d48f52c298b1" (UID: "5ea81b69-95de-4772-b7bf-d48f52c298b1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.772511 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ea81b69-95de-4772-b7bf-d48f52c298b1-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.772552 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efcae2e5-8f91-40b7-842a-990dd6b13c66-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.772569 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ea81b69-95de-4772-b7bf-d48f52c298b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.772581 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mbrx\" (UniqueName: \"kubernetes.io/projected/5ea81b69-95de-4772-b7bf-d48f52c298b1-kube-api-access-7mbrx\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.772591 4751 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5ea81b69-95de-4772-b7bf-d48f52c298b1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.772605 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5q5t\" (UniqueName: \"kubernetes.io/projected/efcae2e5-8f91-40b7-842a-990dd6b13c66-kube-api-access-d5q5t\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.969223 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"b40b7285-42c6-4278-8d86-69847e549907","Type":"ContainerStarted","Data":"0a3acb9f4649e8f2aaa0eac7238876d54e00884c1f682337bf621c24299a4389"} Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.970383 4751 generic.go:334] "Generic (PLEG): container finished" podID="ffd30bdc-ab3e-43c5-872f-c7a9292e2d30" containerID="cf7cc620942893e76328e6d607d83e4c678d68176ee6e7d81851fa2788c510a5" exitCode=0 Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.970476 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-xk8jg" event={"ID":"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30","Type":"ContainerDied","Data":"cf7cc620942893e76328e6d607d83e4c678d68176ee6e7d81851fa2788c510a5"} Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.970547 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-xk8jg" event={"ID":"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30","Type":"ContainerStarted","Data":"583df8103bff55c7875e06aeb240357324b006e34931e49555eed9da3704e8e1"} Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.972857 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hmq8c" event={"ID":"5ea81b69-95de-4772-b7bf-d48f52c298b1","Type":"ContainerDied","Data":"663c56efe52dcafd7e7c5c86c7359b14d7a213396e7c36f94fe5e5256e49e805"} Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.972897 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="663c56efe52dcafd7e7c5c86c7359b14d7a213396e7c36f94fe5e5256e49e805" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.972874 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hmq8c" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.974980 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-xtvzh" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.974983 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-xtvzh" event={"ID":"c838d53c-d56e-4cfc-a15a-fb91dccb5dbd","Type":"ContainerDied","Data":"deaa70c97f204f6c2cf1d13fb83e6e7f2084f6109121058bd5afbb59025f140e"} Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.975120 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="deaa70c97f204f6c2cf1d13fb83e6e7f2084f6109121058bd5afbb59025f140e" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.977048 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f6af-account-create-update-zqlks" event={"ID":"a65bb3b2-f470-47b3-9031-35a6f9dbc930","Type":"ContainerDied","Data":"e112bc7241acb7964c4db561f966ea63fe5768bedb6d08a0019e52dcb41681d3"} Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.977096 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e112bc7241acb7964c4db561f966ea63fe5768bedb6d08a0019e52dcb41681d3" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.977121 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f6af-account-create-update-zqlks" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.978448 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bd1d-account-create-update-b5krp" event={"ID":"efcae2e5-8f91-40b7-842a-990dd6b13c66","Type":"ContainerDied","Data":"ee76597249467c972b8962d03da8deea99c5620f2bb64d9ec80a362712f800bb"} Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.978579 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-bd1d-account-create-update-b5krp" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.978662 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee76597249467c972b8962d03da8deea99c5620f2bb64d9ec80a362712f800bb" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.979818 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5j7m7" event={"ID":"a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0","Type":"ContainerDied","Data":"0694a022c0429bfbbb0f23dd9de24ae943e12273760b90b44a2319342887cc4d"} Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.979853 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0694a022c0429bfbbb0f23dd9de24ae943e12273760b90b44a2319342887cc4d" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.979830 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-5j7m7" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.981161 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-96fc-account-create-update-4nd9x" event={"ID":"cea54721-c31e-48c1-93fd-ba0b9efb5189","Type":"ContainerDied","Data":"8a2c0e3dd5ba3d266c7d3b917e2e35d1275ed7a62d913f25c59428938b721512"} Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.981193 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a2c0e3dd5ba3d266c7d3b917e2e35d1275ed7a62d913f25c59428938b721512" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.981165 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-96fc-account-create-update-4nd9x" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.982452 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ea6b-account-create-update-rslqt" event={"ID":"dc533900-0d24-4473-83fe-7653f335a1a9","Type":"ContainerDied","Data":"e8e09aeae28087bcca040b4d9a53b887f43982ea77db27ad98ff70e587d9f4a8"} Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.982487 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8e09aeae28087bcca040b4d9a53b887f43982ea77db27ad98ff70e587d9f4a8" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.982545 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ea6b-account-create-update-rslqt" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.985217 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-6n49f" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.985214 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-6n49f" event={"ID":"a2fa3dba-8033-4b02-b75b-6052face2364","Type":"ContainerDied","Data":"626687ab689ee9adb3dd48274f4bca4796520f94b82e6c6e3f58fd9f861a0f2f"} Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.985449 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="626687ab689ee9adb3dd48274f4bca4796520f94b82e6c6e3f58fd9f861a0f2f" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.986567 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-wbt2f" event={"ID":"0ce7be68-ea3a-4d2f-b875-9db739a50a8e","Type":"ContainerDied","Data":"cf34788ed39383d962796a59a8be7152c12e3ef880d73ab6cc9453a7e3a8c1f1"} Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.986594 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-wbt2f" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.986605 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf34788ed39383d962796a59a8be7152c12e3ef880d73ab6cc9453a7e3a8c1f1" Dec 03 14:35:00 crc kubenswrapper[4751]: I1203 14:35:00.988122 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cgd8h" event={"ID":"124c2c16-935c-4bfc-9b53-11ffa90ed441","Type":"ContainerStarted","Data":"38ad4756d3448098303991c1251eca0f74cefb44f1e69fbf6ae5e87305ba54d1"} Dec 03 14:35:01 crc kubenswrapper[4751]: I1203 14:35:01.028719 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=16.028696648 podStartE2EDuration="16.028696648s" podCreationTimestamp="2025-12-03 14:34:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:35:01.005923957 +0000 UTC m=+1307.994279194" watchObservedRunningTime="2025-12-03 14:35:01.028696648 +0000 UTC m=+1308.017051885" Dec 03 14:35:01 crc kubenswrapper[4751]: I1203 14:35:01.055426 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-cgd8h" podStartSLOduration=3.848574508 podStartE2EDuration="10.055388116s" podCreationTimestamp="2025-12-03 14:34:51 +0000 UTC" firstStartedPulling="2025-12-03 14:34:54.025617986 +0000 UTC m=+1301.013973203" lastFinishedPulling="2025-12-03 14:35:00.232431584 +0000 UTC m=+1307.220786811" observedRunningTime="2025-12-03 14:35:01.025138071 +0000 UTC m=+1308.013493298" watchObservedRunningTime="2025-12-03 14:35:01.055388116 +0000 UTC m=+1308.043743333" Dec 03 14:35:01 crc kubenswrapper[4751]: I1203 14:35:01.084603 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 03 14:35:01 crc 
kubenswrapper[4751]: I1203 14:35:01.084952 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 03 14:35:01 crc kubenswrapper[4751]: I1203 14:35:01.091750 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.005697 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-xk8jg" event={"ID":"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30","Type":"ContainerStarted","Data":"cb6b804864200e0143281bab2226ad8ad6022efdb0e3aa277de19369ca5f5317"} Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.006178 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c79d794d7-xk8jg" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.010725 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.020307 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-xk8jg"] Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.035846 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c79d794d7-xk8jg" podStartSLOduration=5.035823699 podStartE2EDuration="5.035823699s" podCreationTimestamp="2025-12-03 14:34:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:35:02.03439646 +0000 UTC m=+1309.022751687" watchObservedRunningTime="2025-12-03 14:35:02.035823699 +0000 UTC m=+1309.024178916" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.067640 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-p8nj5"] Dec 03 14:35:02 crc kubenswrapper[4751]: E1203 14:35:02.068604 4751 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c838d53c-d56e-4cfc-a15a-fb91dccb5dbd" containerName="mariadb-database-create" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.068823 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="c838d53c-d56e-4cfc-a15a-fb91dccb5dbd" containerName="mariadb-database-create" Dec 03 14:35:02 crc kubenswrapper[4751]: E1203 14:35:02.097572 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a65bb3b2-f470-47b3-9031-35a6f9dbc930" containerName="mariadb-account-create-update" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.097794 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a65bb3b2-f470-47b3-9031-35a6f9dbc930" containerName="mariadb-account-create-update" Dec 03 14:35:02 crc kubenswrapper[4751]: E1203 14:35:02.097866 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cea54721-c31e-48c1-93fd-ba0b9efb5189" containerName="mariadb-account-create-update" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.097923 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="cea54721-c31e-48c1-93fd-ba0b9efb5189" containerName="mariadb-account-create-update" Dec 03 14:35:02 crc kubenswrapper[4751]: E1203 14:35:02.097985 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0" containerName="mariadb-database-create" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.098035 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0" containerName="mariadb-database-create" Dec 03 14:35:02 crc kubenswrapper[4751]: E1203 14:35:02.098090 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ea81b69-95de-4772-b7bf-d48f52c298b1" containerName="glance-db-sync" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.098145 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ea81b69-95de-4772-b7bf-d48f52c298b1" containerName="glance-db-sync" Dec 03 14:35:02 
crc kubenswrapper[4751]: E1203 14:35:02.098213 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc533900-0d24-4473-83fe-7653f335a1a9" containerName="mariadb-account-create-update" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.098264 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc533900-0d24-4473-83fe-7653f335a1a9" containerName="mariadb-account-create-update" Dec 03 14:35:02 crc kubenswrapper[4751]: E1203 14:35:02.098339 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2fa3dba-8033-4b02-b75b-6052face2364" containerName="mariadb-database-create" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.098439 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2fa3dba-8033-4b02-b75b-6052face2364" containerName="mariadb-database-create" Dec 03 14:35:02 crc kubenswrapper[4751]: E1203 14:35:02.098519 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efcae2e5-8f91-40b7-842a-990dd6b13c66" containerName="mariadb-account-create-update" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.098599 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="efcae2e5-8f91-40b7-842a-990dd6b13c66" containerName="mariadb-account-create-update" Dec 03 14:35:02 crc kubenswrapper[4751]: E1203 14:35:02.098661 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ce7be68-ea3a-4d2f-b875-9db739a50a8e" containerName="mariadb-database-create" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.098712 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ce7be68-ea3a-4d2f-b875-9db739a50a8e" containerName="mariadb-database-create" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.099088 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2fa3dba-8033-4b02-b75b-6052face2364" containerName="mariadb-database-create" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.099357 4751 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="cea54721-c31e-48c1-93fd-ba0b9efb5189" containerName="mariadb-account-create-update" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.099431 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0" containerName="mariadb-database-create" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.099557 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="efcae2e5-8f91-40b7-842a-990dd6b13c66" containerName="mariadb-account-create-update" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.099618 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="a65bb3b2-f470-47b3-9031-35a6f9dbc930" containerName="mariadb-account-create-update" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.099670 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="c838d53c-d56e-4cfc-a15a-fb91dccb5dbd" containerName="mariadb-database-create" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.099722 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ea81b69-95de-4772-b7bf-d48f52c298b1" containerName="glance-db-sync" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.099778 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc533900-0d24-4473-83fe-7653f335a1a9" containerName="mariadb-account-create-update" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.099831 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ce7be68-ea3a-4d2f-b875-9db739a50a8e" containerName="mariadb-database-create" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.100892 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-p8nj5" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.105939 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-p8nj5"] Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.208858 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6637d261-2595-4256-9fc1-340de8705887-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-p8nj5\" (UID: \"6637d261-2595-4256-9fc1-340de8705887\") " pod="openstack/dnsmasq-dns-5f59b8f679-p8nj5" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.208937 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwdn6\" (UniqueName: \"kubernetes.io/projected/6637d261-2595-4256-9fc1-340de8705887-kube-api-access-mwdn6\") pod \"dnsmasq-dns-5f59b8f679-p8nj5\" (UID: \"6637d261-2595-4256-9fc1-340de8705887\") " pod="openstack/dnsmasq-dns-5f59b8f679-p8nj5" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.208972 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6637d261-2595-4256-9fc1-340de8705887-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-p8nj5\" (UID: \"6637d261-2595-4256-9fc1-340de8705887\") " pod="openstack/dnsmasq-dns-5f59b8f679-p8nj5" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.209002 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6637d261-2595-4256-9fc1-340de8705887-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-p8nj5\" (UID: \"6637d261-2595-4256-9fc1-340de8705887\") " pod="openstack/dnsmasq-dns-5f59b8f679-p8nj5" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.209018 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6637d261-2595-4256-9fc1-340de8705887-config\") pod \"dnsmasq-dns-5f59b8f679-p8nj5\" (UID: \"6637d261-2595-4256-9fc1-340de8705887\") " pod="openstack/dnsmasq-dns-5f59b8f679-p8nj5" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.209068 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6637d261-2595-4256-9fc1-340de8705887-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-p8nj5\" (UID: \"6637d261-2595-4256-9fc1-340de8705887\") " pod="openstack/dnsmasq-dns-5f59b8f679-p8nj5" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.310592 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6637d261-2595-4256-9fc1-340de8705887-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-p8nj5\" (UID: \"6637d261-2595-4256-9fc1-340de8705887\") " pod="openstack/dnsmasq-dns-5f59b8f679-p8nj5" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.310677 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6637d261-2595-4256-9fc1-340de8705887-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-p8nj5\" (UID: \"6637d261-2595-4256-9fc1-340de8705887\") " pod="openstack/dnsmasq-dns-5f59b8f679-p8nj5" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.310729 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwdn6\" (UniqueName: \"kubernetes.io/projected/6637d261-2595-4256-9fc1-340de8705887-kube-api-access-mwdn6\") pod \"dnsmasq-dns-5f59b8f679-p8nj5\" (UID: \"6637d261-2595-4256-9fc1-340de8705887\") " pod="openstack/dnsmasq-dns-5f59b8f679-p8nj5" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.310758 4751 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6637d261-2595-4256-9fc1-340de8705887-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-p8nj5\" (UID: \"6637d261-2595-4256-9fc1-340de8705887\") " pod="openstack/dnsmasq-dns-5f59b8f679-p8nj5" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.310787 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6637d261-2595-4256-9fc1-340de8705887-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-p8nj5\" (UID: \"6637d261-2595-4256-9fc1-340de8705887\") " pod="openstack/dnsmasq-dns-5f59b8f679-p8nj5" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.310804 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6637d261-2595-4256-9fc1-340de8705887-config\") pod \"dnsmasq-dns-5f59b8f679-p8nj5\" (UID: \"6637d261-2595-4256-9fc1-340de8705887\") " pod="openstack/dnsmasq-dns-5f59b8f679-p8nj5" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.311641 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6637d261-2595-4256-9fc1-340de8705887-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-p8nj5\" (UID: \"6637d261-2595-4256-9fc1-340de8705887\") " pod="openstack/dnsmasq-dns-5f59b8f679-p8nj5" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.311676 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6637d261-2595-4256-9fc1-340de8705887-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-p8nj5\" (UID: \"6637d261-2595-4256-9fc1-340de8705887\") " pod="openstack/dnsmasq-dns-5f59b8f679-p8nj5" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.311858 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/6637d261-2595-4256-9fc1-340de8705887-config\") pod \"dnsmasq-dns-5f59b8f679-p8nj5\" (UID: \"6637d261-2595-4256-9fc1-340de8705887\") " pod="openstack/dnsmasq-dns-5f59b8f679-p8nj5" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.312215 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6637d261-2595-4256-9fc1-340de8705887-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-p8nj5\" (UID: \"6637d261-2595-4256-9fc1-340de8705887\") " pod="openstack/dnsmasq-dns-5f59b8f679-p8nj5" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.312615 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6637d261-2595-4256-9fc1-340de8705887-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-p8nj5\" (UID: \"6637d261-2595-4256-9fc1-340de8705887\") " pod="openstack/dnsmasq-dns-5f59b8f679-p8nj5" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.348162 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwdn6\" (UniqueName: \"kubernetes.io/projected/6637d261-2595-4256-9fc1-340de8705887-kube-api-access-mwdn6\") pod \"dnsmasq-dns-5f59b8f679-p8nj5\" (UID: \"6637d261-2595-4256-9fc1-340de8705887\") " pod="openstack/dnsmasq-dns-5f59b8f679-p8nj5" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.428197 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-p8nj5" Dec 03 14:35:02 crc kubenswrapper[4751]: I1203 14:35:02.983947 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-p8nj5"] Dec 03 14:35:03 crc kubenswrapper[4751]: I1203 14:35:03.015642 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-p8nj5" event={"ID":"6637d261-2595-4256-9fc1-340de8705887","Type":"ContainerStarted","Data":"ac67934c6fb87fef5a3c2b0e867b6acece120811e70f0ad3e1762d24958b831a"} Dec 03 14:35:04 crc kubenswrapper[4751]: I1203 14:35:04.025419 4751 generic.go:334] "Generic (PLEG): container finished" podID="6637d261-2595-4256-9fc1-340de8705887" containerID="08a415a2df2c2b59ecbcb2796c8d4f9ba3e36ed22729cf98004c8b381512e6e4" exitCode=0 Dec 03 14:35:04 crc kubenswrapper[4751]: I1203 14:35:04.025864 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c79d794d7-xk8jg" podUID="ffd30bdc-ab3e-43c5-872f-c7a9292e2d30" containerName="dnsmasq-dns" containerID="cri-o://cb6b804864200e0143281bab2226ad8ad6022efdb0e3aa277de19369ca5f5317" gracePeriod=10 Dec 03 14:35:04 crc kubenswrapper[4751]: I1203 14:35:04.025573 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-p8nj5" event={"ID":"6637d261-2595-4256-9fc1-340de8705887","Type":"ContainerDied","Data":"08a415a2df2c2b59ecbcb2796c8d4f9ba3e36ed22729cf98004c8b381512e6e4"} Dec 03 14:35:04 crc kubenswrapper[4751]: I1203 14:35:04.996088 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-xk8jg" Dec 03 14:35:05 crc kubenswrapper[4751]: I1203 14:35:05.042401 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-p8nj5" event={"ID":"6637d261-2595-4256-9fc1-340de8705887","Type":"ContainerStarted","Data":"2fb3e28d9576d78bbdd193bb85fd367facc4b9229aa28bd2d6eae9cbf7954e40"} Dec 03 14:35:05 crc kubenswrapper[4751]: I1203 14:35:05.043514 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-p8nj5" Dec 03 14:35:05 crc kubenswrapper[4751]: I1203 14:35:05.049335 4751 generic.go:334] "Generic (PLEG): container finished" podID="ffd30bdc-ab3e-43c5-872f-c7a9292e2d30" containerID="cb6b804864200e0143281bab2226ad8ad6022efdb0e3aa277de19369ca5f5317" exitCode=0 Dec 03 14:35:05 crc kubenswrapper[4751]: I1203 14:35:05.049369 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-xk8jg" event={"ID":"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30","Type":"ContainerDied","Data":"cb6b804864200e0143281bab2226ad8ad6022efdb0e3aa277de19369ca5f5317"} Dec 03 14:35:05 crc kubenswrapper[4751]: I1203 14:35:05.049393 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-xk8jg" event={"ID":"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30","Type":"ContainerDied","Data":"583df8103bff55c7875e06aeb240357324b006e34931e49555eed9da3704e8e1"} Dec 03 14:35:05 crc kubenswrapper[4751]: I1203 14:35:05.049413 4751 scope.go:117] "RemoveContainer" containerID="cb6b804864200e0143281bab2226ad8ad6022efdb0e3aa277de19369ca5f5317" Dec 03 14:35:05 crc kubenswrapper[4751]: I1203 14:35:05.049408 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-xk8jg" Dec 03 14:35:05 crc kubenswrapper[4751]: I1203 14:35:05.071062 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f59b8f679-p8nj5" podStartSLOduration=3.07103803 podStartE2EDuration="3.07103803s" podCreationTimestamp="2025-12-03 14:35:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:35:05.066089495 +0000 UTC m=+1312.054444702" watchObservedRunningTime="2025-12-03 14:35:05.07103803 +0000 UTC m=+1312.059393247" Dec 03 14:35:05 crc kubenswrapper[4751]: I1203 14:35:05.111901 4751 scope.go:117] "RemoveContainer" containerID="cf7cc620942893e76328e6d607d83e4c678d68176ee6e7d81851fa2788c510a5" Dec 03 14:35:05 crc kubenswrapper[4751]: I1203 14:35:05.137560 4751 scope.go:117] "RemoveContainer" containerID="cb6b804864200e0143281bab2226ad8ad6022efdb0e3aa277de19369ca5f5317" Dec 03 14:35:05 crc kubenswrapper[4751]: E1203 14:35:05.137947 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb6b804864200e0143281bab2226ad8ad6022efdb0e3aa277de19369ca5f5317\": container with ID starting with cb6b804864200e0143281bab2226ad8ad6022efdb0e3aa277de19369ca5f5317 not found: ID does not exist" containerID="cb6b804864200e0143281bab2226ad8ad6022efdb0e3aa277de19369ca5f5317" Dec 03 14:35:05 crc kubenswrapper[4751]: I1203 14:35:05.138018 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb6b804864200e0143281bab2226ad8ad6022efdb0e3aa277de19369ca5f5317"} err="failed to get container status \"cb6b804864200e0143281bab2226ad8ad6022efdb0e3aa277de19369ca5f5317\": rpc error: code = NotFound desc = could not find container \"cb6b804864200e0143281bab2226ad8ad6022efdb0e3aa277de19369ca5f5317\": container with ID starting with 
cb6b804864200e0143281bab2226ad8ad6022efdb0e3aa277de19369ca5f5317 not found: ID does not exist" Dec 03 14:35:05 crc kubenswrapper[4751]: I1203 14:35:05.138051 4751 scope.go:117] "RemoveContainer" containerID="cf7cc620942893e76328e6d607d83e4c678d68176ee6e7d81851fa2788c510a5" Dec 03 14:35:05 crc kubenswrapper[4751]: E1203 14:35:05.138463 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf7cc620942893e76328e6d607d83e4c678d68176ee6e7d81851fa2788c510a5\": container with ID starting with cf7cc620942893e76328e6d607d83e4c678d68176ee6e7d81851fa2788c510a5 not found: ID does not exist" containerID="cf7cc620942893e76328e6d607d83e4c678d68176ee6e7d81851fa2788c510a5" Dec 03 14:35:05 crc kubenswrapper[4751]: I1203 14:35:05.138506 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf7cc620942893e76328e6d607d83e4c678d68176ee6e7d81851fa2788c510a5"} err="failed to get container status \"cf7cc620942893e76328e6d607d83e4c678d68176ee6e7d81851fa2788c510a5\": rpc error: code = NotFound desc = could not find container \"cf7cc620942893e76328e6d607d83e4c678d68176ee6e7d81851fa2788c510a5\": container with ID starting with cf7cc620942893e76328e6d607d83e4c678d68176ee6e7d81851fa2788c510a5 not found: ID does not exist" Dec 03 14:35:05 crc kubenswrapper[4751]: I1203 14:35:05.178828 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j869f\" (UniqueName: \"kubernetes.io/projected/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-kube-api-access-j869f\") pod \"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30\" (UID: \"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30\") " Dec 03 14:35:05 crc kubenswrapper[4751]: I1203 14:35:05.178940 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-ovsdbserver-sb\") pod 
\"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30\" (UID: \"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30\") " Dec 03 14:35:05 crc kubenswrapper[4751]: I1203 14:35:05.179042 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-ovsdbserver-nb\") pod \"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30\" (UID: \"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30\") " Dec 03 14:35:05 crc kubenswrapper[4751]: I1203 14:35:05.179156 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-config\") pod \"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30\" (UID: \"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30\") " Dec 03 14:35:05 crc kubenswrapper[4751]: I1203 14:35:05.179209 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-dns-swift-storage-0\") pod \"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30\" (UID: \"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30\") " Dec 03 14:35:05 crc kubenswrapper[4751]: I1203 14:35:05.179302 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-dns-svc\") pod \"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30\" (UID: \"ffd30bdc-ab3e-43c5-872f-c7a9292e2d30\") " Dec 03 14:35:05 crc kubenswrapper[4751]: I1203 14:35:05.186455 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-kube-api-access-j869f" (OuterVolumeSpecName: "kube-api-access-j869f") pod "ffd30bdc-ab3e-43c5-872f-c7a9292e2d30" (UID: "ffd30bdc-ab3e-43c5-872f-c7a9292e2d30"). InnerVolumeSpecName "kube-api-access-j869f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:35:05 crc kubenswrapper[4751]: I1203 14:35:05.232086 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-config" (OuterVolumeSpecName: "config") pod "ffd30bdc-ab3e-43c5-872f-c7a9292e2d30" (UID: "ffd30bdc-ab3e-43c5-872f-c7a9292e2d30"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:35:05 crc kubenswrapper[4751]: I1203 14:35:05.237188 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ffd30bdc-ab3e-43c5-872f-c7a9292e2d30" (UID: "ffd30bdc-ab3e-43c5-872f-c7a9292e2d30"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:35:05 crc kubenswrapper[4751]: I1203 14:35:05.242831 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ffd30bdc-ab3e-43c5-872f-c7a9292e2d30" (UID: "ffd30bdc-ab3e-43c5-872f-c7a9292e2d30"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:35:05 crc kubenswrapper[4751]: I1203 14:35:05.243382 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ffd30bdc-ab3e-43c5-872f-c7a9292e2d30" (UID: "ffd30bdc-ab3e-43c5-872f-c7a9292e2d30"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:35:05 crc kubenswrapper[4751]: I1203 14:35:05.250261 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ffd30bdc-ab3e-43c5-872f-c7a9292e2d30" (UID: "ffd30bdc-ab3e-43c5-872f-c7a9292e2d30"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:35:05 crc kubenswrapper[4751]: I1203 14:35:05.282830 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j869f\" (UniqueName: \"kubernetes.io/projected/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-kube-api-access-j869f\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:05 crc kubenswrapper[4751]: I1203 14:35:05.282867 4751 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:05 crc kubenswrapper[4751]: I1203 14:35:05.282876 4751 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:05 crc kubenswrapper[4751]: I1203 14:35:05.282886 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:05 crc kubenswrapper[4751]: I1203 14:35:05.282895 4751 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:05 crc kubenswrapper[4751]: I1203 14:35:05.282903 4751 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:05 crc kubenswrapper[4751]: I1203 14:35:05.375224 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-xk8jg"] Dec 03 14:35:05 crc kubenswrapper[4751]: I1203 14:35:05.384064 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-xk8jg"] Dec 03 14:35:06 crc kubenswrapper[4751]: I1203 14:35:06.068138 4751 generic.go:334] "Generic (PLEG): container finished" podID="124c2c16-935c-4bfc-9b53-11ffa90ed441" containerID="38ad4756d3448098303991c1251eca0f74cefb44f1e69fbf6ae5e87305ba54d1" exitCode=0 Dec 03 14:35:06 crc kubenswrapper[4751]: I1203 14:35:06.068233 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cgd8h" event={"ID":"124c2c16-935c-4bfc-9b53-11ffa90ed441","Type":"ContainerDied","Data":"38ad4756d3448098303991c1251eca0f74cefb44f1e69fbf6ae5e87305ba54d1"} Dec 03 14:35:07 crc kubenswrapper[4751]: I1203 14:35:07.330488 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffd30bdc-ab3e-43c5-872f-c7a9292e2d30" path="/var/lib/kubelet/pods/ffd30bdc-ab3e-43c5-872f-c7a9292e2d30/volumes" Dec 03 14:35:07 crc kubenswrapper[4751]: I1203 14:35:07.512843 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-cgd8h" Dec 03 14:35:07 crc kubenswrapper[4751]: I1203 14:35:07.518487 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/124c2c16-935c-4bfc-9b53-11ffa90ed441-combined-ca-bundle\") pod \"124c2c16-935c-4bfc-9b53-11ffa90ed441\" (UID: \"124c2c16-935c-4bfc-9b53-11ffa90ed441\") " Dec 03 14:35:07 crc kubenswrapper[4751]: I1203 14:35:07.518527 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86kms\" (UniqueName: \"kubernetes.io/projected/124c2c16-935c-4bfc-9b53-11ffa90ed441-kube-api-access-86kms\") pod \"124c2c16-935c-4bfc-9b53-11ffa90ed441\" (UID: \"124c2c16-935c-4bfc-9b53-11ffa90ed441\") " Dec 03 14:35:07 crc kubenswrapper[4751]: I1203 14:35:07.518572 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/124c2c16-935c-4bfc-9b53-11ffa90ed441-config-data\") pod \"124c2c16-935c-4bfc-9b53-11ffa90ed441\" (UID: \"124c2c16-935c-4bfc-9b53-11ffa90ed441\") " Dec 03 14:35:07 crc kubenswrapper[4751]: I1203 14:35:07.524482 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/124c2c16-935c-4bfc-9b53-11ffa90ed441-kube-api-access-86kms" (OuterVolumeSpecName: "kube-api-access-86kms") pod "124c2c16-935c-4bfc-9b53-11ffa90ed441" (UID: "124c2c16-935c-4bfc-9b53-11ffa90ed441"). InnerVolumeSpecName "kube-api-access-86kms". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:35:07 crc kubenswrapper[4751]: I1203 14:35:07.563472 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/124c2c16-935c-4bfc-9b53-11ffa90ed441-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "124c2c16-935c-4bfc-9b53-11ffa90ed441" (UID: "124c2c16-935c-4bfc-9b53-11ffa90ed441"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:35:07 crc kubenswrapper[4751]: I1203 14:35:07.573143 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/124c2c16-935c-4bfc-9b53-11ffa90ed441-config-data" (OuterVolumeSpecName: "config-data") pod "124c2c16-935c-4bfc-9b53-11ffa90ed441" (UID: "124c2c16-935c-4bfc-9b53-11ffa90ed441"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:35:07 crc kubenswrapper[4751]: I1203 14:35:07.620201 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/124c2c16-935c-4bfc-9b53-11ffa90ed441-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:07 crc kubenswrapper[4751]: I1203 14:35:07.620246 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86kms\" (UniqueName: \"kubernetes.io/projected/124c2c16-935c-4bfc-9b53-11ffa90ed441-kube-api-access-86kms\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:07 crc kubenswrapper[4751]: I1203 14:35:07.620261 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/124c2c16-935c-4bfc-9b53-11ffa90ed441-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.092647 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cgd8h" event={"ID":"124c2c16-935c-4bfc-9b53-11ffa90ed441","Type":"ContainerDied","Data":"bbfb1d71b56439532a615fdf59e0894753a41b88606721b65d839a12d6782de1"} Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.092957 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbfb1d71b56439532a615fdf59e0894753a41b88606721b65d839a12d6782de1" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.092894 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-cgd8h" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.365510 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-p8nj5"] Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.365814 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f59b8f679-p8nj5" podUID="6637d261-2595-4256-9fc1-340de8705887" containerName="dnsmasq-dns" containerID="cri-o://2fb3e28d9576d78bbdd193bb85fd367facc4b9229aa28bd2d6eae9cbf7954e40" gracePeriod=10 Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.392402 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-86xd9"] Dec 03 14:35:08 crc kubenswrapper[4751]: E1203 14:35:08.392864 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffd30bdc-ab3e-43c5-872f-c7a9292e2d30" containerName="dnsmasq-dns" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.392882 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffd30bdc-ab3e-43c5-872f-c7a9292e2d30" containerName="dnsmasq-dns" Dec 03 14:35:08 crc kubenswrapper[4751]: E1203 14:35:08.392903 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="124c2c16-935c-4bfc-9b53-11ffa90ed441" containerName="keystone-db-sync" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.392909 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="124c2c16-935c-4bfc-9b53-11ffa90ed441" containerName="keystone-db-sync" Dec 03 14:35:08 crc kubenswrapper[4751]: E1203 14:35:08.392926 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffd30bdc-ab3e-43c5-872f-c7a9292e2d30" containerName="init" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.392932 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffd30bdc-ab3e-43c5-872f-c7a9292e2d30" containerName="init" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.393099 4751 
memory_manager.go:354] "RemoveStaleState removing state" podUID="124c2c16-935c-4bfc-9b53-11ffa90ed441" containerName="keystone-db-sync" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.393126 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffd30bdc-ab3e-43c5-872f-c7a9292e2d30" containerName="dnsmasq-dns" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.394232 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-86xd9" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.421447 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-r6rzp"] Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.422997 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-r6rzp" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.438218 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.438531 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.438650 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2bmt6" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.440781 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.440964 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.473395 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-86xd9"] Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.505131 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-r6rzp"] Dec 03 
14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.551186 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0bcaaedc-9d9b-403a-946f-ef7ab8176360-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-86xd9\" (UID: \"0bcaaedc-9d9b-403a-946f-ef7ab8176360\") " pod="openstack/dnsmasq-dns-bbf5cc879-86xd9" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.551242 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0bcaaedc-9d9b-403a-946f-ef7ab8176360-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-86xd9\" (UID: \"0bcaaedc-9d9b-403a-946f-ef7ab8176360\") " pod="openstack/dnsmasq-dns-bbf5cc879-86xd9" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.551266 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-config-data\") pod \"keystone-bootstrap-r6rzp\" (UID: \"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3\") " pod="openstack/keystone-bootstrap-r6rzp" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.551308 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-combined-ca-bundle\") pod \"keystone-bootstrap-r6rzp\" (UID: \"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3\") " pod="openstack/keystone-bootstrap-r6rzp" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.551353 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bxp4\" (UniqueName: \"kubernetes.io/projected/0bcaaedc-9d9b-403a-946f-ef7ab8176360-kube-api-access-8bxp4\") pod \"dnsmasq-dns-bbf5cc879-86xd9\" (UID: \"0bcaaedc-9d9b-403a-946f-ef7ab8176360\") " 
pod="openstack/dnsmasq-dns-bbf5cc879-86xd9" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.551378 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0bcaaedc-9d9b-403a-946f-ef7ab8176360-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-86xd9\" (UID: \"0bcaaedc-9d9b-403a-946f-ef7ab8176360\") " pod="openstack/dnsmasq-dns-bbf5cc879-86xd9" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.551408 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-credential-keys\") pod \"keystone-bootstrap-r6rzp\" (UID: \"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3\") " pod="openstack/keystone-bootstrap-r6rzp" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.551435 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bcaaedc-9d9b-403a-946f-ef7ab8176360-config\") pod \"dnsmasq-dns-bbf5cc879-86xd9\" (UID: \"0bcaaedc-9d9b-403a-946f-ef7ab8176360\") " pod="openstack/dnsmasq-dns-bbf5cc879-86xd9" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.551507 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-scripts\") pod \"keystone-bootstrap-r6rzp\" (UID: \"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3\") " pod="openstack/keystone-bootstrap-r6rzp" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.551528 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0bcaaedc-9d9b-403a-946f-ef7ab8176360-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-86xd9\" (UID: \"0bcaaedc-9d9b-403a-946f-ef7ab8176360\") " 
pod="openstack/dnsmasq-dns-bbf5cc879-86xd9" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.551563 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-fernet-keys\") pod \"keystone-bootstrap-r6rzp\" (UID: \"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3\") " pod="openstack/keystone-bootstrap-r6rzp" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.551597 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhl8l\" (UniqueName: \"kubernetes.io/projected/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-kube-api-access-qhl8l\") pod \"keystone-bootstrap-r6rzp\" (UID: \"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3\") " pod="openstack/keystone-bootstrap-r6rzp" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.653420 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bcaaedc-9d9b-403a-946f-ef7ab8176360-config\") pod \"dnsmasq-dns-bbf5cc879-86xd9\" (UID: \"0bcaaedc-9d9b-403a-946f-ef7ab8176360\") " pod="openstack/dnsmasq-dns-bbf5cc879-86xd9" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.653955 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-scripts\") pod \"keystone-bootstrap-r6rzp\" (UID: \"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3\") " pod="openstack/keystone-bootstrap-r6rzp" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.654023 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0bcaaedc-9d9b-403a-946f-ef7ab8176360-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-86xd9\" (UID: \"0bcaaedc-9d9b-403a-946f-ef7ab8176360\") " pod="openstack/dnsmasq-dns-bbf5cc879-86xd9" Dec 03 14:35:08 crc 
kubenswrapper[4751]: I1203 14:35:08.654111 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-fernet-keys\") pod \"keystone-bootstrap-r6rzp\" (UID: \"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3\") " pod="openstack/keystone-bootstrap-r6rzp" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.654194 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhl8l\" (UniqueName: \"kubernetes.io/projected/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-kube-api-access-qhl8l\") pod \"keystone-bootstrap-r6rzp\" (UID: \"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3\") " pod="openstack/keystone-bootstrap-r6rzp" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.654307 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0bcaaedc-9d9b-403a-946f-ef7ab8176360-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-86xd9\" (UID: \"0bcaaedc-9d9b-403a-946f-ef7ab8176360\") " pod="openstack/dnsmasq-dns-bbf5cc879-86xd9" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.654388 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0bcaaedc-9d9b-403a-946f-ef7ab8176360-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-86xd9\" (UID: \"0bcaaedc-9d9b-403a-946f-ef7ab8176360\") " pod="openstack/dnsmasq-dns-bbf5cc879-86xd9" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.654448 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-config-data\") pod \"keystone-bootstrap-r6rzp\" (UID: \"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3\") " pod="openstack/keystone-bootstrap-r6rzp" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.654551 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-combined-ca-bundle\") pod \"keystone-bootstrap-r6rzp\" (UID: \"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3\") " pod="openstack/keystone-bootstrap-r6rzp" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.654628 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bxp4\" (UniqueName: \"kubernetes.io/projected/0bcaaedc-9d9b-403a-946f-ef7ab8176360-kube-api-access-8bxp4\") pod \"dnsmasq-dns-bbf5cc879-86xd9\" (UID: \"0bcaaedc-9d9b-403a-946f-ef7ab8176360\") " pod="openstack/dnsmasq-dns-bbf5cc879-86xd9" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.654693 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0bcaaedc-9d9b-403a-946f-ef7ab8176360-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-86xd9\" (UID: \"0bcaaedc-9d9b-403a-946f-ef7ab8176360\") " pod="openstack/dnsmasq-dns-bbf5cc879-86xd9" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.654789 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-credential-keys\") pod \"keystone-bootstrap-r6rzp\" (UID: \"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3\") " pod="openstack/keystone-bootstrap-r6rzp" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.655383 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0bcaaedc-9d9b-403a-946f-ef7ab8176360-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-86xd9\" (UID: \"0bcaaedc-9d9b-403a-946f-ef7ab8176360\") " pod="openstack/dnsmasq-dns-bbf5cc879-86xd9" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.656681 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/0bcaaedc-9d9b-403a-946f-ef7ab8176360-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-86xd9\" (UID: \"0bcaaedc-9d9b-403a-946f-ef7ab8176360\") " pod="openstack/dnsmasq-dns-bbf5cc879-86xd9" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.654471 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bcaaedc-9d9b-403a-946f-ef7ab8176360-config\") pod \"dnsmasq-dns-bbf5cc879-86xd9\" (UID: \"0bcaaedc-9d9b-403a-946f-ef7ab8176360\") " pod="openstack/dnsmasq-dns-bbf5cc879-86xd9" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.660213 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-combined-ca-bundle\") pod \"keystone-bootstrap-r6rzp\" (UID: \"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3\") " pod="openstack/keystone-bootstrap-r6rzp" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.660822 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-credential-keys\") pod \"keystone-bootstrap-r6rzp\" (UID: \"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3\") " pod="openstack/keystone-bootstrap-r6rzp" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.665075 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0bcaaedc-9d9b-403a-946f-ef7ab8176360-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-86xd9\" (UID: \"0bcaaedc-9d9b-403a-946f-ef7ab8176360\") " pod="openstack/dnsmasq-dns-bbf5cc879-86xd9" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.665507 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0bcaaedc-9d9b-403a-946f-ef7ab8176360-ovsdbserver-nb\") pod 
\"dnsmasq-dns-bbf5cc879-86xd9\" (UID: \"0bcaaedc-9d9b-403a-946f-ef7ab8176360\") " pod="openstack/dnsmasq-dns-bbf5cc879-86xd9" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.666594 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-config-data\") pod \"keystone-bootstrap-r6rzp\" (UID: \"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3\") " pod="openstack/keystone-bootstrap-r6rzp" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.669386 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-fernet-keys\") pod \"keystone-bootstrap-r6rzp\" (UID: \"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3\") " pod="openstack/keystone-bootstrap-r6rzp" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.673436 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-scripts\") pod \"keystone-bootstrap-r6rzp\" (UID: \"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3\") " pod="openstack/keystone-bootstrap-r6rzp" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.688764 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-5qnpr"] Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.690606 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-5qnpr" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.691806 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bxp4\" (UniqueName: \"kubernetes.io/projected/0bcaaedc-9d9b-403a-946f-ef7ab8176360-kube-api-access-8bxp4\") pod \"dnsmasq-dns-bbf5cc879-86xd9\" (UID: \"0bcaaedc-9d9b-403a-946f-ef7ab8176360\") " pod="openstack/dnsmasq-dns-bbf5cc879-86xd9" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.708948 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.709202 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-lwkf5" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.709343 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.712681 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-86xd9" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.722007 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-g4glx"] Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.722178 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhl8l\" (UniqueName: \"kubernetes.io/projected/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-kube-api-access-qhl8l\") pod \"keystone-bootstrap-r6rzp\" (UID: \"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3\") " pod="openstack/keystone-bootstrap-r6rzp" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.723233 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-g4glx" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.730933 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-m4hm9" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.731208 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.731372 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.739283 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-5qnpr"] Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.755844 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-r6rzp" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.758142 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4784bf8d-4315-4097-b729-1f21940a17bc-config-data\") pod \"cinder-db-sync-5qnpr\" (UID: \"4784bf8d-4315-4097-b729-1f21940a17bc\") " pod="openstack/cinder-db-sync-5qnpr" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.758185 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjdz6\" (UniqueName: \"kubernetes.io/projected/4784bf8d-4315-4097-b729-1f21940a17bc-kube-api-access-cjdz6\") pod \"cinder-db-sync-5qnpr\" (UID: \"4784bf8d-4315-4097-b729-1f21940a17bc\") " pod="openstack/cinder-db-sync-5qnpr" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.758226 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm2hs\" (UniqueName: \"kubernetes.io/projected/44d1508a-da0e-46be-9f1d-583e1be8d864-kube-api-access-pm2hs\") pod 
\"neutron-db-sync-g4glx\" (UID: \"44d1508a-da0e-46be-9f1d-583e1be8d864\") " pod="openstack/neutron-db-sync-g4glx" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.758275 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4784bf8d-4315-4097-b729-1f21940a17bc-db-sync-config-data\") pod \"cinder-db-sync-5qnpr\" (UID: \"4784bf8d-4315-4097-b729-1f21940a17bc\") " pod="openstack/cinder-db-sync-5qnpr" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.758301 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d1508a-da0e-46be-9f1d-583e1be8d864-combined-ca-bundle\") pod \"neutron-db-sync-g4glx\" (UID: \"44d1508a-da0e-46be-9f1d-583e1be8d864\") " pod="openstack/neutron-db-sync-g4glx" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.758373 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/44d1508a-da0e-46be-9f1d-583e1be8d864-config\") pod \"neutron-db-sync-g4glx\" (UID: \"44d1508a-da0e-46be-9f1d-583e1be8d864\") " pod="openstack/neutron-db-sync-g4glx" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.758406 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4784bf8d-4315-4097-b729-1f21940a17bc-scripts\") pod \"cinder-db-sync-5qnpr\" (UID: \"4784bf8d-4315-4097-b729-1f21940a17bc\") " pod="openstack/cinder-db-sync-5qnpr" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.758494 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4784bf8d-4315-4097-b729-1f21940a17bc-combined-ca-bundle\") pod \"cinder-db-sync-5qnpr\" (UID: 
\"4784bf8d-4315-4097-b729-1f21940a17bc\") " pod="openstack/cinder-db-sync-5qnpr" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.758521 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4784bf8d-4315-4097-b729-1f21940a17bc-etc-machine-id\") pod \"cinder-db-sync-5qnpr\" (UID: \"4784bf8d-4315-4097-b729-1f21940a17bc\") " pod="openstack/cinder-db-sync-5qnpr" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.767362 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-g4glx"] Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.851402 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-wbwn4"] Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.853003 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wbwn4" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.859848 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe09550-cc72-4fe9-af45-b39fbcac540d-combined-ca-bundle\") pod \"barbican-db-sync-wbwn4\" (UID: \"cbe09550-cc72-4fe9-af45-b39fbcac540d\") " pod="openstack/barbican-db-sync-wbwn4" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.859921 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4784bf8d-4315-4097-b729-1f21940a17bc-combined-ca-bundle\") pod \"cinder-db-sync-5qnpr\" (UID: \"4784bf8d-4315-4097-b729-1f21940a17bc\") " pod="openstack/cinder-db-sync-5qnpr" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.859944 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/cbe09550-cc72-4fe9-af45-b39fbcac540d-db-sync-config-data\") pod \"barbican-db-sync-wbwn4\" (UID: \"cbe09550-cc72-4fe9-af45-b39fbcac540d\") " pod="openstack/barbican-db-sync-wbwn4" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.859967 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4784bf8d-4315-4097-b729-1f21940a17bc-etc-machine-id\") pod \"cinder-db-sync-5qnpr\" (UID: \"4784bf8d-4315-4097-b729-1f21940a17bc\") " pod="openstack/cinder-db-sync-5qnpr" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.860040 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w52mq\" (UniqueName: \"kubernetes.io/projected/cbe09550-cc72-4fe9-af45-b39fbcac540d-kube-api-access-w52mq\") pod \"barbican-db-sync-wbwn4\" (UID: \"cbe09550-cc72-4fe9-af45-b39fbcac540d\") " pod="openstack/barbican-db-sync-wbwn4" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.860062 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4784bf8d-4315-4097-b729-1f21940a17bc-config-data\") pod \"cinder-db-sync-5qnpr\" (UID: \"4784bf8d-4315-4097-b729-1f21940a17bc\") " pod="openstack/cinder-db-sync-5qnpr" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.860084 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjdz6\" (UniqueName: \"kubernetes.io/projected/4784bf8d-4315-4097-b729-1f21940a17bc-kube-api-access-cjdz6\") pod \"cinder-db-sync-5qnpr\" (UID: \"4784bf8d-4315-4097-b729-1f21940a17bc\") " pod="openstack/cinder-db-sync-5qnpr" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.860122 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm2hs\" (UniqueName: 
\"kubernetes.io/projected/44d1508a-da0e-46be-9f1d-583e1be8d864-kube-api-access-pm2hs\") pod \"neutron-db-sync-g4glx\" (UID: \"44d1508a-da0e-46be-9f1d-583e1be8d864\") " pod="openstack/neutron-db-sync-g4glx" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.860161 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4784bf8d-4315-4097-b729-1f21940a17bc-db-sync-config-data\") pod \"cinder-db-sync-5qnpr\" (UID: \"4784bf8d-4315-4097-b729-1f21940a17bc\") " pod="openstack/cinder-db-sync-5qnpr" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.860192 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d1508a-da0e-46be-9f1d-583e1be8d864-combined-ca-bundle\") pod \"neutron-db-sync-g4glx\" (UID: \"44d1508a-da0e-46be-9f1d-583e1be8d864\") " pod="openstack/neutron-db-sync-g4glx" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.860214 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/44d1508a-da0e-46be-9f1d-583e1be8d864-config\") pod \"neutron-db-sync-g4glx\" (UID: \"44d1508a-da0e-46be-9f1d-583e1be8d864\") " pod="openstack/neutron-db-sync-g4glx" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.860240 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4784bf8d-4315-4097-b729-1f21940a17bc-scripts\") pod \"cinder-db-sync-5qnpr\" (UID: \"4784bf8d-4315-4097-b729-1f21940a17bc\") " pod="openstack/cinder-db-sync-5qnpr" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.864039 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.864774 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/4784bf8d-4315-4097-b729-1f21940a17bc-etc-machine-id\") pod \"cinder-db-sync-5qnpr\" (UID: \"4784bf8d-4315-4097-b729-1f21940a17bc\") " pod="openstack/cinder-db-sync-5qnpr" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.870879 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-c66qs" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.893803 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4784bf8d-4315-4097-b729-1f21940a17bc-db-sync-config-data\") pod \"cinder-db-sync-5qnpr\" (UID: \"4784bf8d-4315-4097-b729-1f21940a17bc\") " pod="openstack/cinder-db-sync-5qnpr" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.897376 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-bgs8x"] Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.898704 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-bgs8x" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.912426 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.924874 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.925209 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.933498 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-86xd9"] Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.933839 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm2hs\" (UniqueName: \"kubernetes.io/projected/44d1508a-da0e-46be-9f1d-583e1be8d864-kube-api-access-pm2hs\") pod \"neutron-db-sync-g4glx\" (UID: \"44d1508a-da0e-46be-9f1d-583e1be8d864\") " pod="openstack/neutron-db-sync-g4glx" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.934929 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjdz6\" (UniqueName: \"kubernetes.io/projected/4784bf8d-4315-4097-b729-1f21940a17bc-kube-api-access-cjdz6\") pod \"cinder-db-sync-5qnpr\" (UID: \"4784bf8d-4315-4097-b729-1f21940a17bc\") " pod="openstack/cinder-db-sync-5qnpr" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.935654 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d1508a-da0e-46be-9f1d-583e1be8d864-combined-ca-bundle\") pod \"neutron-db-sync-g4glx\" (UID: \"44d1508a-da0e-46be-9f1d-583e1be8d864\") " pod="openstack/neutron-db-sync-g4glx" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.941945 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4784bf8d-4315-4097-b729-1f21940a17bc-combined-ca-bundle\") pod \"cinder-db-sync-5qnpr\" (UID: \"4784bf8d-4315-4097-b729-1f21940a17bc\") " pod="openstack/cinder-db-sync-5qnpr" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.942006 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4784bf8d-4315-4097-b729-1f21940a17bc-scripts\") pod \"cinder-db-sync-5qnpr\" (UID: \"4784bf8d-4315-4097-b729-1f21940a17bc\") " pod="openstack/cinder-db-sync-5qnpr" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.942557 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4784bf8d-4315-4097-b729-1f21940a17bc-config-data\") pod \"cinder-db-sync-5qnpr\" (UID: \"4784bf8d-4315-4097-b729-1f21940a17bc\") " pod="openstack/cinder-db-sync-5qnpr" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.949805 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-mdxtz" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.963697 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-5qnpr" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.964634 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/44d1508a-da0e-46be-9f1d-583e1be8d864-config\") pod \"neutron-db-sync-g4glx\" (UID: \"44d1508a-da0e-46be-9f1d-583e1be8d864\") " pod="openstack/neutron-db-sync-g4glx" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.964889 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w52mq\" (UniqueName: \"kubernetes.io/projected/cbe09550-cc72-4fe9-af45-b39fbcac540d-kube-api-access-w52mq\") pod \"barbican-db-sync-wbwn4\" (UID: \"cbe09550-cc72-4fe9-af45-b39fbcac540d\") " pod="openstack/barbican-db-sync-wbwn4" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.965019 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe09550-cc72-4fe9-af45-b39fbcac540d-combined-ca-bundle\") pod \"barbican-db-sync-wbwn4\" (UID: \"cbe09550-cc72-4fe9-af45-b39fbcac540d\") " pod="openstack/barbican-db-sync-wbwn4" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.965043 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cbe09550-cc72-4fe9-af45-b39fbcac540d-db-sync-config-data\") pod \"barbican-db-sync-wbwn4\" (UID: \"cbe09550-cc72-4fe9-af45-b39fbcac540d\") " pod="openstack/barbican-db-sync-wbwn4" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.973090 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe09550-cc72-4fe9-af45-b39fbcac540d-combined-ca-bundle\") pod \"barbican-db-sync-wbwn4\" (UID: \"cbe09550-cc72-4fe9-af45-b39fbcac540d\") " pod="openstack/barbican-db-sync-wbwn4" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 
14:35:08.986883 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-g4glx" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.987288 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wbwn4"] Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.989673 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cbe09550-cc72-4fe9-af45-b39fbcac540d-db-sync-config-data\") pod \"barbican-db-sync-wbwn4\" (UID: \"cbe09550-cc72-4fe9-af45-b39fbcac540d\") " pod="openstack/barbican-db-sync-wbwn4" Dec 03 14:35:08 crc kubenswrapper[4751]: I1203 14:35:08.991086 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w52mq\" (UniqueName: \"kubernetes.io/projected/cbe09550-cc72-4fe9-af45-b39fbcac540d-kube-api-access-w52mq\") pod \"barbican-db-sync-wbwn4\" (UID: \"cbe09550-cc72-4fe9-af45-b39fbcac540d\") " pod="openstack/barbican-db-sync-wbwn4" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.001354 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wbwn4" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.022676 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-bgs8x"] Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.031583 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-l4676"] Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.032956 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-l4676" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.038389 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-4pjzg" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.038633 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.038857 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.058402 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.066455 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.070180 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e2515a0-a5d9-4c4a-b686-1d011708c96d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\") " pod="openstack/ceilometer-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.070248 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b52c852-92ed-47f2-8e47-a9ac1e378698-scripts\") pod \"placement-db-sync-l4676\" (UID: \"6b52c852-92ed-47f2-8e47-a9ac1e378698\") " pod="openstack/placement-db-sync-l4676" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.070376 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/4df7d14f-4a52-43be-9877-c5df9c015cc7-certs\") pod \"cloudkitty-db-sync-bgs8x\" (UID: \"4df7d14f-4a52-43be-9877-c5df9c015cc7\") " 
pod="openstack/cloudkitty-db-sync-bgs8x" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.070401 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4df7d14f-4a52-43be-9877-c5df9c015cc7-combined-ca-bundle\") pod \"cloudkitty-db-sync-bgs8x\" (UID: \"4df7d14f-4a52-43be-9877-c5df9c015cc7\") " pod="openstack/cloudkitty-db-sync-bgs8x" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.070457 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4df7d14f-4a52-43be-9877-c5df9c015cc7-config-data\") pod \"cloudkitty-db-sync-bgs8x\" (UID: \"4df7d14f-4a52-43be-9877-c5df9c015cc7\") " pod="openstack/cloudkitty-db-sync-bgs8x" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.070480 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e2515a0-a5d9-4c4a-b686-1d011708c96d-config-data\") pod \"ceilometer-0\" (UID: \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\") " pod="openstack/ceilometer-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.070530 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e2515a0-a5d9-4c4a-b686-1d011708c96d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\") " pod="openstack/ceilometer-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.070558 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckrsx\" (UniqueName: \"kubernetes.io/projected/7e2515a0-a5d9-4c4a-b686-1d011708c96d-kube-api-access-ckrsx\") pod \"ceilometer-0\" (UID: \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\") " pod="openstack/ceilometer-0" Dec 03 14:35:09 crc 
kubenswrapper[4751]: I1203 14:35:09.070580 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4df7d14f-4a52-43be-9877-c5df9c015cc7-scripts\") pod \"cloudkitty-db-sync-bgs8x\" (UID: \"4df7d14f-4a52-43be-9877-c5df9c015cc7\") " pod="openstack/cloudkitty-db-sync-bgs8x" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.070617 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b52c852-92ed-47f2-8e47-a9ac1e378698-logs\") pod \"placement-db-sync-l4676\" (UID: \"6b52c852-92ed-47f2-8e47-a9ac1e378698\") " pod="openstack/placement-db-sync-l4676" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.070644 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b52c852-92ed-47f2-8e47-a9ac1e378698-combined-ca-bundle\") pod \"placement-db-sync-l4676\" (UID: \"6b52c852-92ed-47f2-8e47-a9ac1e378698\") " pod="openstack/placement-db-sync-l4676" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.070668 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57q8w\" (UniqueName: \"kubernetes.io/projected/6b52c852-92ed-47f2-8e47-a9ac1e378698-kube-api-access-57q8w\") pod \"placement-db-sync-l4676\" (UID: \"6b52c852-92ed-47f2-8e47-a9ac1e378698\") " pod="openstack/placement-db-sync-l4676" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.070701 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e2515a0-a5d9-4c4a-b686-1d011708c96d-log-httpd\") pod \"ceilometer-0\" (UID: \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\") " pod="openstack/ceilometer-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.070731 4751 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcplz\" (UniqueName: \"kubernetes.io/projected/4df7d14f-4a52-43be-9877-c5df9c015cc7-kube-api-access-zcplz\") pod \"cloudkitty-db-sync-bgs8x\" (UID: \"4df7d14f-4a52-43be-9877-c5df9c015cc7\") " pod="openstack/cloudkitty-db-sync-bgs8x" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.070766 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e2515a0-a5d9-4c4a-b686-1d011708c96d-run-httpd\") pod \"ceilometer-0\" (UID: \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\") " pod="openstack/ceilometer-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.070814 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b52c852-92ed-47f2-8e47-a9ac1e378698-config-data\") pod \"placement-db-sync-l4676\" (UID: \"6b52c852-92ed-47f2-8e47-a9ac1e378698\") " pod="openstack/placement-db-sync-l4676" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.070838 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e2515a0-a5d9-4c4a-b686-1d011708c96d-scripts\") pod \"ceilometer-0\" (UID: \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\") " pod="openstack/ceilometer-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.075715 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.075784 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.077917 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-mj9pc"] Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 
14:35:09.080093 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-mj9pc" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.102167 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-l4676"] Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.157821 4751 generic.go:334] "Generic (PLEG): container finished" podID="6637d261-2595-4256-9fc1-340de8705887" containerID="2fb3e28d9576d78bbdd193bb85fd367facc4b9229aa28bd2d6eae9cbf7954e40" exitCode=0 Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.157858 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-p8nj5" event={"ID":"6637d261-2595-4256-9fc1-340de8705887","Type":"ContainerDied","Data":"2fb3e28d9576d78bbdd193bb85fd367facc4b9229aa28bd2d6eae9cbf7954e40"} Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.172130 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-mj9pc"] Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.173429 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e2515a0-a5d9-4c4a-b686-1d011708c96d-log-httpd\") pod \"ceilometer-0\" (UID: \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\") " pod="openstack/ceilometer-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.173470 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/276feef5-736d-4d00-a9af-e81eaf2a0285-config\") pod \"dnsmasq-dns-56df8fb6b7-mj9pc\" (UID: \"276feef5-736d-4d00-a9af-e81eaf2a0285\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mj9pc" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.173490 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcplz\" (UniqueName: 
\"kubernetes.io/projected/4df7d14f-4a52-43be-9877-c5df9c015cc7-kube-api-access-zcplz\") pod \"cloudkitty-db-sync-bgs8x\" (UID: \"4df7d14f-4a52-43be-9877-c5df9c015cc7\") " pod="openstack/cloudkitty-db-sync-bgs8x" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.173518 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e2515a0-a5d9-4c4a-b686-1d011708c96d-run-httpd\") pod \"ceilometer-0\" (UID: \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\") " pod="openstack/ceilometer-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.173533 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/276feef5-736d-4d00-a9af-e81eaf2a0285-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-mj9pc\" (UID: \"276feef5-736d-4d00-a9af-e81eaf2a0285\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mj9pc" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.173567 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b52c852-92ed-47f2-8e47-a9ac1e378698-config-data\") pod \"placement-db-sync-l4676\" (UID: \"6b52c852-92ed-47f2-8e47-a9ac1e378698\") " pod="openstack/placement-db-sync-l4676" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.173587 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e2515a0-a5d9-4c4a-b686-1d011708c96d-scripts\") pod \"ceilometer-0\" (UID: \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\") " pod="openstack/ceilometer-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.173638 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e2515a0-a5d9-4c4a-b686-1d011708c96d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"7e2515a0-a5d9-4c4a-b686-1d011708c96d\") " pod="openstack/ceilometer-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.173666 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/276feef5-736d-4d00-a9af-e81eaf2a0285-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-mj9pc\" (UID: \"276feef5-736d-4d00-a9af-e81eaf2a0285\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mj9pc" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.173693 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/276feef5-736d-4d00-a9af-e81eaf2a0285-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-mj9pc\" (UID: \"276feef5-736d-4d00-a9af-e81eaf2a0285\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mj9pc" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.173727 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b52c852-92ed-47f2-8e47-a9ac1e378698-scripts\") pod \"placement-db-sync-l4676\" (UID: \"6b52c852-92ed-47f2-8e47-a9ac1e378698\") " pod="openstack/placement-db-sync-l4676" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.173783 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hcvs\" (UniqueName: \"kubernetes.io/projected/276feef5-736d-4d00-a9af-e81eaf2a0285-kube-api-access-7hcvs\") pod \"dnsmasq-dns-56df8fb6b7-mj9pc\" (UID: \"276feef5-736d-4d00-a9af-e81eaf2a0285\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mj9pc" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.173810 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/4df7d14f-4a52-43be-9877-c5df9c015cc7-certs\") pod \"cloudkitty-db-sync-bgs8x\" (UID: 
\"4df7d14f-4a52-43be-9877-c5df9c015cc7\") " pod="openstack/cloudkitty-db-sync-bgs8x" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.173833 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4df7d14f-4a52-43be-9877-c5df9c015cc7-combined-ca-bundle\") pod \"cloudkitty-db-sync-bgs8x\" (UID: \"4df7d14f-4a52-43be-9877-c5df9c015cc7\") " pod="openstack/cloudkitty-db-sync-bgs8x" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.173866 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/276feef5-736d-4d00-a9af-e81eaf2a0285-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-mj9pc\" (UID: \"276feef5-736d-4d00-a9af-e81eaf2a0285\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mj9pc" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.173908 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4df7d14f-4a52-43be-9877-c5df9c015cc7-config-data\") pod \"cloudkitty-db-sync-bgs8x\" (UID: \"4df7d14f-4a52-43be-9877-c5df9c015cc7\") " pod="openstack/cloudkitty-db-sync-bgs8x" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.173930 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e2515a0-a5d9-4c4a-b686-1d011708c96d-config-data\") pod \"ceilometer-0\" (UID: \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\") " pod="openstack/ceilometer-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.173952 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e2515a0-a5d9-4c4a-b686-1d011708c96d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\") " pod="openstack/ceilometer-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 
14:35:09.173984 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckrsx\" (UniqueName: \"kubernetes.io/projected/7e2515a0-a5d9-4c4a-b686-1d011708c96d-kube-api-access-ckrsx\") pod \"ceilometer-0\" (UID: \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\") " pod="openstack/ceilometer-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.174004 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4df7d14f-4a52-43be-9877-c5df9c015cc7-scripts\") pod \"cloudkitty-db-sync-bgs8x\" (UID: \"4df7d14f-4a52-43be-9877-c5df9c015cc7\") " pod="openstack/cloudkitty-db-sync-bgs8x" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.174030 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b52c852-92ed-47f2-8e47-a9ac1e378698-logs\") pod \"placement-db-sync-l4676\" (UID: \"6b52c852-92ed-47f2-8e47-a9ac1e378698\") " pod="openstack/placement-db-sync-l4676" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.174055 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b52c852-92ed-47f2-8e47-a9ac1e378698-combined-ca-bundle\") pod \"placement-db-sync-l4676\" (UID: \"6b52c852-92ed-47f2-8e47-a9ac1e378698\") " pod="openstack/placement-db-sync-l4676" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.174079 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57q8w\" (UniqueName: \"kubernetes.io/projected/6b52c852-92ed-47f2-8e47-a9ac1e378698-kube-api-access-57q8w\") pod \"placement-db-sync-l4676\" (UID: \"6b52c852-92ed-47f2-8e47-a9ac1e378698\") " pod="openstack/placement-db-sync-l4676" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.178488 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/6b52c852-92ed-47f2-8e47-a9ac1e378698-logs\") pod \"placement-db-sync-l4676\" (UID: \"6b52c852-92ed-47f2-8e47-a9ac1e378698\") " pod="openstack/placement-db-sync-l4676" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.178727 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b52c852-92ed-47f2-8e47-a9ac1e378698-scripts\") pod \"placement-db-sync-l4676\" (UID: \"6b52c852-92ed-47f2-8e47-a9ac1e378698\") " pod="openstack/placement-db-sync-l4676" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.179346 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4df7d14f-4a52-43be-9877-c5df9c015cc7-scripts\") pod \"cloudkitty-db-sync-bgs8x\" (UID: \"4df7d14f-4a52-43be-9877-c5df9c015cc7\") " pod="openstack/cloudkitty-db-sync-bgs8x" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.179405 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e2515a0-a5d9-4c4a-b686-1d011708c96d-log-httpd\") pod \"ceilometer-0\" (UID: \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\") " pod="openstack/ceilometer-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.181276 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e2515a0-a5d9-4c4a-b686-1d011708c96d-run-httpd\") pod \"ceilometer-0\" (UID: \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\") " pod="openstack/ceilometer-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.183916 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.185020 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/4df7d14f-4a52-43be-9877-c5df9c015cc7-certs\") pod \"cloudkitty-db-sync-bgs8x\" (UID: 
\"4df7d14f-4a52-43be-9877-c5df9c015cc7\") " pod="openstack/cloudkitty-db-sync-bgs8x" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.186891 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b52c852-92ed-47f2-8e47-a9ac1e378698-combined-ca-bundle\") pod \"placement-db-sync-l4676\" (UID: \"6b52c852-92ed-47f2-8e47-a9ac1e378698\") " pod="openstack/placement-db-sync-l4676" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.189831 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b52c852-92ed-47f2-8e47-a9ac1e378698-config-data\") pod \"placement-db-sync-l4676\" (UID: \"6b52c852-92ed-47f2-8e47-a9ac1e378698\") " pod="openstack/placement-db-sync-l4676" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.202302 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4df7d14f-4a52-43be-9877-c5df9c015cc7-config-data\") pod \"cloudkitty-db-sync-bgs8x\" (UID: \"4df7d14f-4a52-43be-9877-c5df9c015cc7\") " pod="openstack/cloudkitty-db-sync-bgs8x" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.204152 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e2515a0-a5d9-4c4a-b686-1d011708c96d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\") " pod="openstack/ceilometer-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.206022 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4df7d14f-4a52-43be-9877-c5df9c015cc7-combined-ca-bundle\") pod \"cloudkitty-db-sync-bgs8x\" (UID: \"4df7d14f-4a52-43be-9877-c5df9c015cc7\") " pod="openstack/cloudkitty-db-sync-bgs8x" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.213582 4751 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e2515a0-a5d9-4c4a-b686-1d011708c96d-config-data\") pod \"ceilometer-0\" (UID: \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\") " pod="openstack/ceilometer-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.255189 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e2515a0-a5d9-4c4a-b686-1d011708c96d-scripts\") pod \"ceilometer-0\" (UID: \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\") " pod="openstack/ceilometer-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.265961 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e2515a0-a5d9-4c4a-b686-1d011708c96d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\") " pod="openstack/ceilometer-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.270776 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcplz\" (UniqueName: \"kubernetes.io/projected/4df7d14f-4a52-43be-9877-c5df9c015cc7-kube-api-access-zcplz\") pod \"cloudkitty-db-sync-bgs8x\" (UID: \"4df7d14f-4a52-43be-9877-c5df9c015cc7\") " pod="openstack/cloudkitty-db-sync-bgs8x" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.274728 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57q8w\" (UniqueName: \"kubernetes.io/projected/6b52c852-92ed-47f2-8e47-a9ac1e378698-kube-api-access-57q8w\") pod \"placement-db-sync-l4676\" (UID: \"6b52c852-92ed-47f2-8e47-a9ac1e378698\") " pod="openstack/placement-db-sync-l4676" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.279357 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/276feef5-736d-4d00-a9af-e81eaf2a0285-dns-swift-storage-0\") 
pod \"dnsmasq-dns-56df8fb6b7-mj9pc\" (UID: \"276feef5-736d-4d00-a9af-e81eaf2a0285\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mj9pc" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.279413 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/276feef5-736d-4d00-a9af-e81eaf2a0285-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-mj9pc\" (UID: \"276feef5-736d-4d00-a9af-e81eaf2a0285\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mj9pc" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.279522 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hcvs\" (UniqueName: \"kubernetes.io/projected/276feef5-736d-4d00-a9af-e81eaf2a0285-kube-api-access-7hcvs\") pod \"dnsmasq-dns-56df8fb6b7-mj9pc\" (UID: \"276feef5-736d-4d00-a9af-e81eaf2a0285\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mj9pc" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.279568 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/276feef5-736d-4d00-a9af-e81eaf2a0285-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-mj9pc\" (UID: \"276feef5-736d-4d00-a9af-e81eaf2a0285\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mj9pc" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.279702 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/276feef5-736d-4d00-a9af-e81eaf2a0285-config\") pod \"dnsmasq-dns-56df8fb6b7-mj9pc\" (UID: \"276feef5-736d-4d00-a9af-e81eaf2a0285\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mj9pc" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.279749 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/276feef5-736d-4d00-a9af-e81eaf2a0285-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-mj9pc\" (UID: 
\"276feef5-736d-4d00-a9af-e81eaf2a0285\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mj9pc" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.280647 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/276feef5-736d-4d00-a9af-e81eaf2a0285-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-mj9pc\" (UID: \"276feef5-736d-4d00-a9af-e81eaf2a0285\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mj9pc" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.323021 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/276feef5-736d-4d00-a9af-e81eaf2a0285-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-mj9pc\" (UID: \"276feef5-736d-4d00-a9af-e81eaf2a0285\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mj9pc" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.323816 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/276feef5-736d-4d00-a9af-e81eaf2a0285-config\") pod \"dnsmasq-dns-56df8fb6b7-mj9pc\" (UID: \"276feef5-736d-4d00-a9af-e81eaf2a0285\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mj9pc" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.324460 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/276feef5-736d-4d00-a9af-e81eaf2a0285-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-mj9pc\" (UID: \"276feef5-736d-4d00-a9af-e81eaf2a0285\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mj9pc" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.324624 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/276feef5-736d-4d00-a9af-e81eaf2a0285-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-mj9pc\" (UID: \"276feef5-736d-4d00-a9af-e81eaf2a0285\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mj9pc" Dec 03 14:35:09 crc 
kubenswrapper[4751]: I1203 14:35:09.326655 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckrsx\" (UniqueName: \"kubernetes.io/projected/7e2515a0-a5d9-4c4a-b686-1d011708c96d-kube-api-access-ckrsx\") pod \"ceilometer-0\" (UID: \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\") " pod="openstack/ceilometer-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.368968 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-bgs8x" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.407695 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hcvs\" (UniqueName: \"kubernetes.io/projected/276feef5-736d-4d00-a9af-e81eaf2a0285-kube-api-access-7hcvs\") pod \"dnsmasq-dns-56df8fb6b7-mj9pc\" (UID: \"276feef5-736d-4d00-a9af-e81eaf2a0285\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mj9pc" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.407913 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.408534 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-l4676" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.439032 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-mj9pc" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.496593 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-86xd9"] Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.573398 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.575909 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.583192 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-zbls2" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.583652 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.583890 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.584247 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.618457 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\") pod \"glance-default-external-api-0\" (UID: \"1835bd6b-042f-4df1-b692-544a2765e20c\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.618765 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1835bd6b-042f-4df1-b692-544a2765e20c-logs\") pod \"glance-default-external-api-0\" (UID: \"1835bd6b-042f-4df1-b692-544a2765e20c\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.618858 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1835bd6b-042f-4df1-b692-544a2765e20c-scripts\") pod \"glance-default-external-api-0\" (UID: \"1835bd6b-042f-4df1-b692-544a2765e20c\") " pod="openstack/glance-default-external-api-0" Dec 
03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.618972 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1835bd6b-042f-4df1-b692-544a2765e20c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1835bd6b-042f-4df1-b692-544a2765e20c\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.619068 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1835bd6b-042f-4df1-b692-544a2765e20c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1835bd6b-042f-4df1-b692-544a2765e20c\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.619210 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1835bd6b-042f-4df1-b692-544a2765e20c-config-data\") pod \"glance-default-external-api-0\" (UID: \"1835bd6b-042f-4df1-b692-544a2765e20c\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.619302 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1835bd6b-042f-4df1-b692-544a2765e20c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1835bd6b-042f-4df1-b692-544a2765e20c\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.619437 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74cw9\" (UniqueName: \"kubernetes.io/projected/1835bd6b-042f-4df1-b692-544a2765e20c-kube-api-access-74cw9\") pod \"glance-default-external-api-0\" (UID: \"1835bd6b-042f-4df1-b692-544a2765e20c\") 
" pod="openstack/glance-default-external-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.624860 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.691176 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.693695 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.697698 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.698034 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.721544 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1835bd6b-042f-4df1-b692-544a2765e20c-logs\") pod \"glance-default-external-api-0\" (UID: \"1835bd6b-042f-4df1-b692-544a2765e20c\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.721775 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1835bd6b-042f-4df1-b692-544a2765e20c-scripts\") pod \"glance-default-external-api-0\" (UID: \"1835bd6b-042f-4df1-b692-544a2765e20c\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.721990 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1835bd6b-042f-4df1-b692-544a2765e20c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"1835bd6b-042f-4df1-b692-544a2765e20c\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.722131 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1835bd6b-042f-4df1-b692-544a2765e20c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1835bd6b-042f-4df1-b692-544a2765e20c\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.722304 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1835bd6b-042f-4df1-b692-544a2765e20c-config-data\") pod \"glance-default-external-api-0\" (UID: \"1835bd6b-042f-4df1-b692-544a2765e20c\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.722427 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1835bd6b-042f-4df1-b692-544a2765e20c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1835bd6b-042f-4df1-b692-544a2765e20c\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.722566 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74cw9\" (UniqueName: \"kubernetes.io/projected/1835bd6b-042f-4df1-b692-544a2765e20c-kube-api-access-74cw9\") pod \"glance-default-external-api-0\" (UID: \"1835bd6b-042f-4df1-b692-544a2765e20c\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.722701 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\") pod \"glance-default-external-api-0\" (UID: 
\"1835bd6b-042f-4df1-b692-544a2765e20c\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.722820 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1835bd6b-042f-4df1-b692-544a2765e20c-logs\") pod \"glance-default-external-api-0\" (UID: \"1835bd6b-042f-4df1-b692-544a2765e20c\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.725297 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1835bd6b-042f-4df1-b692-544a2765e20c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1835bd6b-042f-4df1-b692-544a2765e20c\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.728385 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1835bd6b-042f-4df1-b692-544a2765e20c-scripts\") pod \"glance-default-external-api-0\" (UID: \"1835bd6b-042f-4df1-b692-544a2765e20c\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.738076 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1835bd6b-042f-4df1-b692-544a2765e20c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1835bd6b-042f-4df1-b692-544a2765e20c\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.740461 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1835bd6b-042f-4df1-b692-544a2765e20c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1835bd6b-042f-4df1-b692-544a2765e20c\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:09 crc 
kubenswrapper[4751]: I1203 14:35:09.746763 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1835bd6b-042f-4df1-b692-544a2765e20c-config-data\") pod \"glance-default-external-api-0\" (UID: \"1835bd6b-042f-4df1-b692-544a2765e20c\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.761095 4751 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.761140 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\") pod \"glance-default-external-api-0\" (UID: \"1835bd6b-042f-4df1-b692-544a2765e20c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/683a5c301520fbc15dbc2ee54d0de9b6296d1615a3bef9c1741aa8a387a031dd/globalmount\"" pod="openstack/glance-default-external-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.768997 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74cw9\" (UniqueName: \"kubernetes.io/projected/1835bd6b-042f-4df1-b692-544a2765e20c-kube-api-access-74cw9\") pod \"glance-default-external-api-0\" (UID: \"1835bd6b-042f-4df1-b692-544a2765e20c\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.771715 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.825891 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7885fce3-109e-41cb-8b8d-b2ce9200d036-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"7885fce3-109e-41cb-8b8d-b2ce9200d036\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.826004 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7885fce3-109e-41cb-8b8d-b2ce9200d036-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7885fce3-109e-41cb-8b8d-b2ce9200d036\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.826072 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-cd3c1578-235e-47c7-b720-afd92ef00308\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd3c1578-235e-47c7-b720-afd92ef00308\") pod \"glance-default-internal-api-0\" (UID: \"7885fce3-109e-41cb-8b8d-b2ce9200d036\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.826104 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7885fce3-109e-41cb-8b8d-b2ce9200d036-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7885fce3-109e-41cb-8b8d-b2ce9200d036\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.826156 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7885fce3-109e-41cb-8b8d-b2ce9200d036-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7885fce3-109e-41cb-8b8d-b2ce9200d036\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.826181 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7885fce3-109e-41cb-8b8d-b2ce9200d036-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7885fce3-109e-41cb-8b8d-b2ce9200d036\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.826247 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7885fce3-109e-41cb-8b8d-b2ce9200d036-logs\") pod \"glance-default-internal-api-0\" (UID: \"7885fce3-109e-41cb-8b8d-b2ce9200d036\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.826316 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clrsj\" (UniqueName: \"kubernetes.io/projected/7885fce3-109e-41cb-8b8d-b2ce9200d036-kube-api-access-clrsj\") pod \"glance-default-internal-api-0\" (UID: \"7885fce3-109e-41cb-8b8d-b2ce9200d036\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.845496 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\") pod \"glance-default-external-api-0\" (UID: \"1835bd6b-042f-4df1-b692-544a2765e20c\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.845535 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-r6rzp"] Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.932356 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7885fce3-109e-41cb-8b8d-b2ce9200d036-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7885fce3-109e-41cb-8b8d-b2ce9200d036\") " 
pod="openstack/glance-default-internal-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.932435 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7885fce3-109e-41cb-8b8d-b2ce9200d036-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7885fce3-109e-41cb-8b8d-b2ce9200d036\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.932508 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-cd3c1578-235e-47c7-b720-afd92ef00308\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd3c1578-235e-47c7-b720-afd92ef00308\") pod \"glance-default-internal-api-0\" (UID: \"7885fce3-109e-41cb-8b8d-b2ce9200d036\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.932535 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7885fce3-109e-41cb-8b8d-b2ce9200d036-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7885fce3-109e-41cb-8b8d-b2ce9200d036\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.932584 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7885fce3-109e-41cb-8b8d-b2ce9200d036-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7885fce3-109e-41cb-8b8d-b2ce9200d036\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.932601 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7885fce3-109e-41cb-8b8d-b2ce9200d036-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7885fce3-109e-41cb-8b8d-b2ce9200d036\") " 
pod="openstack/glance-default-internal-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.932639 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7885fce3-109e-41cb-8b8d-b2ce9200d036-logs\") pod \"glance-default-internal-api-0\" (UID: \"7885fce3-109e-41cb-8b8d-b2ce9200d036\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.932682 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clrsj\" (UniqueName: \"kubernetes.io/projected/7885fce3-109e-41cb-8b8d-b2ce9200d036-kube-api-access-clrsj\") pod \"glance-default-internal-api-0\" (UID: \"7885fce3-109e-41cb-8b8d-b2ce9200d036\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.933957 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7885fce3-109e-41cb-8b8d-b2ce9200d036-logs\") pod \"glance-default-internal-api-0\" (UID: \"7885fce3-109e-41cb-8b8d-b2ce9200d036\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.933960 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7885fce3-109e-41cb-8b8d-b2ce9200d036-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7885fce3-109e-41cb-8b8d-b2ce9200d036\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.942989 4751 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.943023 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-cd3c1578-235e-47c7-b720-afd92ef00308\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd3c1578-235e-47c7-b720-afd92ef00308\") pod \"glance-default-internal-api-0\" (UID: \"7885fce3-109e-41cb-8b8d-b2ce9200d036\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d3b26c55de8c52fd1f3bf024792f2005b72c5292706e21196d4a11b13179d08d/globalmount\"" pod="openstack/glance-default-internal-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.945371 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7885fce3-109e-41cb-8b8d-b2ce9200d036-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7885fce3-109e-41cb-8b8d-b2ce9200d036\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.953814 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7885fce3-109e-41cb-8b8d-b2ce9200d036-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7885fce3-109e-41cb-8b8d-b2ce9200d036\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.959489 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7885fce3-109e-41cb-8b8d-b2ce9200d036-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7885fce3-109e-41cb-8b8d-b2ce9200d036\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.963265 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clrsj\" (UniqueName: \"kubernetes.io/projected/7885fce3-109e-41cb-8b8d-b2ce9200d036-kube-api-access-clrsj\") pod 
\"glance-default-internal-api-0\" (UID: \"7885fce3-109e-41cb-8b8d-b2ce9200d036\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.966546 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-p8nj5" Dec 03 14:35:09 crc kubenswrapper[4751]: I1203 14:35:09.968771 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7885fce3-109e-41cb-8b8d-b2ce9200d036-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7885fce3-109e-41cb-8b8d-b2ce9200d036\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.047055 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-cd3c1578-235e-47c7-b720-afd92ef00308\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd3c1578-235e-47c7-b720-afd92ef00308\") pod \"glance-default-internal-api-0\" (UID: \"7885fce3-109e-41cb-8b8d-b2ce9200d036\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.102434 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.123009 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.137971 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6637d261-2595-4256-9fc1-340de8705887-dns-swift-storage-0\") pod \"6637d261-2595-4256-9fc1-340de8705887\" (UID: \"6637d261-2595-4256-9fc1-340de8705887\") " Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.138004 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwdn6\" (UniqueName: \"kubernetes.io/projected/6637d261-2595-4256-9fc1-340de8705887-kube-api-access-mwdn6\") pod \"6637d261-2595-4256-9fc1-340de8705887\" (UID: \"6637d261-2595-4256-9fc1-340de8705887\") " Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.138129 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6637d261-2595-4256-9fc1-340de8705887-dns-svc\") pod \"6637d261-2595-4256-9fc1-340de8705887\" (UID: \"6637d261-2595-4256-9fc1-340de8705887\") " Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.138159 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6637d261-2595-4256-9fc1-340de8705887-ovsdbserver-sb\") pod \"6637d261-2595-4256-9fc1-340de8705887\" (UID: \"6637d261-2595-4256-9fc1-340de8705887\") " Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.138278 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6637d261-2595-4256-9fc1-340de8705887-ovsdbserver-nb\") pod \"6637d261-2595-4256-9fc1-340de8705887\" (UID: \"6637d261-2595-4256-9fc1-340de8705887\") " Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.138408 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/6637d261-2595-4256-9fc1-340de8705887-config\") pod \"6637d261-2595-4256-9fc1-340de8705887\" (UID: \"6637d261-2595-4256-9fc1-340de8705887\") " Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.211906 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-p8nj5" event={"ID":"6637d261-2595-4256-9fc1-340de8705887","Type":"ContainerDied","Data":"ac67934c6fb87fef5a3c2b0e867b6acece120811e70f0ad3e1762d24958b831a"} Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.211973 4751 scope.go:117] "RemoveContainer" containerID="2fb3e28d9576d78bbdd193bb85fd367facc4b9229aa28bd2d6eae9cbf7954e40" Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.212071 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-p8nj5" Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.215239 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-5qnpr"] Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.222615 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r6rzp" event={"ID":"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3","Type":"ContainerStarted","Data":"07722344ca43acb85095e497c24b27593f0a8bea9f43ee636c8cd0cf9fc6030c"} Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.228177 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6637d261-2595-4256-9fc1-340de8705887-kube-api-access-mwdn6" (OuterVolumeSpecName: "kube-api-access-mwdn6") pod "6637d261-2595-4256-9fc1-340de8705887" (UID: "6637d261-2595-4256-9fc1-340de8705887"). InnerVolumeSpecName "kube-api-access-mwdn6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.228702 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-86xd9" event={"ID":"0bcaaedc-9d9b-403a-946f-ef7ab8176360","Type":"ContainerStarted","Data":"aa5b36bb7f78e7f0ab47e8126db69fbec1cedfa034ba48013b6c617725ef660e"} Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.228759 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-86xd9" event={"ID":"0bcaaedc-9d9b-403a-946f-ef7ab8176360","Type":"ContainerStarted","Data":"6c43de21718dca05ea4ec74d7bf9f07adac44a3ce90cdbd9c552e2bab0e03555"} Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.240732 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwdn6\" (UniqueName: \"kubernetes.io/projected/6637d261-2595-4256-9fc1-340de8705887-kube-api-access-mwdn6\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:10 crc kubenswrapper[4751]: W1203 14:35:10.284034 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4784bf8d_4315_4097_b729_1f21940a17bc.slice/crio-14e3dbc94365a434c8cb783c952ce106cab41956f910790139b2c65f32a8a218 WatchSource:0}: Error finding container 14e3dbc94365a434c8cb783c952ce106cab41956f910790139b2c65f32a8a218: Status 404 returned error can't find the container with id 14e3dbc94365a434c8cb783c952ce106cab41956f910790139b2c65f32a8a218 Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.305950 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wbwn4"] Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.314123 4751 scope.go:117] "RemoveContainer" containerID="08a415a2df2c2b59ecbcb2796c8d4f9ba3e36ed22729cf98004c8b381512e6e4" Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.325091 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-db-sync-g4glx"] Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.490015 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.514070 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6637d261-2595-4256-9fc1-340de8705887-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6637d261-2595-4256-9fc1-340de8705887" (UID: "6637d261-2595-4256-9fc1-340de8705887"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.516627 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6637d261-2595-4256-9fc1-340de8705887-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6637d261-2595-4256-9fc1-340de8705887" (UID: "6637d261-2595-4256-9fc1-340de8705887"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.519748 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6637d261-2595-4256-9fc1-340de8705887-config" (OuterVolumeSpecName: "config") pod "6637d261-2595-4256-9fc1-340de8705887" (UID: "6637d261-2595-4256-9fc1-340de8705887"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.554934 4751 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6637d261-2595-4256-9fc1-340de8705887-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.554966 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6637d261-2595-4256-9fc1-340de8705887-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.554977 4751 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6637d261-2595-4256-9fc1-340de8705887-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.561021 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6637d261-2595-4256-9fc1-340de8705887-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6637d261-2595-4256-9fc1-340de8705887" (UID: "6637d261-2595-4256-9fc1-340de8705887"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.565277 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6637d261-2595-4256-9fc1-340de8705887-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6637d261-2595-4256-9fc1-340de8705887" (UID: "6637d261-2595-4256-9fc1-340de8705887"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.656630 4751 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6637d261-2595-4256-9fc1-340de8705887-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.656855 4751 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6637d261-2595-4256-9fc1-340de8705887-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.883298 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-bgs8x"] Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.902489 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-86xd9" Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.930381 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-l4676"] Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.952915 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-p8nj5"] Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.952961 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-p8nj5"] Dec 03 14:35:10 crc kubenswrapper[4751]: I1203 14:35:10.977723 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-mj9pc"] Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.131683 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0bcaaedc-9d9b-403a-946f-ef7ab8176360-dns-svc\") pod \"0bcaaedc-9d9b-403a-946f-ef7ab8176360\" (UID: \"0bcaaedc-9d9b-403a-946f-ef7ab8176360\") " Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 
14:35:11.132057 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bxp4\" (UniqueName: \"kubernetes.io/projected/0bcaaedc-9d9b-403a-946f-ef7ab8176360-kube-api-access-8bxp4\") pod \"0bcaaedc-9d9b-403a-946f-ef7ab8176360\" (UID: \"0bcaaedc-9d9b-403a-946f-ef7ab8176360\") " Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.132761 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0bcaaedc-9d9b-403a-946f-ef7ab8176360-ovsdbserver-sb\") pod \"0bcaaedc-9d9b-403a-946f-ef7ab8176360\" (UID: \"0bcaaedc-9d9b-403a-946f-ef7ab8176360\") " Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.132797 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0bcaaedc-9d9b-403a-946f-ef7ab8176360-dns-swift-storage-0\") pod \"0bcaaedc-9d9b-403a-946f-ef7ab8176360\" (UID: \"0bcaaedc-9d9b-403a-946f-ef7ab8176360\") " Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.132901 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bcaaedc-9d9b-403a-946f-ef7ab8176360-config\") pod \"0bcaaedc-9d9b-403a-946f-ef7ab8176360\" (UID: \"0bcaaedc-9d9b-403a-946f-ef7ab8176360\") " Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.132921 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0bcaaedc-9d9b-403a-946f-ef7ab8176360-ovsdbserver-nb\") pod \"0bcaaedc-9d9b-403a-946f-ef7ab8176360\" (UID: \"0bcaaedc-9d9b-403a-946f-ef7ab8176360\") " Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.156047 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bcaaedc-9d9b-403a-946f-ef7ab8176360-kube-api-access-8bxp4" (OuterVolumeSpecName: 
"kube-api-access-8bxp4") pod "0bcaaedc-9d9b-403a-946f-ef7ab8176360" (UID: "0bcaaedc-9d9b-403a-946f-ef7ab8176360"). InnerVolumeSpecName "kube-api-access-8bxp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.172266 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bcaaedc-9d9b-403a-946f-ef7ab8176360-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0bcaaedc-9d9b-403a-946f-ef7ab8176360" (UID: "0bcaaedc-9d9b-403a-946f-ef7ab8176360"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.212808 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bcaaedc-9d9b-403a-946f-ef7ab8176360-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0bcaaedc-9d9b-403a-946f-ef7ab8176360" (UID: "0bcaaedc-9d9b-403a-946f-ef7ab8176360"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.230567 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bcaaedc-9d9b-403a-946f-ef7ab8176360-config" (OuterVolumeSpecName: "config") pod "0bcaaedc-9d9b-403a-946f-ef7ab8176360" (UID: "0bcaaedc-9d9b-403a-946f-ef7ab8176360"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.237398 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bcaaedc-9d9b-403a-946f-ef7ab8176360-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0bcaaedc-9d9b-403a-946f-ef7ab8176360" (UID: "0bcaaedc-9d9b-403a-946f-ef7ab8176360"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.238970 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.259126 4751 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0bcaaedc-9d9b-403a-946f-ef7ab8176360-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.259210 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bxp4\" (UniqueName: \"kubernetes.io/projected/0bcaaedc-9d9b-403a-946f-ef7ab8176360-kube-api-access-8bxp4\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.259230 4751 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0bcaaedc-9d9b-403a-946f-ef7ab8176360-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.259242 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bcaaedc-9d9b-403a-946f-ef7ab8176360-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.259254 4751 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0bcaaedc-9d9b-403a-946f-ef7ab8176360-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.261287 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bcaaedc-9d9b-403a-946f-ef7ab8176360-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0bcaaedc-9d9b-403a-946f-ef7ab8176360" (UID: "0bcaaedc-9d9b-403a-946f-ef7ab8176360"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.263732 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r6rzp" event={"ID":"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3","Type":"ContainerStarted","Data":"fd4ad307c6770e8a8b0b6725bcb1ec36e7b9ae146d3b55050e11d998dcbff3ff"} Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.274788 4751 generic.go:334] "Generic (PLEG): container finished" podID="0bcaaedc-9d9b-403a-946f-ef7ab8176360" containerID="aa5b36bb7f78e7f0ab47e8126db69fbec1cedfa034ba48013b6c617725ef660e" exitCode=0 Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.274879 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-86xd9" event={"ID":"0bcaaedc-9d9b-403a-946f-ef7ab8176360","Type":"ContainerDied","Data":"aa5b36bb7f78e7f0ab47e8126db69fbec1cedfa034ba48013b6c617725ef660e"} Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.274905 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-86xd9" event={"ID":"0bcaaedc-9d9b-403a-946f-ef7ab8176360","Type":"ContainerDied","Data":"6c43de21718dca05ea4ec74d7bf9f07adac44a3ce90cdbd9c552e2bab0e03555"} Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.274936 4751 scope.go:117] "RemoveContainer" containerID="aa5b36bb7f78e7f0ab47e8126db69fbec1cedfa034ba48013b6c617725ef660e" Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.275108 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-86xd9" Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.295309 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-r6rzp" podStartSLOduration=3.295284743 podStartE2EDuration="3.295284743s" podCreationTimestamp="2025-12-03 14:35:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:35:11.280203432 +0000 UTC m=+1318.268558649" watchObservedRunningTime="2025-12-03 14:35:11.295284743 +0000 UTC m=+1318.283639960" Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.295388 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-g4glx" event={"ID":"44d1508a-da0e-46be-9f1d-583e1be8d864","Type":"ContainerStarted","Data":"e5f89e275c7378fd8bc629e5f986d230a3cb73a1eb06035c5a68369abfcfa9f0"} Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.301459 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-g4glx" event={"ID":"44d1508a-da0e-46be-9f1d-583e1be8d864","Type":"ContainerStarted","Data":"f8e42fb04b082e54710eedbb16fe97a008c5e553272660c8982c1e4b72c51ba1"} Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.312839 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wbwn4" event={"ID":"cbe09550-cc72-4fe9-af45-b39fbcac540d","Type":"ContainerStarted","Data":"85728848ee55c4a068dc3edb62bcd0dfa0363666e541a0b4ce8801210725d820"} Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.329815 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-g4glx" podStartSLOduration=3.329781443 podStartE2EDuration="3.329781443s" podCreationTimestamp="2025-12-03 14:35:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 
14:35:11.317275883 +0000 UTC m=+1318.305631100" watchObservedRunningTime="2025-12-03 14:35:11.329781443 +0000 UTC m=+1318.318136660" Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.340916 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6637d261-2595-4256-9fc1-340de8705887" path="/var/lib/kubelet/pods/6637d261-2595-4256-9fc1-340de8705887/volumes" Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.341628 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e2515a0-a5d9-4c4a-b686-1d011708c96d","Type":"ContainerStarted","Data":"fbdc647375981f62a924f17bba262778186a81edfed0b16f453ca52d691a6bb9"} Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.341657 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-5qnpr" event={"ID":"4784bf8d-4315-4097-b729-1f21940a17bc","Type":"ContainerStarted","Data":"14e3dbc94365a434c8cb783c952ce106cab41956f910790139b2c65f32a8a218"} Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.349878 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-l4676" event={"ID":"6b52c852-92ed-47f2-8e47-a9ac1e378698","Type":"ContainerStarted","Data":"a681677b3b4ecb206685fb244b1d6346ed40766228d6b26c74bc7b7ff37acb94"} Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.376221 4751 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0bcaaedc-9d9b-403a-946f-ef7ab8176360-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.402752 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-mj9pc" event={"ID":"276feef5-736d-4d00-a9af-e81eaf2a0285","Type":"ContainerStarted","Data":"6b787dd6f7139cc6c338e83681dd76c71327c3c40b4fe4d5404a7ae362465c60"} Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.405532 4751 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/cloudkitty-db-sync-bgs8x" event={"ID":"4df7d14f-4a52-43be-9877-c5df9c015cc7","Type":"ContainerStarted","Data":"8eb90a04b5fe0caa0613ada1569a95c8003a835945534bb50c59e00e8ec1f479"} Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.417711 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-86xd9"] Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.429597 4751 scope.go:117] "RemoveContainer" containerID="aa5b36bb7f78e7f0ab47e8126db69fbec1cedfa034ba48013b6c617725ef660e" Dec 03 14:35:11 crc kubenswrapper[4751]: E1203 14:35:11.432917 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa5b36bb7f78e7f0ab47e8126db69fbec1cedfa034ba48013b6c617725ef660e\": container with ID starting with aa5b36bb7f78e7f0ab47e8126db69fbec1cedfa034ba48013b6c617725ef660e not found: ID does not exist" containerID="aa5b36bb7f78e7f0ab47e8126db69fbec1cedfa034ba48013b6c617725ef660e" Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.432958 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa5b36bb7f78e7f0ab47e8126db69fbec1cedfa034ba48013b6c617725ef660e"} err="failed to get container status \"aa5b36bb7f78e7f0ab47e8126db69fbec1cedfa034ba48013b6c617725ef660e\": rpc error: code = NotFound desc = could not find container \"aa5b36bb7f78e7f0ab47e8126db69fbec1cedfa034ba48013b6c617725ef660e\": container with ID starting with aa5b36bb7f78e7f0ab47e8126db69fbec1cedfa034ba48013b6c617725ef660e not found: ID does not exist" Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.451906 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-86xd9"] Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.652790 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.751224 
4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.812597 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:35:11 crc kubenswrapper[4751]: I1203 14:35:11.972646 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 14:35:12 crc kubenswrapper[4751]: I1203 14:35:12.474450 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1835bd6b-042f-4df1-b692-544a2765e20c","Type":"ContainerStarted","Data":"2e9446c3ffd358ade23f81e2d72fbace7f2b98f911d259641028b8bbe7dbfd9b"} Dec 03 14:35:12 crc kubenswrapper[4751]: I1203 14:35:12.491549 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7885fce3-109e-41cb-8b8d-b2ce9200d036","Type":"ContainerStarted","Data":"990714c95b91426dcbe678dd41bff790d5c9e603ec675c201ab4b2f10b466533"} Dec 03 14:35:13 crc kubenswrapper[4751]: I1203 14:35:13.395185 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bcaaedc-9d9b-403a-946f-ef7ab8176360" path="/var/lib/kubelet/pods/0bcaaedc-9d9b-403a-946f-ef7ab8176360/volumes" Dec 03 14:35:15 crc kubenswrapper[4751]: I1203 14:35:15.540299 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1835bd6b-042f-4df1-b692-544a2765e20c","Type":"ContainerStarted","Data":"3e4d2f0875cd00f62dab0ed1c96bd3acd358a969220a9764cb5c0978471bf2f7"} Dec 03 14:35:15 crc kubenswrapper[4751]: I1203 14:35:15.552180 4751 generic.go:334] "Generic (PLEG): container finished" podID="276feef5-736d-4d00-a9af-e81eaf2a0285" containerID="009618b36cec9e7aac50b9da194717c12033e1fab63046c3c865e34643526b68" exitCode=0 Dec 03 14:35:15 crc kubenswrapper[4751]: I1203 14:35:15.552246 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-56df8fb6b7-mj9pc" event={"ID":"276feef5-736d-4d00-a9af-e81eaf2a0285","Type":"ContainerDied","Data":"009618b36cec9e7aac50b9da194717c12033e1fab63046c3c865e34643526b68"} Dec 03 14:35:15 crc kubenswrapper[4751]: I1203 14:35:15.594351 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7885fce3-109e-41cb-8b8d-b2ce9200d036","Type":"ContainerStarted","Data":"c2c84c98688cfa061a82e7f8c31682383dfff608f36cb82696022a2e4b6a0528"} Dec 03 14:35:16 crc kubenswrapper[4751]: I1203 14:35:16.614454 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7885fce3-109e-41cb-8b8d-b2ce9200d036","Type":"ContainerStarted","Data":"89bb5175d76c1f666da9960b1defa85471308461a788d486f9542dbdebad378d"} Dec 03 14:35:16 crc kubenswrapper[4751]: I1203 14:35:16.614887 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7885fce3-109e-41cb-8b8d-b2ce9200d036" containerName="glance-log" containerID="cri-o://c2c84c98688cfa061a82e7f8c31682383dfff608f36cb82696022a2e4b6a0528" gracePeriod=30 Dec 03 14:35:16 crc kubenswrapper[4751]: I1203 14:35:16.615267 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7885fce3-109e-41cb-8b8d-b2ce9200d036" containerName="glance-httpd" containerID="cri-o://89bb5175d76c1f666da9960b1defa85471308461a788d486f9542dbdebad378d" gracePeriod=30 Dec 03 14:35:16 crc kubenswrapper[4751]: I1203 14:35:16.621564 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1835bd6b-042f-4df1-b692-544a2765e20c","Type":"ContainerStarted","Data":"93cae7b9d330d8f8e8787e04bcf651198522b823f752c4f8f57ffd7d422c0ad8"} Dec 03 14:35:16 crc kubenswrapper[4751]: I1203 14:35:16.621677 4751 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-external-api-0" podUID="1835bd6b-042f-4df1-b692-544a2765e20c" containerName="glance-log" containerID="cri-o://3e4d2f0875cd00f62dab0ed1c96bd3acd358a969220a9764cb5c0978471bf2f7" gracePeriod=30 Dec 03 14:35:16 crc kubenswrapper[4751]: I1203 14:35:16.621837 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1835bd6b-042f-4df1-b692-544a2765e20c" containerName="glance-httpd" containerID="cri-o://93cae7b9d330d8f8e8787e04bcf651198522b823f752c4f8f57ffd7d422c0ad8" gracePeriod=30 Dec 03 14:35:16 crc kubenswrapper[4751]: I1203 14:35:16.624178 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-mj9pc" event={"ID":"276feef5-736d-4d00-a9af-e81eaf2a0285","Type":"ContainerStarted","Data":"057d49ea4b08ade87c3d87bdb552c83904d430fa181aecafdc11fb3f9da399bb"} Dec 03 14:35:16 crc kubenswrapper[4751]: I1203 14:35:16.624483 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-mj9pc" Dec 03 14:35:16 crc kubenswrapper[4751]: I1203 14:35:16.639685 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.639668484 podStartE2EDuration="8.639668484s" podCreationTimestamp="2025-12-03 14:35:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:35:16.632835988 +0000 UTC m=+1323.621191205" watchObservedRunningTime="2025-12-03 14:35:16.639668484 +0000 UTC m=+1323.628023701" Dec 03 14:35:16 crc kubenswrapper[4751]: I1203 14:35:16.663475 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-mj9pc" podStartSLOduration=8.663458072 podStartE2EDuration="8.663458072s" podCreationTimestamp="2025-12-03 14:35:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:35:16.662800094 +0000 UTC m=+1323.651155321" watchObservedRunningTime="2025-12-03 14:35:16.663458072 +0000 UTC m=+1323.651813289" Dec 03 14:35:16 crc kubenswrapper[4751]: I1203 14:35:16.684644 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.684622249 podStartE2EDuration="8.684622249s" podCreationTimestamp="2025-12-03 14:35:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:35:16.679954882 +0000 UTC m=+1323.668310119" watchObservedRunningTime="2025-12-03 14:35:16.684622249 +0000 UTC m=+1323.672977476" Dec 03 14:35:17 crc kubenswrapper[4751]: I1203 14:35:17.637167 4751 generic.go:334] "Generic (PLEG): container finished" podID="81e62a7a-f0c4-4a29-9b0f-838c6573d5b3" containerID="fd4ad307c6770e8a8b0b6725bcb1ec36e7b9ae146d3b55050e11d998dcbff3ff" exitCode=0 Dec 03 14:35:17 crc kubenswrapper[4751]: I1203 14:35:17.637221 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r6rzp" event={"ID":"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3","Type":"ContainerDied","Data":"fd4ad307c6770e8a8b0b6725bcb1ec36e7b9ae146d3b55050e11d998dcbff3ff"} Dec 03 14:35:17 crc kubenswrapper[4751]: I1203 14:35:17.642505 4751 generic.go:334] "Generic (PLEG): container finished" podID="7885fce3-109e-41cb-8b8d-b2ce9200d036" containerID="89bb5175d76c1f666da9960b1defa85471308461a788d486f9542dbdebad378d" exitCode=0 Dec 03 14:35:17 crc kubenswrapper[4751]: I1203 14:35:17.642558 4751 generic.go:334] "Generic (PLEG): container finished" podID="7885fce3-109e-41cb-8b8d-b2ce9200d036" containerID="c2c84c98688cfa061a82e7f8c31682383dfff608f36cb82696022a2e4b6a0528" exitCode=143 Dec 03 14:35:17 crc kubenswrapper[4751]: I1203 14:35:17.642577 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"7885fce3-109e-41cb-8b8d-b2ce9200d036","Type":"ContainerDied","Data":"89bb5175d76c1f666da9960b1defa85471308461a788d486f9542dbdebad378d"} Dec 03 14:35:17 crc kubenswrapper[4751]: I1203 14:35:17.642608 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7885fce3-109e-41cb-8b8d-b2ce9200d036","Type":"ContainerDied","Data":"c2c84c98688cfa061a82e7f8c31682383dfff608f36cb82696022a2e4b6a0528"} Dec 03 14:35:17 crc kubenswrapper[4751]: I1203 14:35:17.644756 4751 generic.go:334] "Generic (PLEG): container finished" podID="1835bd6b-042f-4df1-b692-544a2765e20c" containerID="93cae7b9d330d8f8e8787e04bcf651198522b823f752c4f8f57ffd7d422c0ad8" exitCode=0 Dec 03 14:35:17 crc kubenswrapper[4751]: I1203 14:35:17.644774 4751 generic.go:334] "Generic (PLEG): container finished" podID="1835bd6b-042f-4df1-b692-544a2765e20c" containerID="3e4d2f0875cd00f62dab0ed1c96bd3acd358a969220a9764cb5c0978471bf2f7" exitCode=143 Dec 03 14:35:17 crc kubenswrapper[4751]: I1203 14:35:17.644833 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1835bd6b-042f-4df1-b692-544a2765e20c","Type":"ContainerDied","Data":"93cae7b9d330d8f8e8787e04bcf651198522b823f752c4f8f57ffd7d422c0ad8"} Dec 03 14:35:17 crc kubenswrapper[4751]: I1203 14:35:17.644849 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1835bd6b-042f-4df1-b692-544a2765e20c","Type":"ContainerDied","Data":"3e4d2f0875cd00f62dab0ed1c96bd3acd358a969220a9764cb5c0978471bf2f7"} Dec 03 14:35:24 crc kubenswrapper[4751]: I1203 14:35:24.441423 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-mj9pc" Dec 03 14:35:24 crc kubenswrapper[4751]: I1203 14:35:24.512615 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-jdm6k"] Dec 03 
14:35:24 crc kubenswrapper[4751]: I1203 14:35:24.512982 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-jdm6k" podUID="791ff53a-abb1-4fc7-bb6a-59e11d2b1b58" containerName="dnsmasq-dns" containerID="cri-o://6c2e401c92d038de3a392d5e003f097d04bdb2900597efb2d2acdade9546d339" gracePeriod=10 Dec 03 14:35:25 crc kubenswrapper[4751]: I1203 14:35:25.743690 4751 generic.go:334] "Generic (PLEG): container finished" podID="791ff53a-abb1-4fc7-bb6a-59e11d2b1b58" containerID="6c2e401c92d038de3a392d5e003f097d04bdb2900597efb2d2acdade9546d339" exitCode=0 Dec 03 14:35:25 crc kubenswrapper[4751]: I1203 14:35:25.743798 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-jdm6k" event={"ID":"791ff53a-abb1-4fc7-bb6a-59e11d2b1b58","Type":"ContainerDied","Data":"6c2e401c92d038de3a392d5e003f097d04bdb2900597efb2d2acdade9546d339"} Dec 03 14:35:26 crc kubenswrapper[4751]: I1203 14:35:26.711297 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-jdm6k" podUID="791ff53a-abb1-4fc7-bb6a-59e11d2b1b58" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: connect: connection refused" Dec 03 14:35:27 crc kubenswrapper[4751]: I1203 14:35:27.350052 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-r6rzp" Dec 03 14:35:27 crc kubenswrapper[4751]: I1203 14:35:27.523291 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-config-data\") pod \"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3\" (UID: \"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3\") " Dec 03 14:35:27 crc kubenswrapper[4751]: I1203 14:35:27.523902 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhl8l\" (UniqueName: \"kubernetes.io/projected/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-kube-api-access-qhl8l\") pod \"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3\" (UID: \"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3\") " Dec 03 14:35:27 crc kubenswrapper[4751]: I1203 14:35:27.523988 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-scripts\") pod \"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3\" (UID: \"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3\") " Dec 03 14:35:27 crc kubenswrapper[4751]: I1203 14:35:27.524085 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-combined-ca-bundle\") pod \"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3\" (UID: \"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3\") " Dec 03 14:35:27 crc kubenswrapper[4751]: I1203 14:35:27.524203 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-fernet-keys\") pod \"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3\" (UID: \"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3\") " Dec 03 14:35:27 crc kubenswrapper[4751]: I1203 14:35:27.524261 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-credential-keys\") pod \"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3\" (UID: \"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3\") " Dec 03 14:35:27 crc kubenswrapper[4751]: I1203 14:35:27.530371 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "81e62a7a-f0c4-4a29-9b0f-838c6573d5b3" (UID: "81e62a7a-f0c4-4a29-9b0f-838c6573d5b3"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:35:27 crc kubenswrapper[4751]: I1203 14:35:27.530407 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-scripts" (OuterVolumeSpecName: "scripts") pod "81e62a7a-f0c4-4a29-9b0f-838c6573d5b3" (UID: "81e62a7a-f0c4-4a29-9b0f-838c6573d5b3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:35:27 crc kubenswrapper[4751]: I1203 14:35:27.531456 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "81e62a7a-f0c4-4a29-9b0f-838c6573d5b3" (UID: "81e62a7a-f0c4-4a29-9b0f-838c6573d5b3"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:35:27 crc kubenswrapper[4751]: I1203 14:35:27.544832 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-kube-api-access-qhl8l" (OuterVolumeSpecName: "kube-api-access-qhl8l") pod "81e62a7a-f0c4-4a29-9b0f-838c6573d5b3" (UID: "81e62a7a-f0c4-4a29-9b0f-838c6573d5b3"). InnerVolumeSpecName "kube-api-access-qhl8l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:35:27 crc kubenswrapper[4751]: I1203 14:35:27.561485 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81e62a7a-f0c4-4a29-9b0f-838c6573d5b3" (UID: "81e62a7a-f0c4-4a29-9b0f-838c6573d5b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:35:27 crc kubenswrapper[4751]: I1203 14:35:27.565475 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-config-data" (OuterVolumeSpecName: "config-data") pod "81e62a7a-f0c4-4a29-9b0f-838c6573d5b3" (UID: "81e62a7a-f0c4-4a29-9b0f-838c6573d5b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:35:27 crc kubenswrapper[4751]: I1203 14:35:27.626801 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:27 crc kubenswrapper[4751]: I1203 14:35:27.626840 4751 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:27 crc kubenswrapper[4751]: I1203 14:35:27.628398 4751 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:27 crc kubenswrapper[4751]: I1203 14:35:27.628448 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 
14:35:27 crc kubenswrapper[4751]: I1203 14:35:27.628462 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhl8l\" (UniqueName: \"kubernetes.io/projected/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-kube-api-access-qhl8l\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:27 crc kubenswrapper[4751]: I1203 14:35:27.628477 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:27 crc kubenswrapper[4751]: I1203 14:35:27.780435 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r6rzp" event={"ID":"81e62a7a-f0c4-4a29-9b0f-838c6573d5b3","Type":"ContainerDied","Data":"07722344ca43acb85095e497c24b27593f0a8bea9f43ee636c8cd0cf9fc6030c"} Dec 03 14:35:27 crc kubenswrapper[4751]: I1203 14:35:27.780473 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07722344ca43acb85095e497c24b27593f0a8bea9f43ee636c8cd0cf9fc6030c" Dec 03 14:35:27 crc kubenswrapper[4751]: I1203 14:35:27.780540 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-r6rzp" Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.445892 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-r6rzp"] Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.453836 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-r6rzp"] Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.543793 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-7kzxp"] Dec 03 14:35:28 crc kubenswrapper[4751]: E1203 14:35:28.544265 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bcaaedc-9d9b-403a-946f-ef7ab8176360" containerName="init" Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.544286 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bcaaedc-9d9b-403a-946f-ef7ab8176360" containerName="init" Dec 03 14:35:28 crc kubenswrapper[4751]: E1203 14:35:28.544311 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6637d261-2595-4256-9fc1-340de8705887" containerName="dnsmasq-dns" Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.544345 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="6637d261-2595-4256-9fc1-340de8705887" containerName="dnsmasq-dns" Dec 03 14:35:28 crc kubenswrapper[4751]: E1203 14:35:28.544397 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81e62a7a-f0c4-4a29-9b0f-838c6573d5b3" containerName="keystone-bootstrap" Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.544417 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="81e62a7a-f0c4-4a29-9b0f-838c6573d5b3" containerName="keystone-bootstrap" Dec 03 14:35:28 crc kubenswrapper[4751]: E1203 14:35:28.544442 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6637d261-2595-4256-9fc1-340de8705887" containerName="init" Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.544454 4751 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="6637d261-2595-4256-9fc1-340de8705887" containerName="init" Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.544718 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bcaaedc-9d9b-403a-946f-ef7ab8176360" containerName="init" Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.544756 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="81e62a7a-f0c4-4a29-9b0f-838c6573d5b3" containerName="keystone-bootstrap" Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.544774 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="6637d261-2595-4256-9fc1-340de8705887" containerName="dnsmasq-dns" Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.584060 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7kzxp"] Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.584119 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7kzxp" Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.586831 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.589666 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.589967 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.590160 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2bmt6" Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.644046 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de5612b5-9ee0-4da5-84b0-402ce8b1a163-combined-ca-bundle\") pod \"keystone-bootstrap-7kzxp\" (UID: 
\"de5612b5-9ee0-4da5-84b0-402ce8b1a163\") " pod="openstack/keystone-bootstrap-7kzxp" Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.644116 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de5612b5-9ee0-4da5-84b0-402ce8b1a163-scripts\") pod \"keystone-bootstrap-7kzxp\" (UID: \"de5612b5-9ee0-4da5-84b0-402ce8b1a163\") " pod="openstack/keystone-bootstrap-7kzxp" Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.644147 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/de5612b5-9ee0-4da5-84b0-402ce8b1a163-credential-keys\") pod \"keystone-bootstrap-7kzxp\" (UID: \"de5612b5-9ee0-4da5-84b0-402ce8b1a163\") " pod="openstack/keystone-bootstrap-7kzxp" Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.644162 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrfh2\" (UniqueName: \"kubernetes.io/projected/de5612b5-9ee0-4da5-84b0-402ce8b1a163-kube-api-access-jrfh2\") pod \"keystone-bootstrap-7kzxp\" (UID: \"de5612b5-9ee0-4da5-84b0-402ce8b1a163\") " pod="openstack/keystone-bootstrap-7kzxp" Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.644213 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/de5612b5-9ee0-4da5-84b0-402ce8b1a163-fernet-keys\") pod \"keystone-bootstrap-7kzxp\" (UID: \"de5612b5-9ee0-4da5-84b0-402ce8b1a163\") " pod="openstack/keystone-bootstrap-7kzxp" Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.644235 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de5612b5-9ee0-4da5-84b0-402ce8b1a163-config-data\") pod \"keystone-bootstrap-7kzxp\" (UID: 
\"de5612b5-9ee0-4da5-84b0-402ce8b1a163\") " pod="openstack/keystone-bootstrap-7kzxp" Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.745454 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/de5612b5-9ee0-4da5-84b0-402ce8b1a163-fernet-keys\") pod \"keystone-bootstrap-7kzxp\" (UID: \"de5612b5-9ee0-4da5-84b0-402ce8b1a163\") " pod="openstack/keystone-bootstrap-7kzxp" Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.745502 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de5612b5-9ee0-4da5-84b0-402ce8b1a163-config-data\") pod \"keystone-bootstrap-7kzxp\" (UID: \"de5612b5-9ee0-4da5-84b0-402ce8b1a163\") " pod="openstack/keystone-bootstrap-7kzxp" Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.745592 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de5612b5-9ee0-4da5-84b0-402ce8b1a163-combined-ca-bundle\") pod \"keystone-bootstrap-7kzxp\" (UID: \"de5612b5-9ee0-4da5-84b0-402ce8b1a163\") " pod="openstack/keystone-bootstrap-7kzxp" Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.745630 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de5612b5-9ee0-4da5-84b0-402ce8b1a163-scripts\") pod \"keystone-bootstrap-7kzxp\" (UID: \"de5612b5-9ee0-4da5-84b0-402ce8b1a163\") " pod="openstack/keystone-bootstrap-7kzxp" Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.745656 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/de5612b5-9ee0-4da5-84b0-402ce8b1a163-credential-keys\") pod \"keystone-bootstrap-7kzxp\" (UID: \"de5612b5-9ee0-4da5-84b0-402ce8b1a163\") " pod="openstack/keystone-bootstrap-7kzxp" Dec 03 14:35:28 crc kubenswrapper[4751]: 
I1203 14:35:28.745672 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrfh2\" (UniqueName: \"kubernetes.io/projected/de5612b5-9ee0-4da5-84b0-402ce8b1a163-kube-api-access-jrfh2\") pod \"keystone-bootstrap-7kzxp\" (UID: \"de5612b5-9ee0-4da5-84b0-402ce8b1a163\") " pod="openstack/keystone-bootstrap-7kzxp" Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.750340 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/de5612b5-9ee0-4da5-84b0-402ce8b1a163-fernet-keys\") pod \"keystone-bootstrap-7kzxp\" (UID: \"de5612b5-9ee0-4da5-84b0-402ce8b1a163\") " pod="openstack/keystone-bootstrap-7kzxp" Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.750502 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de5612b5-9ee0-4da5-84b0-402ce8b1a163-scripts\") pod \"keystone-bootstrap-7kzxp\" (UID: \"de5612b5-9ee0-4da5-84b0-402ce8b1a163\") " pod="openstack/keystone-bootstrap-7kzxp" Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.750538 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de5612b5-9ee0-4da5-84b0-402ce8b1a163-combined-ca-bundle\") pod \"keystone-bootstrap-7kzxp\" (UID: \"de5612b5-9ee0-4da5-84b0-402ce8b1a163\") " pod="openstack/keystone-bootstrap-7kzxp" Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.750829 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/de5612b5-9ee0-4da5-84b0-402ce8b1a163-credential-keys\") pod \"keystone-bootstrap-7kzxp\" (UID: \"de5612b5-9ee0-4da5-84b0-402ce8b1a163\") " pod="openstack/keystone-bootstrap-7kzxp" Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.751954 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/de5612b5-9ee0-4da5-84b0-402ce8b1a163-config-data\") pod \"keystone-bootstrap-7kzxp\" (UID: \"de5612b5-9ee0-4da5-84b0-402ce8b1a163\") " pod="openstack/keystone-bootstrap-7kzxp" Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.768503 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrfh2\" (UniqueName: \"kubernetes.io/projected/de5612b5-9ee0-4da5-84b0-402ce8b1a163-kube-api-access-jrfh2\") pod \"keystone-bootstrap-7kzxp\" (UID: \"de5612b5-9ee0-4da5-84b0-402ce8b1a163\") " pod="openstack/keystone-bootstrap-7kzxp" Dec 03 14:35:28 crc kubenswrapper[4751]: I1203 14:35:28.902213 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7kzxp" Dec 03 14:35:29 crc kubenswrapper[4751]: I1203 14:35:29.324367 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81e62a7a-f0c4-4a29-9b0f-838c6573d5b3" path="/var/lib/kubelet/pods/81e62a7a-f0c4-4a29-9b0f-838c6573d5b3/volumes" Dec 03 14:35:31 crc kubenswrapper[4751]: I1203 14:35:31.711581 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-jdm6k" podUID="791ff53a-abb1-4fc7-bb6a-59e11d2b1b58" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: connect: connection refused" Dec 03 14:35:32 crc kubenswrapper[4751]: E1203 14:35:32.702103 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Dec 03 14:35:32 crc kubenswrapper[4751]: E1203 14:35:32.702273 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-57q8w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-l4676_openstack(6b52c852-92ed-47f2-8e47-a9ac1e378698): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:35:32 crc kubenswrapper[4751]: E1203 14:35:32.704421 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-l4676" podUID="6b52c852-92ed-47f2-8e47-a9ac1e378698" Dec 03 14:35:32 crc kubenswrapper[4751]: E1203 14:35:32.836736 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-l4676" podUID="6b52c852-92ed-47f2-8e47-a9ac1e378698" Dec 03 14:35:36 crc kubenswrapper[4751]: I1203 14:35:36.711248 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-jdm6k" podUID="791ff53a-abb1-4fc7-bb6a-59e11d2b1b58" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: connect: connection refused" Dec 03 14:35:36 crc kubenswrapper[4751]: I1203 14:35:36.711955 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-jdm6k" Dec 03 14:35:40 crc kubenswrapper[4751]: I1203 14:35:40.000740 4751 generic.go:334] "Generic (PLEG): container finished" podID="44d1508a-da0e-46be-9f1d-583e1be8d864" containerID="e5f89e275c7378fd8bc629e5f986d230a3cb73a1eb06035c5a68369abfcfa9f0" exitCode=0 Dec 03 14:35:40 crc kubenswrapper[4751]: I1203 14:35:40.002165 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-g4glx" event={"ID":"44d1508a-da0e-46be-9f1d-583e1be8d864","Type":"ContainerDied","Data":"e5f89e275c7378fd8bc629e5f986d230a3cb73a1eb06035c5a68369abfcfa9f0"} Dec 
03 14:35:40 crc kubenswrapper[4751]: I1203 14:35:40.102903 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 14:35:40 crc kubenswrapper[4751]: I1203 14:35:40.103022 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 14:35:40 crc kubenswrapper[4751]: I1203 14:35:40.123908 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 14:35:40 crc kubenswrapper[4751]: I1203 14:35:40.123968 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 14:35:41 crc kubenswrapper[4751]: I1203 14:35:41.710652 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-jdm6k" podUID="791ff53a-abb1-4fc7-bb6a-59e11d2b1b58" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: connect: connection refused" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.350632 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.357030 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.526629 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clrsj\" (UniqueName: \"kubernetes.io/projected/7885fce3-109e-41cb-8b8d-b2ce9200d036-kube-api-access-clrsj\") pod \"7885fce3-109e-41cb-8b8d-b2ce9200d036\" (UID: \"7885fce3-109e-41cb-8b8d-b2ce9200d036\") " Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.526876 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\") pod \"1835bd6b-042f-4df1-b692-544a2765e20c\" (UID: \"1835bd6b-042f-4df1-b692-544a2765e20c\") " Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.526931 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1835bd6b-042f-4df1-b692-544a2765e20c-config-data\") pod \"1835bd6b-042f-4df1-b692-544a2765e20c\" (UID: \"1835bd6b-042f-4df1-b692-544a2765e20c\") " Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.526956 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1835bd6b-042f-4df1-b692-544a2765e20c-combined-ca-bundle\") pod \"1835bd6b-042f-4df1-b692-544a2765e20c\" (UID: \"1835bd6b-042f-4df1-b692-544a2765e20c\") " Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.526979 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1835bd6b-042f-4df1-b692-544a2765e20c-httpd-run\") pod \"1835bd6b-042f-4df1-b692-544a2765e20c\" (UID: \"1835bd6b-042f-4df1-b692-544a2765e20c\") " Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.527081 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd3c1578-235e-47c7-b720-afd92ef00308\") pod \"7885fce3-109e-41cb-8b8d-b2ce9200d036\" (UID: \"7885fce3-109e-41cb-8b8d-b2ce9200d036\") " Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.527111 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7885fce3-109e-41cb-8b8d-b2ce9200d036-internal-tls-certs\") pod \"7885fce3-109e-41cb-8b8d-b2ce9200d036\" (UID: \"7885fce3-109e-41cb-8b8d-b2ce9200d036\") " Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.527134 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7885fce3-109e-41cb-8b8d-b2ce9200d036-logs\") pod \"7885fce3-109e-41cb-8b8d-b2ce9200d036\" (UID: \"7885fce3-109e-41cb-8b8d-b2ce9200d036\") " Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.527180 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1835bd6b-042f-4df1-b692-544a2765e20c-public-tls-certs\") pod \"1835bd6b-042f-4df1-b692-544a2765e20c\" (UID: \"1835bd6b-042f-4df1-b692-544a2765e20c\") " Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.527208 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7885fce3-109e-41cb-8b8d-b2ce9200d036-scripts\") pod \"7885fce3-109e-41cb-8b8d-b2ce9200d036\" (UID: \"7885fce3-109e-41cb-8b8d-b2ce9200d036\") " Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.527230 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7885fce3-109e-41cb-8b8d-b2ce9200d036-config-data\") pod \"7885fce3-109e-41cb-8b8d-b2ce9200d036\" (UID: \"7885fce3-109e-41cb-8b8d-b2ce9200d036\") " Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 
14:35:42.527275 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7885fce3-109e-41cb-8b8d-b2ce9200d036-httpd-run\") pod \"7885fce3-109e-41cb-8b8d-b2ce9200d036\" (UID: \"7885fce3-109e-41cb-8b8d-b2ce9200d036\") " Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.527301 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7885fce3-109e-41cb-8b8d-b2ce9200d036-combined-ca-bundle\") pod \"7885fce3-109e-41cb-8b8d-b2ce9200d036\" (UID: \"7885fce3-109e-41cb-8b8d-b2ce9200d036\") " Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.527503 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74cw9\" (UniqueName: \"kubernetes.io/projected/1835bd6b-042f-4df1-b692-544a2765e20c-kube-api-access-74cw9\") pod \"1835bd6b-042f-4df1-b692-544a2765e20c\" (UID: \"1835bd6b-042f-4df1-b692-544a2765e20c\") " Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.527529 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1835bd6b-042f-4df1-b692-544a2765e20c-logs\") pod \"1835bd6b-042f-4df1-b692-544a2765e20c\" (UID: \"1835bd6b-042f-4df1-b692-544a2765e20c\") " Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.527574 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1835bd6b-042f-4df1-b692-544a2765e20c-scripts\") pod \"1835bd6b-042f-4df1-b692-544a2765e20c\" (UID: \"1835bd6b-042f-4df1-b692-544a2765e20c\") " Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.528183 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7885fce3-109e-41cb-8b8d-b2ce9200d036-logs" (OuterVolumeSpecName: "logs") pod "7885fce3-109e-41cb-8b8d-b2ce9200d036" (UID: 
"7885fce3-109e-41cb-8b8d-b2ce9200d036"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.529143 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7885fce3-109e-41cb-8b8d-b2ce9200d036-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7885fce3-109e-41cb-8b8d-b2ce9200d036" (UID: "7885fce3-109e-41cb-8b8d-b2ce9200d036"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.529745 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1835bd6b-042f-4df1-b692-544a2765e20c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1835bd6b-042f-4df1-b692-544a2765e20c" (UID: "1835bd6b-042f-4df1-b692-544a2765e20c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.530499 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1835bd6b-042f-4df1-b692-544a2765e20c-logs" (OuterVolumeSpecName: "logs") pod "1835bd6b-042f-4df1-b692-544a2765e20c" (UID: "1835bd6b-042f-4df1-b692-544a2765e20c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.533413 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7885fce3-109e-41cb-8b8d-b2ce9200d036-kube-api-access-clrsj" (OuterVolumeSpecName: "kube-api-access-clrsj") pod "7885fce3-109e-41cb-8b8d-b2ce9200d036" (UID: "7885fce3-109e-41cb-8b8d-b2ce9200d036"). InnerVolumeSpecName "kube-api-access-clrsj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.538179 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1835bd6b-042f-4df1-b692-544a2765e20c-scripts" (OuterVolumeSpecName: "scripts") pod "1835bd6b-042f-4df1-b692-544a2765e20c" (UID: "1835bd6b-042f-4df1-b692-544a2765e20c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.540065 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1835bd6b-042f-4df1-b692-544a2765e20c-kube-api-access-74cw9" (OuterVolumeSpecName: "kube-api-access-74cw9") pod "1835bd6b-042f-4df1-b692-544a2765e20c" (UID: "1835bd6b-042f-4df1-b692-544a2765e20c"). InnerVolumeSpecName "kube-api-access-74cw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.551928 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7885fce3-109e-41cb-8b8d-b2ce9200d036-scripts" (OuterVolumeSpecName: "scripts") pod "7885fce3-109e-41cb-8b8d-b2ce9200d036" (UID: "7885fce3-109e-41cb-8b8d-b2ce9200d036"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.565013 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6" (OuterVolumeSpecName: "glance") pod "1835bd6b-042f-4df1-b692-544a2765e20c" (UID: "1835bd6b-042f-4df1-b692-544a2765e20c"). InnerVolumeSpecName "pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.590199 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1835bd6b-042f-4df1-b692-544a2765e20c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1835bd6b-042f-4df1-b692-544a2765e20c" (UID: "1835bd6b-042f-4df1-b692-544a2765e20c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.592487 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd3c1578-235e-47c7-b720-afd92ef00308" (OuterVolumeSpecName: "glance") pod "7885fce3-109e-41cb-8b8d-b2ce9200d036" (UID: "7885fce3-109e-41cb-8b8d-b2ce9200d036"). InnerVolumeSpecName "pvc-cd3c1578-235e-47c7-b720-afd92ef00308". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.601578 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1835bd6b-042f-4df1-b692-544a2765e20c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1835bd6b-042f-4df1-b692-544a2765e20c" (UID: "1835bd6b-042f-4df1-b692-544a2765e20c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.629587 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7885fce3-109e-41cb-8b8d-b2ce9200d036-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7885fce3-109e-41cb-8b8d-b2ce9200d036" (UID: "7885fce3-109e-41cb-8b8d-b2ce9200d036"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.630142 4751 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1835bd6b-042f-4df1-b692-544a2765e20c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.630194 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7885fce3-109e-41cb-8b8d-b2ce9200d036-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.630206 4751 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7885fce3-109e-41cb-8b8d-b2ce9200d036-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.630215 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74cw9\" (UniqueName: \"kubernetes.io/projected/1835bd6b-042f-4df1-b692-544a2765e20c-kube-api-access-74cw9\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.630226 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1835bd6b-042f-4df1-b692-544a2765e20c-logs\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.630233 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1835bd6b-042f-4df1-b692-544a2765e20c-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.630242 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clrsj\" (UniqueName: \"kubernetes.io/projected/7885fce3-109e-41cb-8b8d-b2ce9200d036-kube-api-access-clrsj\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.630293 4751 reconciler_common.go:286] 
"operationExecutor.UnmountDevice started for volume \"pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\") on node \"crc\" " Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.630305 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1835bd6b-042f-4df1-b692-544a2765e20c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.630316 4751 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1835bd6b-042f-4df1-b692-544a2765e20c-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.630358 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-cd3c1578-235e-47c7-b720-afd92ef00308\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd3c1578-235e-47c7-b720-afd92ef00308\") on node \"crc\" " Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.630369 4751 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7885fce3-109e-41cb-8b8d-b2ce9200d036-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.630377 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7885fce3-109e-41cb-8b8d-b2ce9200d036-logs\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.641458 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1835bd6b-042f-4df1-b692-544a2765e20c-config-data" (OuterVolumeSpecName: "config-data") pod "1835bd6b-042f-4df1-b692-544a2765e20c" (UID: "1835bd6b-042f-4df1-b692-544a2765e20c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.645453 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7885fce3-109e-41cb-8b8d-b2ce9200d036-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7885fce3-109e-41cb-8b8d-b2ce9200d036" (UID: "7885fce3-109e-41cb-8b8d-b2ce9200d036"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.646033 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7885fce3-109e-41cb-8b8d-b2ce9200d036-config-data" (OuterVolumeSpecName: "config-data") pod "7885fce3-109e-41cb-8b8d-b2ce9200d036" (UID: "7885fce3-109e-41cb-8b8d-b2ce9200d036"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.661774 4751 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.661957 4751 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-cd3c1578-235e-47c7-b720-afd92ef00308" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd3c1578-235e-47c7-b720-afd92ef00308") on node "crc" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.662946 4751 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.663052 4751 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6") on node "crc" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.732865 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7885fce3-109e-41cb-8b8d-b2ce9200d036-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.732904 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7885fce3-109e-41cb-8b8d-b2ce9200d036-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.732916 4751 reconciler_common.go:293] "Volume detached for volume \"pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.732926 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1835bd6b-042f-4df1-b692-544a2765e20c-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.732935 4751 reconciler_common.go:293] "Volume detached for volume \"pvc-cd3c1578-235e-47c7-b720-afd92ef00308\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd3c1578-235e-47c7-b720-afd92ef00308\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:42 crc kubenswrapper[4751]: I1203 14:35:42.921596 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-g4glx" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.029698 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-g4glx" event={"ID":"44d1508a-da0e-46be-9f1d-583e1be8d864","Type":"ContainerDied","Data":"f8e42fb04b082e54710eedbb16fe97a008c5e553272660c8982c1e4b72c51ba1"} Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.029981 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8e42fb04b082e54710eedbb16fe97a008c5e553272660c8982c1e4b72c51ba1" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.029729 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-g4glx" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.031814 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1835bd6b-042f-4df1-b692-544a2765e20c","Type":"ContainerDied","Data":"2e9446c3ffd358ade23f81e2d72fbace7f2b98f911d259641028b8bbe7dbfd9b"} Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.031869 4751 scope.go:117] "RemoveContainer" containerID="93cae7b9d330d8f8e8787e04bcf651198522b823f752c4f8f57ffd7d422c0ad8" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.032037 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.040318 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/44d1508a-da0e-46be-9f1d-583e1be8d864-config\") pod \"44d1508a-da0e-46be-9f1d-583e1be8d864\" (UID: \"44d1508a-da0e-46be-9f1d-583e1be8d864\") " Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.040378 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d1508a-da0e-46be-9f1d-583e1be8d864-combined-ca-bundle\") pod \"44d1508a-da0e-46be-9f1d-583e1be8d864\" (UID: \"44d1508a-da0e-46be-9f1d-583e1be8d864\") " Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.040474 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm2hs\" (UniqueName: \"kubernetes.io/projected/44d1508a-da0e-46be-9f1d-583e1be8d864-kube-api-access-pm2hs\") pod \"44d1508a-da0e-46be-9f1d-583e1be8d864\" (UID: \"44d1508a-da0e-46be-9f1d-583e1be8d864\") " Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.045383 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44d1508a-da0e-46be-9f1d-583e1be8d864-kube-api-access-pm2hs" (OuterVolumeSpecName: "kube-api-access-pm2hs") pod "44d1508a-da0e-46be-9f1d-583e1be8d864" (UID: "44d1508a-da0e-46be-9f1d-583e1be8d864"). InnerVolumeSpecName "kube-api-access-pm2hs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.047359 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7885fce3-109e-41cb-8b8d-b2ce9200d036","Type":"ContainerDied","Data":"990714c95b91426dcbe678dd41bff790d5c9e603ec675c201ab4b2f10b466533"} Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.047397 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.080627 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.086418 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44d1508a-da0e-46be-9f1d-583e1be8d864-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44d1508a-da0e-46be-9f1d-583e1be8d864" (UID: "44d1508a-da0e-46be-9f1d-583e1be8d864"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.086862 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44d1508a-da0e-46be-9f1d-583e1be8d864-config" (OuterVolumeSpecName: "config") pod "44d1508a-da0e-46be-9f1d-583e1be8d864" (UID: "44d1508a-da0e-46be-9f1d-583e1be8d864"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.098849 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.112732 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 14:35:43 crc kubenswrapper[4751]: E1203 14:35:43.113215 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7885fce3-109e-41cb-8b8d-b2ce9200d036" containerName="glance-httpd" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.113238 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="7885fce3-109e-41cb-8b8d-b2ce9200d036" containerName="glance-httpd" Dec 03 14:35:43 crc kubenswrapper[4751]: E1203 14:35:43.113256 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d1508a-da0e-46be-9f1d-583e1be8d864" containerName="neutron-db-sync" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.113265 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d1508a-da0e-46be-9f1d-583e1be8d864" containerName="neutron-db-sync" Dec 03 14:35:43 crc kubenswrapper[4751]: E1203 14:35:43.113286 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1835bd6b-042f-4df1-b692-544a2765e20c" containerName="glance-httpd" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.113294 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="1835bd6b-042f-4df1-b692-544a2765e20c" containerName="glance-httpd" Dec 03 14:35:43 crc kubenswrapper[4751]: E1203 14:35:43.113309 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7885fce3-109e-41cb-8b8d-b2ce9200d036" containerName="glance-log" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.113317 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="7885fce3-109e-41cb-8b8d-b2ce9200d036" containerName="glance-log" Dec 03 14:35:43 crc kubenswrapper[4751]: E1203 
14:35:43.113362 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1835bd6b-042f-4df1-b692-544a2765e20c" containerName="glance-log" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.113370 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="1835bd6b-042f-4df1-b692-544a2765e20c" containerName="glance-log" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.113580 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="44d1508a-da0e-46be-9f1d-583e1be8d864" containerName="neutron-db-sync" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.113598 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="1835bd6b-042f-4df1-b692-544a2765e20c" containerName="glance-httpd" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.113615 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="7885fce3-109e-41cb-8b8d-b2ce9200d036" containerName="glance-httpd" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.113630 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="1835bd6b-042f-4df1-b692-544a2765e20c" containerName="glance-log" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.113651 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="7885fce3-109e-41cb-8b8d-b2ce9200d036" containerName="glance-log" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.117035 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.122382 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.122657 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.122770 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.122934 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-zbls2" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.125957 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.137698 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.146785 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/44d1508a-da0e-46be-9f1d-583e1be8d864-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.146821 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d1508a-da0e-46be-9f1d-583e1be8d864-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.146839 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm2hs\" (UniqueName: \"kubernetes.io/projected/44d1508a-da0e-46be-9f1d-583e1be8d864-kube-api-access-pm2hs\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.161108 4751 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.170774 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.173179 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.175532 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.176051 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.200590 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.251383 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-logs\") pod \"glance-default-external-api-0\" (UID: \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.251470 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.251610 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-public-tls-certs\") 
pod \"glance-default-external-api-0\" (UID: \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.251715 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpmq6\" (UniqueName: \"kubernetes.io/projected/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-kube-api-access-wpmq6\") pod \"glance-default-external-api-0\" (UID: \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.251824 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.251857 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-config-data\") pod \"glance-default-external-api-0\" (UID: \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.251888 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-scripts\") pod \"glance-default-external-api-0\" (UID: \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.251981 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\") pod \"glance-default-external-api-0\" (UID: \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.328299 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1835bd6b-042f-4df1-b692-544a2765e20c" path="/var/lib/kubelet/pods/1835bd6b-042f-4df1-b692-544a2765e20c/volumes" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.329008 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7885fce3-109e-41cb-8b8d-b2ce9200d036" path="/var/lib/kubelet/pods/7885fce3-109e-41cb-8b8d-b2ce9200d036/volumes" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.354074 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.354145 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.354166 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 
14:35:43.354213 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-cd3c1578-235e-47c7-b720-afd92ef00308\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd3c1578-235e-47c7-b720-afd92ef00308\") pod \"glance-default-internal-api-0\" (UID: \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.354242 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpmq6\" (UniqueName: \"kubernetes.io/projected/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-kube-api-access-wpmq6\") pod \"glance-default-external-api-0\" (UID: \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.354787 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.354840 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-config-data\") pod \"glance-default-external-api-0\" (UID: \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.354862 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-scripts\") pod \"glance-default-external-api-0\" (UID: \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:43 crc 
kubenswrapper[4751]: I1203 14:35:43.355399 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.355743 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\") pod \"glance-default-external-api-0\" (UID: \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.355882 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-logs\") pod \"glance-default-external-api-0\" (UID: \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.355899 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.355922 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\") " pod="openstack/glance-default-internal-api-0" Dec 03 
14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.355958 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66rtc\" (UniqueName: \"kubernetes.io/projected/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-kube-api-access-66rtc\") pod \"glance-default-internal-api-0\" (UID: \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.355998 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.356072 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-logs\") pod \"glance-default-internal-api-0\" (UID: \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.356274 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-logs\") pod \"glance-default-external-api-0\" (UID: \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.356515 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.359807 4751 
csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.359845 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\") pod \"glance-default-external-api-0\" (UID: \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/683a5c301520fbc15dbc2ee54d0de9b6296d1615a3bef9c1741aa8a387a031dd/globalmount\"" pod="openstack/glance-default-external-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.360627 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.363814 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-config-data\") pod \"glance-default-external-api-0\" (UID: \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.366195 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.374002 4751 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-wpmq6\" (UniqueName: \"kubernetes.io/projected/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-kube-api-access-wpmq6\") pod \"glance-default-external-api-0\" (UID: \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.380253 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-scripts\") pod \"glance-default-external-api-0\" (UID: \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.403876 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\") pod \"glance-default-external-api-0\" (UID: \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\") " pod="openstack/glance-default-external-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.446001 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.457617 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.457677 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.457698 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66rtc\" (UniqueName: \"kubernetes.io/projected/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-kube-api-access-66rtc\") pod \"glance-default-internal-api-0\" (UID: \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.457763 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-logs\") pod \"glance-default-internal-api-0\" (UID: \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.457811 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:43 
crc kubenswrapper[4751]: I1203 14:35:43.457846 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.457874 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-cd3c1578-235e-47c7-b720-afd92ef00308\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd3c1578-235e-47c7-b720-afd92ef00308\") pod \"glance-default-internal-api-0\" (UID: \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.458032 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.458246 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.458668 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-logs\") pod \"glance-default-internal-api-0\" (UID: \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 
14:35:43.465555 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.465597 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.465691 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.466173 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.466218 4751 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.466419 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-cd3c1578-235e-47c7-b720-afd92ef00308\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd3c1578-235e-47c7-b720-afd92ef00308\") pod \"glance-default-internal-api-0\" (UID: \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d3b26c55de8c52fd1f3bf024792f2005b72c5292706e21196d4a11b13179d08d/globalmount\"" pod="openstack/glance-default-internal-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.478799 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66rtc\" (UniqueName: \"kubernetes.io/projected/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-kube-api-access-66rtc\") pod \"glance-default-internal-api-0\" (UID: \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.520655 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-cd3c1578-235e-47c7-b720-afd92ef00308\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd3c1578-235e-47c7-b720-afd92ef00308\") pod \"glance-default-internal-api-0\" (UID: \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:35:43 crc kubenswrapper[4751]: I1203 14:35:43.799304 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.207247 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-46f56"] Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.209282 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-46f56" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.222382 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-46f56"] Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.296472 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qtpz\" (UniqueName: \"kubernetes.io/projected/0bc13bae-e4f7-49e3-9755-46e807f23efc-kube-api-access-6qtpz\") pod \"dnsmasq-dns-6b7b667979-46f56\" (UID: \"0bc13bae-e4f7-49e3-9755-46e807f23efc\") " pod="openstack/dnsmasq-dns-6b7b667979-46f56" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.296549 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bc13bae-e4f7-49e3-9755-46e807f23efc-config\") pod \"dnsmasq-dns-6b7b667979-46f56\" (UID: \"0bc13bae-e4f7-49e3-9755-46e807f23efc\") " pod="openstack/dnsmasq-dns-6b7b667979-46f56" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.296591 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0bc13bae-e4f7-49e3-9755-46e807f23efc-dns-svc\") pod \"dnsmasq-dns-6b7b667979-46f56\" (UID: \"0bc13bae-e4f7-49e3-9755-46e807f23efc\") " pod="openstack/dnsmasq-dns-6b7b667979-46f56" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.296620 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0bc13bae-e4f7-49e3-9755-46e807f23efc-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-46f56\" (UID: \"0bc13bae-e4f7-49e3-9755-46e807f23efc\") " pod="openstack/dnsmasq-dns-6b7b667979-46f56" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.296695 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0bc13bae-e4f7-49e3-9755-46e807f23efc-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-46f56\" (UID: \"0bc13bae-e4f7-49e3-9755-46e807f23efc\") " pod="openstack/dnsmasq-dns-6b7b667979-46f56" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.296725 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0bc13bae-e4f7-49e3-9755-46e807f23efc-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-46f56\" (UID: \"0bc13bae-e4f7-49e3-9755-46e807f23efc\") " pod="openstack/dnsmasq-dns-6b7b667979-46f56" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.385392 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-64fb975db4-zhnvr"] Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.387065 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-64fb975db4-zhnvr" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.393784 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-m4hm9" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.398634 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0bc13bae-e4f7-49e3-9755-46e807f23efc-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-46f56\" (UID: \"0bc13bae-e4f7-49e3-9755-46e807f23efc\") " pod="openstack/dnsmasq-dns-6b7b667979-46f56" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.402600 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.402691 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 
14:35:44.402822 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.403029 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0bc13bae-e4f7-49e3-9755-46e807f23efc-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-46f56\" (UID: \"0bc13bae-e4f7-49e3-9755-46e807f23efc\") " pod="openstack/dnsmasq-dns-6b7b667979-46f56" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.403191 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qtpz\" (UniqueName: \"kubernetes.io/projected/0bc13bae-e4f7-49e3-9755-46e807f23efc-kube-api-access-6qtpz\") pod \"dnsmasq-dns-6b7b667979-46f56\" (UID: \"0bc13bae-e4f7-49e3-9755-46e807f23efc\") " pod="openstack/dnsmasq-dns-6b7b667979-46f56" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.403254 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bc13bae-e4f7-49e3-9755-46e807f23efc-config\") pod \"dnsmasq-dns-6b7b667979-46f56\" (UID: \"0bc13bae-e4f7-49e3-9755-46e807f23efc\") " pod="openstack/dnsmasq-dns-6b7b667979-46f56" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.403362 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0bc13bae-e4f7-49e3-9755-46e807f23efc-dns-svc\") pod \"dnsmasq-dns-6b7b667979-46f56\" (UID: \"0bc13bae-e4f7-49e3-9755-46e807f23efc\") " pod="openstack/dnsmasq-dns-6b7b667979-46f56" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.403381 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0bc13bae-e4f7-49e3-9755-46e807f23efc-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-46f56\" (UID: 
\"0bc13bae-e4f7-49e3-9755-46e807f23efc\") " pod="openstack/dnsmasq-dns-6b7b667979-46f56" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.403910 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0bc13bae-e4f7-49e3-9755-46e807f23efc-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-46f56\" (UID: \"0bc13bae-e4f7-49e3-9755-46e807f23efc\") " pod="openstack/dnsmasq-dns-6b7b667979-46f56" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.404537 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0bc13bae-e4f7-49e3-9755-46e807f23efc-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-46f56\" (UID: \"0bc13bae-e4f7-49e3-9755-46e807f23efc\") " pod="openstack/dnsmasq-dns-6b7b667979-46f56" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.404739 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bc13bae-e4f7-49e3-9755-46e807f23efc-config\") pod \"dnsmasq-dns-6b7b667979-46f56\" (UID: \"0bc13bae-e4f7-49e3-9755-46e807f23efc\") " pod="openstack/dnsmasq-dns-6b7b667979-46f56" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.405811 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0bc13bae-e4f7-49e3-9755-46e807f23efc-dns-svc\") pod \"dnsmasq-dns-6b7b667979-46f56\" (UID: \"0bc13bae-e4f7-49e3-9755-46e807f23efc\") " pod="openstack/dnsmasq-dns-6b7b667979-46f56" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.408767 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0bc13bae-e4f7-49e3-9755-46e807f23efc-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-46f56\" (UID: \"0bc13bae-e4f7-49e3-9755-46e807f23efc\") " pod="openstack/dnsmasq-dns-6b7b667979-46f56" Dec 03 14:35:44 crc 
kubenswrapper[4751]: I1203 14:35:44.411261 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-64fb975db4-zhnvr"] Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.436180 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qtpz\" (UniqueName: \"kubernetes.io/projected/0bc13bae-e4f7-49e3-9755-46e807f23efc-kube-api-access-6qtpz\") pod \"dnsmasq-dns-6b7b667979-46f56\" (UID: \"0bc13bae-e4f7-49e3-9755-46e807f23efc\") " pod="openstack/dnsmasq-dns-6b7b667979-46f56" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.504414 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/90ae9890-60ca-4467-910c-a0d7459b189d-ovndb-tls-certs\") pod \"neutron-64fb975db4-zhnvr\" (UID: \"90ae9890-60ca-4467-910c-a0d7459b189d\") " pod="openstack/neutron-64fb975db4-zhnvr" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.504487 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90ae9890-60ca-4467-910c-a0d7459b189d-combined-ca-bundle\") pod \"neutron-64fb975db4-zhnvr\" (UID: \"90ae9890-60ca-4467-910c-a0d7459b189d\") " pod="openstack/neutron-64fb975db4-zhnvr" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.504515 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/90ae9890-60ca-4467-910c-a0d7459b189d-config\") pod \"neutron-64fb975db4-zhnvr\" (UID: \"90ae9890-60ca-4467-910c-a0d7459b189d\") " pod="openstack/neutron-64fb975db4-zhnvr" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.504548 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62jsr\" (UniqueName: 
\"kubernetes.io/projected/90ae9890-60ca-4467-910c-a0d7459b189d-kube-api-access-62jsr\") pod \"neutron-64fb975db4-zhnvr\" (UID: \"90ae9890-60ca-4467-910c-a0d7459b189d\") " pod="openstack/neutron-64fb975db4-zhnvr" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.504583 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/90ae9890-60ca-4467-910c-a0d7459b189d-httpd-config\") pod \"neutron-64fb975db4-zhnvr\" (UID: \"90ae9890-60ca-4467-910c-a0d7459b189d\") " pod="openstack/neutron-64fb975db4-zhnvr" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.538819 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-46f56" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.607087 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/90ae9890-60ca-4467-910c-a0d7459b189d-httpd-config\") pod \"neutron-64fb975db4-zhnvr\" (UID: \"90ae9890-60ca-4467-910c-a0d7459b189d\") " pod="openstack/neutron-64fb975db4-zhnvr" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.607412 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/90ae9890-60ca-4467-910c-a0d7459b189d-ovndb-tls-certs\") pod \"neutron-64fb975db4-zhnvr\" (UID: \"90ae9890-60ca-4467-910c-a0d7459b189d\") " pod="openstack/neutron-64fb975db4-zhnvr" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.607476 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90ae9890-60ca-4467-910c-a0d7459b189d-combined-ca-bundle\") pod \"neutron-64fb975db4-zhnvr\" (UID: \"90ae9890-60ca-4467-910c-a0d7459b189d\") " pod="openstack/neutron-64fb975db4-zhnvr" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.607507 4751 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/90ae9890-60ca-4467-910c-a0d7459b189d-config\") pod \"neutron-64fb975db4-zhnvr\" (UID: \"90ae9890-60ca-4467-910c-a0d7459b189d\") " pod="openstack/neutron-64fb975db4-zhnvr" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.607545 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62jsr\" (UniqueName: \"kubernetes.io/projected/90ae9890-60ca-4467-910c-a0d7459b189d-kube-api-access-62jsr\") pod \"neutron-64fb975db4-zhnvr\" (UID: \"90ae9890-60ca-4467-910c-a0d7459b189d\") " pod="openstack/neutron-64fb975db4-zhnvr" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.615387 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90ae9890-60ca-4467-910c-a0d7459b189d-combined-ca-bundle\") pod \"neutron-64fb975db4-zhnvr\" (UID: \"90ae9890-60ca-4467-910c-a0d7459b189d\") " pod="openstack/neutron-64fb975db4-zhnvr" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.618013 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/90ae9890-60ca-4467-910c-a0d7459b189d-ovndb-tls-certs\") pod \"neutron-64fb975db4-zhnvr\" (UID: \"90ae9890-60ca-4467-910c-a0d7459b189d\") " pod="openstack/neutron-64fb975db4-zhnvr" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.622394 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/90ae9890-60ca-4467-910c-a0d7459b189d-httpd-config\") pod \"neutron-64fb975db4-zhnvr\" (UID: \"90ae9890-60ca-4467-910c-a0d7459b189d\") " pod="openstack/neutron-64fb975db4-zhnvr" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.623051 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/90ae9890-60ca-4467-910c-a0d7459b189d-config\") pod \"neutron-64fb975db4-zhnvr\" (UID: \"90ae9890-60ca-4467-910c-a0d7459b189d\") " pod="openstack/neutron-64fb975db4-zhnvr" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.629082 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62jsr\" (UniqueName: \"kubernetes.io/projected/90ae9890-60ca-4467-910c-a0d7459b189d-kube-api-access-62jsr\") pod \"neutron-64fb975db4-zhnvr\" (UID: \"90ae9890-60ca-4467-910c-a0d7459b189d\") " pod="openstack/neutron-64fb975db4-zhnvr" Dec 03 14:35:44 crc kubenswrapper[4751]: I1203 14:35:44.716651 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-64fb975db4-zhnvr" Dec 03 14:35:45 crc kubenswrapper[4751]: E1203 14:35:45.176995 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 03 14:35:45 crc kubenswrapper[4751]: E1203 14:35:45.177154 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cjdz6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-5qnpr_openstack(4784bf8d-4315-4097-b729-1f21940a17bc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:35:45 crc kubenswrapper[4751]: E1203 14:35:45.178275 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-5qnpr" podUID="4784bf8d-4315-4097-b729-1f21940a17bc" Dec 03 14:35:45 crc kubenswrapper[4751]: I1203 14:35:45.187427 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-jdm6k" Dec 03 14:35:45 crc kubenswrapper[4751]: I1203 14:35:45.218719 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/791ff53a-abb1-4fc7-bb6a-59e11d2b1b58-dns-svc\") pod \"791ff53a-abb1-4fc7-bb6a-59e11d2b1b58\" (UID: \"791ff53a-abb1-4fc7-bb6a-59e11d2b1b58\") " Dec 03 14:35:45 crc kubenswrapper[4751]: I1203 14:35:45.218823 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/791ff53a-abb1-4fc7-bb6a-59e11d2b1b58-ovsdbserver-nb\") pod \"791ff53a-abb1-4fc7-bb6a-59e11d2b1b58\" (UID: \"791ff53a-abb1-4fc7-bb6a-59e11d2b1b58\") " Dec 03 14:35:45 crc kubenswrapper[4751]: I1203 14:35:45.218881 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfvjx\" (UniqueName: \"kubernetes.io/projected/791ff53a-abb1-4fc7-bb6a-59e11d2b1b58-kube-api-access-pfvjx\") pod \"791ff53a-abb1-4fc7-bb6a-59e11d2b1b58\" (UID: \"791ff53a-abb1-4fc7-bb6a-59e11d2b1b58\") " Dec 03 
14:35:45 crc kubenswrapper[4751]: I1203 14:35:45.244636 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/791ff53a-abb1-4fc7-bb6a-59e11d2b1b58-kube-api-access-pfvjx" (OuterVolumeSpecName: "kube-api-access-pfvjx") pod "791ff53a-abb1-4fc7-bb6a-59e11d2b1b58" (UID: "791ff53a-abb1-4fc7-bb6a-59e11d2b1b58"). InnerVolumeSpecName "kube-api-access-pfvjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:35:45 crc kubenswrapper[4751]: I1203 14:35:45.313389 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/791ff53a-abb1-4fc7-bb6a-59e11d2b1b58-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "791ff53a-abb1-4fc7-bb6a-59e11d2b1b58" (UID: "791ff53a-abb1-4fc7-bb6a-59e11d2b1b58"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:35:45 crc kubenswrapper[4751]: I1203 14:35:45.316532 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/791ff53a-abb1-4fc7-bb6a-59e11d2b1b58-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "791ff53a-abb1-4fc7-bb6a-59e11d2b1b58" (UID: "791ff53a-abb1-4fc7-bb6a-59e11d2b1b58"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:35:45 crc kubenswrapper[4751]: I1203 14:35:45.320128 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/791ff53a-abb1-4fc7-bb6a-59e11d2b1b58-config\") pod \"791ff53a-abb1-4fc7-bb6a-59e11d2b1b58\" (UID: \"791ff53a-abb1-4fc7-bb6a-59e11d2b1b58\") " Dec 03 14:35:45 crc kubenswrapper[4751]: I1203 14:35:45.320226 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/791ff53a-abb1-4fc7-bb6a-59e11d2b1b58-ovsdbserver-sb\") pod \"791ff53a-abb1-4fc7-bb6a-59e11d2b1b58\" (UID: \"791ff53a-abb1-4fc7-bb6a-59e11d2b1b58\") " Dec 03 14:35:45 crc kubenswrapper[4751]: I1203 14:35:45.320985 4751 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/791ff53a-abb1-4fc7-bb6a-59e11d2b1b58-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:45 crc kubenswrapper[4751]: I1203 14:35:45.321066 4751 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/791ff53a-abb1-4fc7-bb6a-59e11d2b1b58-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:45 crc kubenswrapper[4751]: I1203 14:35:45.321129 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfvjx\" (UniqueName: \"kubernetes.io/projected/791ff53a-abb1-4fc7-bb6a-59e11d2b1b58-kube-api-access-pfvjx\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:45 crc kubenswrapper[4751]: I1203 14:35:45.401289 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/791ff53a-abb1-4fc7-bb6a-59e11d2b1b58-config" (OuterVolumeSpecName: "config") pod "791ff53a-abb1-4fc7-bb6a-59e11d2b1b58" (UID: "791ff53a-abb1-4fc7-bb6a-59e11d2b1b58"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:35:45 crc kubenswrapper[4751]: I1203 14:35:45.411907 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/791ff53a-abb1-4fc7-bb6a-59e11d2b1b58-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "791ff53a-abb1-4fc7-bb6a-59e11d2b1b58" (UID: "791ff53a-abb1-4fc7-bb6a-59e11d2b1b58"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:35:45 crc kubenswrapper[4751]: I1203 14:35:45.422803 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/791ff53a-abb1-4fc7-bb6a-59e11d2b1b58-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:45 crc kubenswrapper[4751]: I1203 14:35:45.422838 4751 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/791ff53a-abb1-4fc7-bb6a-59e11d2b1b58-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:46 crc kubenswrapper[4751]: I1203 14:35:46.083173 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-jdm6k" event={"ID":"791ff53a-abb1-4fc7-bb6a-59e11d2b1b58","Type":"ContainerDied","Data":"e6912e5ffe96e9119992697d67de9f17890129c6f3a90b5e96fed31cb03dbd42"} Dec 03 14:35:46 crc kubenswrapper[4751]: I1203 14:35:46.083205 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-jdm6k" Dec 03 14:35:46 crc kubenswrapper[4751]: E1203 14:35:46.085975 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-5qnpr" podUID="4784bf8d-4315-4097-b729-1f21940a17bc" Dec 03 14:35:46 crc kubenswrapper[4751]: I1203 14:35:46.131705 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-jdm6k"] Dec 03 14:35:46 crc kubenswrapper[4751]: I1203 14:35:46.146850 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-jdm6k"] Dec 03 14:35:46 crc kubenswrapper[4751]: I1203 14:35:46.516656 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7bc6669df7-xxpxz"] Dec 03 14:35:46 crc kubenswrapper[4751]: E1203 14:35:46.517540 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="791ff53a-abb1-4fc7-bb6a-59e11d2b1b58" containerName="dnsmasq-dns" Dec 03 14:35:46 crc kubenswrapper[4751]: I1203 14:35:46.517555 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="791ff53a-abb1-4fc7-bb6a-59e11d2b1b58" containerName="dnsmasq-dns" Dec 03 14:35:46 crc kubenswrapper[4751]: E1203 14:35:46.517569 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="791ff53a-abb1-4fc7-bb6a-59e11d2b1b58" containerName="init" Dec 03 14:35:46 crc kubenswrapper[4751]: I1203 14:35:46.517575 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="791ff53a-abb1-4fc7-bb6a-59e11d2b1b58" containerName="init" Dec 03 14:35:46 crc kubenswrapper[4751]: I1203 14:35:46.517758 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="791ff53a-abb1-4fc7-bb6a-59e11d2b1b58" containerName="dnsmasq-dns" Dec 03 14:35:46 crc kubenswrapper[4751]: I1203 14:35:46.519107 4751 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/neutron-7bc6669df7-xxpxz" Dec 03 14:35:46 crc kubenswrapper[4751]: I1203 14:35:46.526442 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 03 14:35:46 crc kubenswrapper[4751]: I1203 14:35:46.526546 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 03 14:35:46 crc kubenswrapper[4751]: I1203 14:35:46.534992 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7bc6669df7-xxpxz"] Dec 03 14:35:46 crc kubenswrapper[4751]: I1203 14:35:46.657970 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/97f090c9-1ba2-45b8-9f01-c8372381b095-httpd-config\") pod \"neutron-7bc6669df7-xxpxz\" (UID: \"97f090c9-1ba2-45b8-9f01-c8372381b095\") " pod="openstack/neutron-7bc6669df7-xxpxz" Dec 03 14:35:46 crc kubenswrapper[4751]: I1203 14:35:46.658054 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97f090c9-1ba2-45b8-9f01-c8372381b095-internal-tls-certs\") pod \"neutron-7bc6669df7-xxpxz\" (UID: \"97f090c9-1ba2-45b8-9f01-c8372381b095\") " pod="openstack/neutron-7bc6669df7-xxpxz" Dec 03 14:35:46 crc kubenswrapper[4751]: I1203 14:35:46.658102 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/97f090c9-1ba2-45b8-9f01-c8372381b095-ovndb-tls-certs\") pod \"neutron-7bc6669df7-xxpxz\" (UID: \"97f090c9-1ba2-45b8-9f01-c8372381b095\") " pod="openstack/neutron-7bc6669df7-xxpxz" Dec 03 14:35:46 crc kubenswrapper[4751]: I1203 14:35:46.658269 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/97f090c9-1ba2-45b8-9f01-c8372381b095-config\") pod \"neutron-7bc6669df7-xxpxz\" (UID: \"97f090c9-1ba2-45b8-9f01-c8372381b095\") " pod="openstack/neutron-7bc6669df7-xxpxz" Dec 03 14:35:46 crc kubenswrapper[4751]: I1203 14:35:46.658372 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tj28\" (UniqueName: \"kubernetes.io/projected/97f090c9-1ba2-45b8-9f01-c8372381b095-kube-api-access-8tj28\") pod \"neutron-7bc6669df7-xxpxz\" (UID: \"97f090c9-1ba2-45b8-9f01-c8372381b095\") " pod="openstack/neutron-7bc6669df7-xxpxz" Dec 03 14:35:46 crc kubenswrapper[4751]: I1203 14:35:46.658393 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/97f090c9-1ba2-45b8-9f01-c8372381b095-public-tls-certs\") pod \"neutron-7bc6669df7-xxpxz\" (UID: \"97f090c9-1ba2-45b8-9f01-c8372381b095\") " pod="openstack/neutron-7bc6669df7-xxpxz" Dec 03 14:35:46 crc kubenswrapper[4751]: I1203 14:35:46.658461 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97f090c9-1ba2-45b8-9f01-c8372381b095-combined-ca-bundle\") pod \"neutron-7bc6669df7-xxpxz\" (UID: \"97f090c9-1ba2-45b8-9f01-c8372381b095\") " pod="openstack/neutron-7bc6669df7-xxpxz" Dec 03 14:35:46 crc kubenswrapper[4751]: I1203 14:35:46.760583 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tj28\" (UniqueName: \"kubernetes.io/projected/97f090c9-1ba2-45b8-9f01-c8372381b095-kube-api-access-8tj28\") pod \"neutron-7bc6669df7-xxpxz\" (UID: \"97f090c9-1ba2-45b8-9f01-c8372381b095\") " pod="openstack/neutron-7bc6669df7-xxpxz" Dec 03 14:35:46 crc kubenswrapper[4751]: I1203 14:35:46.760637 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/97f090c9-1ba2-45b8-9f01-c8372381b095-public-tls-certs\") pod \"neutron-7bc6669df7-xxpxz\" (UID: \"97f090c9-1ba2-45b8-9f01-c8372381b095\") " pod="openstack/neutron-7bc6669df7-xxpxz" Dec 03 14:35:46 crc kubenswrapper[4751]: I1203 14:35:46.760707 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97f090c9-1ba2-45b8-9f01-c8372381b095-combined-ca-bundle\") pod \"neutron-7bc6669df7-xxpxz\" (UID: \"97f090c9-1ba2-45b8-9f01-c8372381b095\") " pod="openstack/neutron-7bc6669df7-xxpxz" Dec 03 14:35:46 crc kubenswrapper[4751]: I1203 14:35:46.760751 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/97f090c9-1ba2-45b8-9f01-c8372381b095-httpd-config\") pod \"neutron-7bc6669df7-xxpxz\" (UID: \"97f090c9-1ba2-45b8-9f01-c8372381b095\") " pod="openstack/neutron-7bc6669df7-xxpxz" Dec 03 14:35:46 crc kubenswrapper[4751]: I1203 14:35:46.760782 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97f090c9-1ba2-45b8-9f01-c8372381b095-internal-tls-certs\") pod \"neutron-7bc6669df7-xxpxz\" (UID: \"97f090c9-1ba2-45b8-9f01-c8372381b095\") " pod="openstack/neutron-7bc6669df7-xxpxz" Dec 03 14:35:46 crc kubenswrapper[4751]: I1203 14:35:46.760815 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/97f090c9-1ba2-45b8-9f01-c8372381b095-ovndb-tls-certs\") pod \"neutron-7bc6669df7-xxpxz\" (UID: \"97f090c9-1ba2-45b8-9f01-c8372381b095\") " pod="openstack/neutron-7bc6669df7-xxpxz" Dec 03 14:35:46 crc kubenswrapper[4751]: I1203 14:35:46.760909 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/97f090c9-1ba2-45b8-9f01-c8372381b095-config\") pod 
\"neutron-7bc6669df7-xxpxz\" (UID: \"97f090c9-1ba2-45b8-9f01-c8372381b095\") " pod="openstack/neutron-7bc6669df7-xxpxz" Dec 03 14:35:46 crc kubenswrapper[4751]: I1203 14:35:46.772339 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97f090c9-1ba2-45b8-9f01-c8372381b095-internal-tls-certs\") pod \"neutron-7bc6669df7-xxpxz\" (UID: \"97f090c9-1ba2-45b8-9f01-c8372381b095\") " pod="openstack/neutron-7bc6669df7-xxpxz" Dec 03 14:35:46 crc kubenswrapper[4751]: I1203 14:35:46.772447 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/97f090c9-1ba2-45b8-9f01-c8372381b095-httpd-config\") pod \"neutron-7bc6669df7-xxpxz\" (UID: \"97f090c9-1ba2-45b8-9f01-c8372381b095\") " pod="openstack/neutron-7bc6669df7-xxpxz" Dec 03 14:35:46 crc kubenswrapper[4751]: I1203 14:35:46.773297 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/97f090c9-1ba2-45b8-9f01-c8372381b095-public-tls-certs\") pod \"neutron-7bc6669df7-xxpxz\" (UID: \"97f090c9-1ba2-45b8-9f01-c8372381b095\") " pod="openstack/neutron-7bc6669df7-xxpxz" Dec 03 14:35:46 crc kubenswrapper[4751]: I1203 14:35:46.774321 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/97f090c9-1ba2-45b8-9f01-c8372381b095-config\") pod \"neutron-7bc6669df7-xxpxz\" (UID: \"97f090c9-1ba2-45b8-9f01-c8372381b095\") " pod="openstack/neutron-7bc6669df7-xxpxz" Dec 03 14:35:46 crc kubenswrapper[4751]: I1203 14:35:46.790163 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tj28\" (UniqueName: \"kubernetes.io/projected/97f090c9-1ba2-45b8-9f01-c8372381b095-kube-api-access-8tj28\") pod \"neutron-7bc6669df7-xxpxz\" (UID: \"97f090c9-1ba2-45b8-9f01-c8372381b095\") " pod="openstack/neutron-7bc6669df7-xxpxz" Dec 03 14:35:46 crc 
kubenswrapper[4751]: I1203 14:35:46.790398 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/97f090c9-1ba2-45b8-9f01-c8372381b095-ovndb-tls-certs\") pod \"neutron-7bc6669df7-xxpxz\" (UID: \"97f090c9-1ba2-45b8-9f01-c8372381b095\") " pod="openstack/neutron-7bc6669df7-xxpxz" Dec 03 14:35:46 crc kubenswrapper[4751]: I1203 14:35:46.797839 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97f090c9-1ba2-45b8-9f01-c8372381b095-combined-ca-bundle\") pod \"neutron-7bc6669df7-xxpxz\" (UID: \"97f090c9-1ba2-45b8-9f01-c8372381b095\") " pod="openstack/neutron-7bc6669df7-xxpxz" Dec 03 14:35:46 crc kubenswrapper[4751]: I1203 14:35:46.841014 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7bc6669df7-xxpxz" Dec 03 14:35:47 crc kubenswrapper[4751]: I1203 14:35:47.328963 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="791ff53a-abb1-4fc7-bb6a-59e11d2b1b58" path="/var/lib/kubelet/pods/791ff53a-abb1-4fc7-bb6a-59e11d2b1b58/volumes" Dec 03 14:35:52 crc kubenswrapper[4751]: I1203 14:35:52.298076 4751 scope.go:117] "RemoveContainer" containerID="3e4d2f0875cd00f62dab0ed1c96bd3acd358a969220a9764cb5c0978471bf2f7" Dec 03 14:35:52 crc kubenswrapper[4751]: I1203 14:35:52.904956 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7kzxp"] Dec 03 14:35:55 crc kubenswrapper[4751]: W1203 14:35:55.409459 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde5612b5_9ee0_4da5_84b0_402ce8b1a163.slice/crio-8793b721e14dbffbf3e4b10caf7278e03a8c0282a5064c3b6b3777d6fc8ed753 WatchSource:0}: Error finding container 8793b721e14dbffbf3e4b10caf7278e03a8c0282a5064c3b6b3777d6fc8ed753: Status 404 returned error can't find the container with id 
8793b721e14dbffbf3e4b10caf7278e03a8c0282a5064c3b6b3777d6fc8ed753 Dec 03 14:35:55 crc kubenswrapper[4751]: E1203 14:35:55.451234 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Dec 03 14:35:55 crc kubenswrapper[4751]: E1203 14:35:55.451291 4751 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Dec 03 14:35:55 crc kubenswrapper[4751]: E1203 14:35:55.451435 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/v
ar/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zcplz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-bgs8x_openstack(4df7d14f-4a52-43be-9877-c5df9c015cc7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:35:55 crc kubenswrapper[4751]: E1203 14:35:55.452580 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cloudkitty-db-sync-bgs8x" podUID="4df7d14f-4a52-43be-9877-c5df9c015cc7" Dec 03 14:35:55 crc kubenswrapper[4751]: I1203 14:35:55.491155 4751 scope.go:117] "RemoveContainer" containerID="89bb5175d76c1f666da9960b1defa85471308461a788d486f9542dbdebad378d" Dec 03 14:35:55 crc kubenswrapper[4751]: I1203 14:35:55.696072 4751 scope.go:117] "RemoveContainer" 
containerID="c2c84c98688cfa061a82e7f8c31682383dfff608f36cb82696022a2e4b6a0528" Dec 03 14:35:55 crc kubenswrapper[4751]: I1203 14:35:55.761502 4751 scope.go:117] "RemoveContainer" containerID="6c2e401c92d038de3a392d5e003f097d04bdb2900597efb2d2acdade9546d339" Dec 03 14:35:55 crc kubenswrapper[4751]: I1203 14:35:55.800065 4751 scope.go:117] "RemoveContainer" containerID="d5455fd4736f3358c46f6726abb1d727db7e38e6bf0a356841dab5b6d1ea12b4" Dec 03 14:35:55 crc kubenswrapper[4751]: I1203 14:35:55.940784 4751 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 14:35:55 crc kubenswrapper[4751]: I1203 14:35:55.948426 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-46f56"] Dec 03 14:35:56 crc kubenswrapper[4751]: I1203 14:35:56.141522 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 14:35:56 crc kubenswrapper[4751]: W1203 14:35:56.166182 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d746a7a_b7c6_47d1_b4cf_6e424fb565d7.slice/crio-cf3d3bba20dc3e8746e50cfb9c67342a389c06688b9bd339e35820fdfd22d702 WatchSource:0}: Error finding container cf3d3bba20dc3e8746e50cfb9c67342a389c06688b9bd339e35820fdfd22d702: Status 404 returned error can't find the container with id cf3d3bba20dc3e8746e50cfb9c67342a389c06688b9bd339e35820fdfd22d702 Dec 03 14:35:56 crc kubenswrapper[4751]: I1203 14:35:56.242215 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-46f56" event={"ID":"0bc13bae-e4f7-49e3-9755-46e807f23efc","Type":"ContainerStarted","Data":"5cf1b16da5e5690acf3447f4d02fe2592b436cc14a3a4560715cfb032c662bed"} Dec 03 14:35:56 crc kubenswrapper[4751]: I1203 14:35:56.242257 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-46f56" 
event={"ID":"0bc13bae-e4f7-49e3-9755-46e807f23efc","Type":"ContainerStarted","Data":"22d787fff3a5fccf5753b2d9579a6e6e112041d6dce988ef6107d9d65a995222"} Dec 03 14:35:56 crc kubenswrapper[4751]: I1203 14:35:56.247066 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wbwn4" event={"ID":"cbe09550-cc72-4fe9-af45-b39fbcac540d","Type":"ContainerStarted","Data":"99ed7eb161b12788b367b86c391e8674f70f3493789083d8cd49b477a968c1cc"} Dec 03 14:35:56 crc kubenswrapper[4751]: I1203 14:35:56.249315 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7bc6669df7-xxpxz"] Dec 03 14:35:56 crc kubenswrapper[4751]: I1203 14:35:56.264915 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-l4676" event={"ID":"6b52c852-92ed-47f2-8e47-a9ac1e378698","Type":"ContainerStarted","Data":"dcf778ddadc5f54d3034973dcacf17fd0a1ff3587c86cf599022d0cd3691425f"} Dec 03 14:35:56 crc kubenswrapper[4751]: I1203 14:35:56.273143 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e2515a0-a5d9-4c4a-b686-1d011708c96d","Type":"ContainerStarted","Data":"76e8fcf8f19c4ff5ea6a00e922893fc2693c320e5ba8e08a5908d288add03d4f"} Dec 03 14:35:56 crc kubenswrapper[4751]: I1203 14:35:56.276249 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7","Type":"ContainerStarted","Data":"cf3d3bba20dc3e8746e50cfb9c67342a389c06688b9bd339e35820fdfd22d702"} Dec 03 14:35:56 crc kubenswrapper[4751]: I1203 14:35:56.293031 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-wbwn4" podStartSLOduration=10.302321063 podStartE2EDuration="48.293010334s" podCreationTimestamp="2025-12-03 14:35:08 +0000 UTC" firstStartedPulling="2025-12-03 14:35:10.395094476 +0000 UTC m=+1317.383449693" lastFinishedPulling="2025-12-03 14:35:48.385783747 +0000 UTC 
m=+1355.374138964" observedRunningTime="2025-12-03 14:35:56.278159139 +0000 UTC m=+1363.266514356" watchObservedRunningTime="2025-12-03 14:35:56.293010334 +0000 UTC m=+1363.281365551" Dec 03 14:35:56 crc kubenswrapper[4751]: W1203 14:35:56.297828 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97f090c9_1ba2_45b8_9f01_c8372381b095.slice/crio-e1634ec6e32ce4d597b75245d87df390533a22199717636671246fcb78de8731 WatchSource:0}: Error finding container e1634ec6e32ce4d597b75245d87df390533a22199717636671246fcb78de8731: Status 404 returned error can't find the container with id e1634ec6e32ce4d597b75245d87df390533a22199717636671246fcb78de8731 Dec 03 14:35:56 crc kubenswrapper[4751]: I1203 14:35:56.303991 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7kzxp" event={"ID":"de5612b5-9ee0-4da5-84b0-402ce8b1a163","Type":"ContainerStarted","Data":"3e4ed144a1aee447944ebfd6b9128868ad08ae8df64f5ba6dcb3cc573652eeeb"} Dec 03 14:35:56 crc kubenswrapper[4751]: I1203 14:35:56.304028 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7kzxp" event={"ID":"de5612b5-9ee0-4da5-84b0-402ce8b1a163","Type":"ContainerStarted","Data":"8793b721e14dbffbf3e4b10caf7278e03a8c0282a5064c3b6b3777d6fc8ed753"} Dec 03 14:35:56 crc kubenswrapper[4751]: E1203 14:35:56.306456 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-bgs8x" podUID="4df7d14f-4a52-43be-9877-c5df9c015cc7" Dec 03 14:35:56 crc kubenswrapper[4751]: I1203 14:35:56.318868 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-l4676" podStartSLOduration=3.7779010680000003 
podStartE2EDuration="48.318848248s" podCreationTimestamp="2025-12-03 14:35:08 +0000 UTC" firstStartedPulling="2025-12-03 14:35:10.991695188 +0000 UTC m=+1317.980050405" lastFinishedPulling="2025-12-03 14:35:55.532642368 +0000 UTC m=+1362.520997585" observedRunningTime="2025-12-03 14:35:56.298416811 +0000 UTC m=+1363.286772048" watchObservedRunningTime="2025-12-03 14:35:56.318848248 +0000 UTC m=+1363.307203465" Dec 03 14:35:56 crc kubenswrapper[4751]: I1203 14:35:56.380407 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 14:35:56 crc kubenswrapper[4751]: I1203 14:35:56.386291 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-7kzxp" podStartSLOduration=28.386268826 podStartE2EDuration="28.386268826s" podCreationTimestamp="2025-12-03 14:35:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:35:56.357923193 +0000 UTC m=+1363.346278410" watchObservedRunningTime="2025-12-03 14:35:56.386268826 +0000 UTC m=+1363.374624043" Dec 03 14:35:56 crc kubenswrapper[4751]: I1203 14:35:56.813128 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-64fb975db4-zhnvr"] Dec 03 14:35:56 crc kubenswrapper[4751]: W1203 14:35:56.836140 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90ae9890_60ca_4467_910c_a0d7459b189d.slice/crio-3ca3081b173f0212f9e4cd608bcaad8bbedf8591db454971c5d63f45ab290861 WatchSource:0}: Error finding container 3ca3081b173f0212f9e4cd608bcaad8bbedf8591db454971c5d63f45ab290861: Status 404 returned error can't find the container with id 3ca3081b173f0212f9e4cd608bcaad8bbedf8591db454971c5d63f45ab290861 Dec 03 14:35:57 crc kubenswrapper[4751]: I1203 14:35:57.323591 4751 generic.go:334] "Generic (PLEG): container finished" 
podID="0bc13bae-e4f7-49e3-9755-46e807f23efc" containerID="5cf1b16da5e5690acf3447f4d02fe2592b436cc14a3a4560715cfb032c662bed" exitCode=0 Dec 03 14:35:57 crc kubenswrapper[4751]: I1203 14:35:57.326910 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a","Type":"ContainerStarted","Data":"e8267190d2ccb384f28b2eb880c31fbb45afb0c63eabbaa7e67326fe03fd2486"} Dec 03 14:35:57 crc kubenswrapper[4751]: I1203 14:35:57.326966 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7","Type":"ContainerStarted","Data":"8ecd12696dd1ff10d3b60df57fc95800b104034b7e5b7d8c19398d87bbce3060"} Dec 03 14:35:57 crc kubenswrapper[4751]: I1203 14:35:57.327023 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-46f56" event={"ID":"0bc13bae-e4f7-49e3-9755-46e807f23efc","Type":"ContainerDied","Data":"5cf1b16da5e5690acf3447f4d02fe2592b436cc14a3a4560715cfb032c662bed"} Dec 03 14:35:57 crc kubenswrapper[4751]: I1203 14:35:57.328564 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bc6669df7-xxpxz" event={"ID":"97f090c9-1ba2-45b8-9f01-c8372381b095","Type":"ContainerStarted","Data":"7b1ce6eddbc2fd567efb5426f3a46a9ccbd268608dd8035c20803a2917ed1031"} Dec 03 14:35:57 crc kubenswrapper[4751]: I1203 14:35:57.328616 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bc6669df7-xxpxz" event={"ID":"97f090c9-1ba2-45b8-9f01-c8372381b095","Type":"ContainerStarted","Data":"1a85e0a334e8e2f073a2eda6726ef25fdbf0beddd0e7fc52c889e2ce6a42a6ba"} Dec 03 14:35:57 crc kubenswrapper[4751]: I1203 14:35:57.328630 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bc6669df7-xxpxz" 
event={"ID":"97f090c9-1ba2-45b8-9f01-c8372381b095","Type":"ContainerStarted","Data":"e1634ec6e32ce4d597b75245d87df390533a22199717636671246fcb78de8731"} Dec 03 14:35:57 crc kubenswrapper[4751]: I1203 14:35:57.329634 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7bc6669df7-xxpxz" Dec 03 14:35:57 crc kubenswrapper[4751]: I1203 14:35:57.332227 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64fb975db4-zhnvr" event={"ID":"90ae9890-60ca-4467-910c-a0d7459b189d","Type":"ContainerStarted","Data":"f9df3666dcd520f70624df73a6db9a57b8a82ef464068a5628d967755ab6d597"} Dec 03 14:35:57 crc kubenswrapper[4751]: I1203 14:35:57.332283 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64fb975db4-zhnvr" event={"ID":"90ae9890-60ca-4467-910c-a0d7459b189d","Type":"ContainerStarted","Data":"3ca3081b173f0212f9e4cd608bcaad8bbedf8591db454971c5d63f45ab290861"} Dec 03 14:35:58 crc kubenswrapper[4751]: I1203 14:35:58.336816 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7bc6669df7-xxpxz" podStartSLOduration=12.336792451 podStartE2EDuration="12.336792451s" podCreationTimestamp="2025-12-03 14:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:35:57.372645482 +0000 UTC m=+1364.361000699" watchObservedRunningTime="2025-12-03 14:35:58.336792451 +0000 UTC m=+1365.325147668" Dec 03 14:35:58 crc kubenswrapper[4751]: I1203 14:35:58.372450 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a","Type":"ContainerStarted","Data":"e1ca1f4fb94804711e025ec4962bd4c9fbddd9e59a5f7453aed42cc7775cf35d"} Dec 03 14:36:01 crc kubenswrapper[4751]: I1203 14:36:01.192899 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8svh4"] 
Dec 03 14:36:01 crc kubenswrapper[4751]: I1203 14:36:01.196409 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8svh4" Dec 03 14:36:01 crc kubenswrapper[4751]: I1203 14:36:01.207006 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8svh4"] Dec 03 14:36:01 crc kubenswrapper[4751]: I1203 14:36:01.308551 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebf27f9b-b0e5-4b2a-af11-a6b6251374bd-catalog-content\") pod \"redhat-marketplace-8svh4\" (UID: \"ebf27f9b-b0e5-4b2a-af11-a6b6251374bd\") " pod="openshift-marketplace/redhat-marketplace-8svh4" Dec 03 14:36:01 crc kubenswrapper[4751]: I1203 14:36:01.308717 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf8dd\" (UniqueName: \"kubernetes.io/projected/ebf27f9b-b0e5-4b2a-af11-a6b6251374bd-kube-api-access-tf8dd\") pod \"redhat-marketplace-8svh4\" (UID: \"ebf27f9b-b0e5-4b2a-af11-a6b6251374bd\") " pod="openshift-marketplace/redhat-marketplace-8svh4" Dec 03 14:36:01 crc kubenswrapper[4751]: I1203 14:36:01.309114 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebf27f9b-b0e5-4b2a-af11-a6b6251374bd-utilities\") pod \"redhat-marketplace-8svh4\" (UID: \"ebf27f9b-b0e5-4b2a-af11-a6b6251374bd\") " pod="openshift-marketplace/redhat-marketplace-8svh4" Dec 03 14:36:01 crc kubenswrapper[4751]: I1203 14:36:01.410668 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf8dd\" (UniqueName: \"kubernetes.io/projected/ebf27f9b-b0e5-4b2a-af11-a6b6251374bd-kube-api-access-tf8dd\") pod \"redhat-marketplace-8svh4\" (UID: \"ebf27f9b-b0e5-4b2a-af11-a6b6251374bd\") " 
pod="openshift-marketplace/redhat-marketplace-8svh4" Dec 03 14:36:01 crc kubenswrapper[4751]: I1203 14:36:01.410876 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebf27f9b-b0e5-4b2a-af11-a6b6251374bd-utilities\") pod \"redhat-marketplace-8svh4\" (UID: \"ebf27f9b-b0e5-4b2a-af11-a6b6251374bd\") " pod="openshift-marketplace/redhat-marketplace-8svh4" Dec 03 14:36:01 crc kubenswrapper[4751]: I1203 14:36:01.410961 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebf27f9b-b0e5-4b2a-af11-a6b6251374bd-catalog-content\") pod \"redhat-marketplace-8svh4\" (UID: \"ebf27f9b-b0e5-4b2a-af11-a6b6251374bd\") " pod="openshift-marketplace/redhat-marketplace-8svh4" Dec 03 14:36:01 crc kubenswrapper[4751]: I1203 14:36:01.411462 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebf27f9b-b0e5-4b2a-af11-a6b6251374bd-utilities\") pod \"redhat-marketplace-8svh4\" (UID: \"ebf27f9b-b0e5-4b2a-af11-a6b6251374bd\") " pod="openshift-marketplace/redhat-marketplace-8svh4" Dec 03 14:36:01 crc kubenswrapper[4751]: I1203 14:36:01.411486 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebf27f9b-b0e5-4b2a-af11-a6b6251374bd-catalog-content\") pod \"redhat-marketplace-8svh4\" (UID: \"ebf27f9b-b0e5-4b2a-af11-a6b6251374bd\") " pod="openshift-marketplace/redhat-marketplace-8svh4" Dec 03 14:36:01 crc kubenswrapper[4751]: I1203 14:36:01.430780 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf8dd\" (UniqueName: \"kubernetes.io/projected/ebf27f9b-b0e5-4b2a-af11-a6b6251374bd-kube-api-access-tf8dd\") pod \"redhat-marketplace-8svh4\" (UID: \"ebf27f9b-b0e5-4b2a-af11-a6b6251374bd\") " pod="openshift-marketplace/redhat-marketplace-8svh4" Dec 03 
14:36:01 crc kubenswrapper[4751]: I1203 14:36:01.515110 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8svh4" Dec 03 14:36:02 crc kubenswrapper[4751]: I1203 14:36:02.421402 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64fb975db4-zhnvr" event={"ID":"90ae9890-60ca-4467-910c-a0d7459b189d","Type":"ContainerStarted","Data":"cc583d54ccd93238db50746e8b35597e2cdc5954daec6159a5ee2eea6c733687"} Dec 03 14:36:02 crc kubenswrapper[4751]: I1203 14:36:02.421850 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-64fb975db4-zhnvr" Dec 03 14:36:02 crc kubenswrapper[4751]: I1203 14:36:02.450598 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-64fb975db4-zhnvr" podStartSLOduration=18.450563248 podStartE2EDuration="18.450563248s" podCreationTimestamp="2025-12-03 14:35:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:36:02.440588396 +0000 UTC m=+1369.428943613" watchObservedRunningTime="2025-12-03 14:36:02.450563248 +0000 UTC m=+1369.438918465" Dec 03 14:36:04 crc kubenswrapper[4751]: I1203 14:36:04.192085 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dkx6d"] Dec 03 14:36:04 crc kubenswrapper[4751]: I1203 14:36:04.195486 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dkx6d" Dec 03 14:36:04 crc kubenswrapper[4751]: I1203 14:36:04.212259 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dkx6d"] Dec 03 14:36:04 crc kubenswrapper[4751]: I1203 14:36:04.275104 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4241fb0b-9af8-430d-a244-84a16d0679e2-catalog-content\") pod \"certified-operators-dkx6d\" (UID: \"4241fb0b-9af8-430d-a244-84a16d0679e2\") " pod="openshift-marketplace/certified-operators-dkx6d" Dec 03 14:36:04 crc kubenswrapper[4751]: I1203 14:36:04.275285 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4241fb0b-9af8-430d-a244-84a16d0679e2-utilities\") pod \"certified-operators-dkx6d\" (UID: \"4241fb0b-9af8-430d-a244-84a16d0679e2\") " pod="openshift-marketplace/certified-operators-dkx6d" Dec 03 14:36:04 crc kubenswrapper[4751]: I1203 14:36:04.275368 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqssn\" (UniqueName: \"kubernetes.io/projected/4241fb0b-9af8-430d-a244-84a16d0679e2-kube-api-access-zqssn\") pod \"certified-operators-dkx6d\" (UID: \"4241fb0b-9af8-430d-a244-84a16d0679e2\") " pod="openshift-marketplace/certified-operators-dkx6d" Dec 03 14:36:04 crc kubenswrapper[4751]: I1203 14:36:04.377547 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4241fb0b-9af8-430d-a244-84a16d0679e2-catalog-content\") pod \"certified-operators-dkx6d\" (UID: \"4241fb0b-9af8-430d-a244-84a16d0679e2\") " pod="openshift-marketplace/certified-operators-dkx6d" Dec 03 14:36:04 crc kubenswrapper[4751]: I1203 14:36:04.377650 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4241fb0b-9af8-430d-a244-84a16d0679e2-utilities\") pod \"certified-operators-dkx6d\" (UID: \"4241fb0b-9af8-430d-a244-84a16d0679e2\") " pod="openshift-marketplace/certified-operators-dkx6d" Dec 03 14:36:04 crc kubenswrapper[4751]: I1203 14:36:04.377694 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqssn\" (UniqueName: \"kubernetes.io/projected/4241fb0b-9af8-430d-a244-84a16d0679e2-kube-api-access-zqssn\") pod \"certified-operators-dkx6d\" (UID: \"4241fb0b-9af8-430d-a244-84a16d0679e2\") " pod="openshift-marketplace/certified-operators-dkx6d" Dec 03 14:36:04 crc kubenswrapper[4751]: I1203 14:36:04.378219 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4241fb0b-9af8-430d-a244-84a16d0679e2-catalog-content\") pod \"certified-operators-dkx6d\" (UID: \"4241fb0b-9af8-430d-a244-84a16d0679e2\") " pod="openshift-marketplace/certified-operators-dkx6d" Dec 03 14:36:04 crc kubenswrapper[4751]: I1203 14:36:04.378354 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4241fb0b-9af8-430d-a244-84a16d0679e2-utilities\") pod \"certified-operators-dkx6d\" (UID: \"4241fb0b-9af8-430d-a244-84a16d0679e2\") " pod="openshift-marketplace/certified-operators-dkx6d" Dec 03 14:36:04 crc kubenswrapper[4751]: I1203 14:36:04.399785 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqssn\" (UniqueName: \"kubernetes.io/projected/4241fb0b-9af8-430d-a244-84a16d0679e2-kube-api-access-zqssn\") pod \"certified-operators-dkx6d\" (UID: \"4241fb0b-9af8-430d-a244-84a16d0679e2\") " pod="openshift-marketplace/certified-operators-dkx6d" Dec 03 14:36:04 crc kubenswrapper[4751]: I1203 14:36:04.443920 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6b7b667979-46f56" event={"ID":"0bc13bae-e4f7-49e3-9755-46e807f23efc","Type":"ContainerStarted","Data":"a5d2cf4ddcb1ea926592f27cd654fa69e80a380a65308fd335d8d3df7ee85403"} Dec 03 14:36:04 crc kubenswrapper[4751]: I1203 14:36:04.529647 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dkx6d" Dec 03 14:36:05 crc kubenswrapper[4751]: I1203 14:36:05.392386 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8svh4"] Dec 03 14:36:05 crc kubenswrapper[4751]: I1203 14:36:05.589709 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dkx6d"] Dec 03 14:36:06 crc kubenswrapper[4751]: I1203 14:36:06.497937 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a","Type":"ContainerStarted","Data":"7b8f252ac2252a13c2b95e00ae25166e9c02cad0ddfa1fb84db0cb35d58a0579"} Dec 03 14:36:06 crc kubenswrapper[4751]: I1203 14:36:06.508564 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7","Type":"ContainerStarted","Data":"213f3abea670d09075a44aaf4b02a870150762baace3817d75c365bae632ca11"} Dec 03 14:36:06 crc kubenswrapper[4751]: I1203 14:36:06.508735 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-46f56" Dec 03 14:36:06 crc kubenswrapper[4751]: I1203 14:36:06.527208 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=23.527179744 podStartE2EDuration="23.527179744s" podCreationTimestamp="2025-12-03 14:35:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 
14:36:06.526609609 +0000 UTC m=+1373.514964846" watchObservedRunningTime="2025-12-03 14:36:06.527179744 +0000 UTC m=+1373.515534971" Dec 03 14:36:06 crc kubenswrapper[4751]: I1203 14:36:06.553755 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-46f56" podStartSLOduration=22.553737068 podStartE2EDuration="22.553737068s" podCreationTimestamp="2025-12-03 14:35:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:36:06.54352132 +0000 UTC m=+1373.531876537" watchObservedRunningTime="2025-12-03 14:36:06.553737068 +0000 UTC m=+1373.542092285" Dec 03 14:36:06 crc kubenswrapper[4751]: I1203 14:36:06.565400 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=23.565378275 podStartE2EDuration="23.565378275s" podCreationTimestamp="2025-12-03 14:35:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:36:06.562660411 +0000 UTC m=+1373.551015628" watchObservedRunningTime="2025-12-03 14:36:06.565378275 +0000 UTC m=+1373.553733492" Dec 03 14:36:07 crc kubenswrapper[4751]: W1203 14:36:07.548921 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebf27f9b_b0e5_4b2a_af11_a6b6251374bd.slice/crio-bb25b274ede8ad8f40274483c4d892e45061d70c127a4b14d220849396d7aece WatchSource:0}: Error finding container bb25b274ede8ad8f40274483c4d892e45061d70c127a4b14d220849396d7aece: Status 404 returned error can't find the container with id bb25b274ede8ad8f40274483c4d892e45061d70c127a4b14d220849396d7aece Dec 03 14:36:07 crc kubenswrapper[4751]: W1203 14:36:07.550172 4751 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4241fb0b_9af8_430d_a244_84a16d0679e2.slice/crio-4e86b206bfea513c6a48a03ed016075e965a0cd75c4eb14b3f1a5a4d2e9b71ee WatchSource:0}: Error finding container 4e86b206bfea513c6a48a03ed016075e965a0cd75c4eb14b3f1a5a4d2e9b71ee: Status 404 returned error can't find the container with id 4e86b206bfea513c6a48a03ed016075e965a0cd75c4eb14b3f1a5a4d2e9b71ee Dec 03 14:36:08 crc kubenswrapper[4751]: I1203 14:36:08.528043 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-bgs8x" event={"ID":"4df7d14f-4a52-43be-9877-c5df9c015cc7","Type":"ContainerStarted","Data":"f74eebf05afb03acb40dab92e5dd6d14e9c474658599c36667cf2b6599a9e351"} Dec 03 14:36:08 crc kubenswrapper[4751]: I1203 14:36:08.529943 4751 generic.go:334] "Generic (PLEG): container finished" podID="4241fb0b-9af8-430d-a244-84a16d0679e2" containerID="e1837ff5c8ea3f9b5d5627900a3591bb075ffb12d0114d378fd2795158400fb6" exitCode=0 Dec 03 14:36:08 crc kubenswrapper[4751]: I1203 14:36:08.529981 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dkx6d" event={"ID":"4241fb0b-9af8-430d-a244-84a16d0679e2","Type":"ContainerDied","Data":"e1837ff5c8ea3f9b5d5627900a3591bb075ffb12d0114d378fd2795158400fb6"} Dec 03 14:36:08 crc kubenswrapper[4751]: I1203 14:36:08.530007 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dkx6d" event={"ID":"4241fb0b-9af8-430d-a244-84a16d0679e2","Type":"ContainerStarted","Data":"4e86b206bfea513c6a48a03ed016075e965a0cd75c4eb14b3f1a5a4d2e9b71ee"} Dec 03 14:36:08 crc kubenswrapper[4751]: I1203 14:36:08.531298 4751 generic.go:334] "Generic (PLEG): container finished" podID="ebf27f9b-b0e5-4b2a-af11-a6b6251374bd" containerID="d66a018b9aa11ce7875e010546612fdc17aab858f72aa637ba7683da83b8a777" exitCode=0 Dec 03 14:36:08 crc kubenswrapper[4751]: I1203 14:36:08.531366 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-8svh4" event={"ID":"ebf27f9b-b0e5-4b2a-af11-a6b6251374bd","Type":"ContainerDied","Data":"d66a018b9aa11ce7875e010546612fdc17aab858f72aa637ba7683da83b8a777"} Dec 03 14:36:08 crc kubenswrapper[4751]: I1203 14:36:08.531399 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8svh4" event={"ID":"ebf27f9b-b0e5-4b2a-af11-a6b6251374bd","Type":"ContainerStarted","Data":"bb25b274ede8ad8f40274483c4d892e45061d70c127a4b14d220849396d7aece"} Dec 03 14:36:08 crc kubenswrapper[4751]: I1203 14:36:08.534878 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e2515a0-a5d9-4c4a-b686-1d011708c96d","Type":"ContainerStarted","Data":"02f12f7ea14c7367523ffa5e2fa5161c5c6dd54d4b09459fdbdce4fb40524372"} Dec 03 14:36:08 crc kubenswrapper[4751]: I1203 14:36:08.544482 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-5qnpr" event={"ID":"4784bf8d-4315-4097-b729-1f21940a17bc","Type":"ContainerStarted","Data":"1c63976e3db62586993d795217316bd30f339b8441e6b64054a5db80e8519f5c"} Dec 03 14:36:08 crc kubenswrapper[4751]: I1203 14:36:08.559109 4751 generic.go:334] "Generic (PLEG): container finished" podID="de5612b5-9ee0-4da5-84b0-402ce8b1a163" containerID="3e4ed144a1aee447944ebfd6b9128868ad08ae8df64f5ba6dcb3cc573652eeeb" exitCode=0 Dec 03 14:36:08 crc kubenswrapper[4751]: I1203 14:36:08.559153 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7kzxp" event={"ID":"de5612b5-9ee0-4da5-84b0-402ce8b1a163","Type":"ContainerDied","Data":"3e4ed144a1aee447944ebfd6b9128868ad08ae8df64f5ba6dcb3cc573652eeeb"} Dec 03 14:36:08 crc kubenswrapper[4751]: I1203 14:36:08.559818 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-bgs8x" podStartSLOduration=3.793632167 podStartE2EDuration="1m0.559805288s" podCreationTimestamp="2025-12-03 14:35:08 +0000 UTC" 
firstStartedPulling="2025-12-03 14:35:10.991220815 +0000 UTC m=+1317.979576032" lastFinishedPulling="2025-12-03 14:36:07.757393936 +0000 UTC m=+1374.745749153" observedRunningTime="2025-12-03 14:36:08.549349213 +0000 UTC m=+1375.537704440" watchObservedRunningTime="2025-12-03 14:36:08.559805288 +0000 UTC m=+1375.548160505" Dec 03 14:36:08 crc kubenswrapper[4751]: I1203 14:36:08.610766 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-5qnpr" podStartSLOduration=3.3005000349999998 podStartE2EDuration="1m0.610744016s" podCreationTimestamp="2025-12-03 14:35:08 +0000 UTC" firstStartedPulling="2025-12-03 14:35:10.284629055 +0000 UTC m=+1317.272984272" lastFinishedPulling="2025-12-03 14:36:07.594873016 +0000 UTC m=+1374.583228253" observedRunningTime="2025-12-03 14:36:08.596915109 +0000 UTC m=+1375.585270336" watchObservedRunningTime="2025-12-03 14:36:08.610744016 +0000 UTC m=+1375.599099233" Dec 03 14:36:09 crc kubenswrapper[4751]: I1203 14:36:09.541477 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7b667979-46f56" Dec 03 14:36:09 crc kubenswrapper[4751]: I1203 14:36:09.603115 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-mj9pc"] Dec 03 14:36:09 crc kubenswrapper[4751]: I1203 14:36:09.603343 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-mj9pc" podUID="276feef5-736d-4d00-a9af-e81eaf2a0285" containerName="dnsmasq-dns" containerID="cri-o://057d49ea4b08ade87c3d87bdb552c83904d430fa181aecafdc11fb3f9da399bb" gracePeriod=10 Dec 03 14:36:09 crc kubenswrapper[4751]: I1203 14:36:09.633135 4751 generic.go:334] "Generic (PLEG): container finished" podID="6b52c852-92ed-47f2-8e47-a9ac1e378698" containerID="dcf778ddadc5f54d3034973dcacf17fd0a1ff3587c86cf599022d0cd3691425f" exitCode=0 Dec 03 14:36:09 crc kubenswrapper[4751]: I1203 14:36:09.633270 4751 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-l4676" event={"ID":"6b52c852-92ed-47f2-8e47-a9ac1e378698","Type":"ContainerDied","Data":"dcf778ddadc5f54d3034973dcacf17fd0a1ff3587c86cf599022d0cd3691425f"} Dec 03 14:36:09 crc kubenswrapper[4751]: I1203 14:36:09.659374 4751 generic.go:334] "Generic (PLEG): container finished" podID="cbe09550-cc72-4fe9-af45-b39fbcac540d" containerID="99ed7eb161b12788b367b86c391e8674f70f3493789083d8cd49b477a968c1cc" exitCode=0 Dec 03 14:36:09 crc kubenswrapper[4751]: I1203 14:36:09.659416 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wbwn4" event={"ID":"cbe09550-cc72-4fe9-af45-b39fbcac540d","Type":"ContainerDied","Data":"99ed7eb161b12788b367b86c391e8674f70f3493789083d8cd49b477a968c1cc"} Dec 03 14:36:10 crc kubenswrapper[4751]: I1203 14:36:10.680778 4751 generic.go:334] "Generic (PLEG): container finished" podID="4241fb0b-9af8-430d-a244-84a16d0679e2" containerID="b8cf759ce2b56229818e53e6bdf44f78ad50e3cfc5557e1fa89f1d693f981415" exitCode=0 Dec 03 14:36:10 crc kubenswrapper[4751]: I1203 14:36:10.680954 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dkx6d" event={"ID":"4241fb0b-9af8-430d-a244-84a16d0679e2","Type":"ContainerDied","Data":"b8cf759ce2b56229818e53e6bdf44f78ad50e3cfc5557e1fa89f1d693f981415"} Dec 03 14:36:10 crc kubenswrapper[4751]: I1203 14:36:10.689108 4751 generic.go:334] "Generic (PLEG): container finished" podID="276feef5-736d-4d00-a9af-e81eaf2a0285" containerID="057d49ea4b08ade87c3d87bdb552c83904d430fa181aecafdc11fb3f9da399bb" exitCode=0 Dec 03 14:36:10 crc kubenswrapper[4751]: I1203 14:36:10.689189 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-mj9pc" event={"ID":"276feef5-736d-4d00-a9af-e81eaf2a0285","Type":"ContainerDied","Data":"057d49ea4b08ade87c3d87bdb552c83904d430fa181aecafdc11fb3f9da399bb"} Dec 03 14:36:10 crc kubenswrapper[4751]: I1203 14:36:10.693478 4751 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8svh4" event={"ID":"ebf27f9b-b0e5-4b2a-af11-a6b6251374bd","Type":"ContainerDied","Data":"a27d8a74d017a8dd04a9d7f039b9182d4448b513b259336e7897cfd618ee8bda"} Dec 03 14:36:10 crc kubenswrapper[4751]: I1203 14:36:10.693167 4751 generic.go:334] "Generic (PLEG): container finished" podID="ebf27f9b-b0e5-4b2a-af11-a6b6251374bd" containerID="a27d8a74d017a8dd04a9d7f039b9182d4448b513b259336e7897cfd618ee8bda" exitCode=0 Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.447624 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.448302 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.448316 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.448338 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.494861 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.500575 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.584922 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-mj9pc" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.598174 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-l4676" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.603433 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wbwn4" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.717983 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/276feef5-736d-4d00-a9af-e81eaf2a0285-config\") pod \"276feef5-736d-4d00-a9af-e81eaf2a0285\" (UID: \"276feef5-736d-4d00-a9af-e81eaf2a0285\") " Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.718388 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b52c852-92ed-47f2-8e47-a9ac1e378698-config-data\") pod \"6b52c852-92ed-47f2-8e47-a9ac1e378698\" (UID: \"6b52c852-92ed-47f2-8e47-a9ac1e378698\") " Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.718429 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b52c852-92ed-47f2-8e47-a9ac1e378698-scripts\") pod \"6b52c852-92ed-47f2-8e47-a9ac1e378698\" (UID: \"6b52c852-92ed-47f2-8e47-a9ac1e378698\") " Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.718501 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57q8w\" (UniqueName: \"kubernetes.io/projected/6b52c852-92ed-47f2-8e47-a9ac1e378698-kube-api-access-57q8w\") pod \"6b52c852-92ed-47f2-8e47-a9ac1e378698\" (UID: \"6b52c852-92ed-47f2-8e47-a9ac1e378698\") " Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.718538 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/276feef5-736d-4d00-a9af-e81eaf2a0285-dns-swift-storage-0\") pod \"276feef5-736d-4d00-a9af-e81eaf2a0285\" (UID: \"276feef5-736d-4d00-a9af-e81eaf2a0285\") 
" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.718568 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b52c852-92ed-47f2-8e47-a9ac1e378698-combined-ca-bundle\") pod \"6b52c852-92ed-47f2-8e47-a9ac1e378698\" (UID: \"6b52c852-92ed-47f2-8e47-a9ac1e378698\") " Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.718591 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hcvs\" (UniqueName: \"kubernetes.io/projected/276feef5-736d-4d00-a9af-e81eaf2a0285-kube-api-access-7hcvs\") pod \"276feef5-736d-4d00-a9af-e81eaf2a0285\" (UID: \"276feef5-736d-4d00-a9af-e81eaf2a0285\") " Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.718609 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w52mq\" (UniqueName: \"kubernetes.io/projected/cbe09550-cc72-4fe9-af45-b39fbcac540d-kube-api-access-w52mq\") pod \"cbe09550-cc72-4fe9-af45-b39fbcac540d\" (UID: \"cbe09550-cc72-4fe9-af45-b39fbcac540d\") " Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.718651 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cbe09550-cc72-4fe9-af45-b39fbcac540d-db-sync-config-data\") pod \"cbe09550-cc72-4fe9-af45-b39fbcac540d\" (UID: \"cbe09550-cc72-4fe9-af45-b39fbcac540d\") " Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.718678 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe09550-cc72-4fe9-af45-b39fbcac540d-combined-ca-bundle\") pod \"cbe09550-cc72-4fe9-af45-b39fbcac540d\" (UID: \"cbe09550-cc72-4fe9-af45-b39fbcac540d\") " Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.718743 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/6b52c852-92ed-47f2-8e47-a9ac1e378698-logs\") pod \"6b52c852-92ed-47f2-8e47-a9ac1e378698\" (UID: \"6b52c852-92ed-47f2-8e47-a9ac1e378698\") " Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.718780 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/276feef5-736d-4d00-a9af-e81eaf2a0285-dns-svc\") pod \"276feef5-736d-4d00-a9af-e81eaf2a0285\" (UID: \"276feef5-736d-4d00-a9af-e81eaf2a0285\") " Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.718828 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/276feef5-736d-4d00-a9af-e81eaf2a0285-ovsdbserver-sb\") pod \"276feef5-736d-4d00-a9af-e81eaf2a0285\" (UID: \"276feef5-736d-4d00-a9af-e81eaf2a0285\") " Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.718850 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/276feef5-736d-4d00-a9af-e81eaf2a0285-ovsdbserver-nb\") pod \"276feef5-736d-4d00-a9af-e81eaf2a0285\" (UID: \"276feef5-736d-4d00-a9af-e81eaf2a0285\") " Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.725973 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b52c852-92ed-47f2-8e47-a9ac1e378698-logs" (OuterVolumeSpecName: "logs") pod "6b52c852-92ed-47f2-8e47-a9ac1e378698" (UID: "6b52c852-92ed-47f2-8e47-a9ac1e378698"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.733148 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe09550-cc72-4fe9-af45-b39fbcac540d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "cbe09550-cc72-4fe9-af45-b39fbcac540d" (UID: "cbe09550-cc72-4fe9-af45-b39fbcac540d"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.740623 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b52c852-92ed-47f2-8e47-a9ac1e378698-kube-api-access-57q8w" (OuterVolumeSpecName: "kube-api-access-57q8w") pod "6b52c852-92ed-47f2-8e47-a9ac1e378698" (UID: "6b52c852-92ed-47f2-8e47-a9ac1e378698"). InnerVolumeSpecName "kube-api-access-57q8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.740925 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/276feef5-736d-4d00-a9af-e81eaf2a0285-kube-api-access-7hcvs" (OuterVolumeSpecName: "kube-api-access-7hcvs") pod "276feef5-736d-4d00-a9af-e81eaf2a0285" (UID: "276feef5-736d-4d00-a9af-e81eaf2a0285"). InnerVolumeSpecName "kube-api-access-7hcvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.744671 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbe09550-cc72-4fe9-af45-b39fbcac540d-kube-api-access-w52mq" (OuterVolumeSpecName: "kube-api-access-w52mq") pod "cbe09550-cc72-4fe9-af45-b39fbcac540d" (UID: "cbe09550-cc72-4fe9-af45-b39fbcac540d"). InnerVolumeSpecName "kube-api-access-w52mq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.754342 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-7kzxp" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.760631 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7kzxp" event={"ID":"de5612b5-9ee0-4da5-84b0-402ce8b1a163","Type":"ContainerDied","Data":"8793b721e14dbffbf3e4b10caf7278e03a8c0282a5064c3b6b3777d6fc8ed753"} Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.760687 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8793b721e14dbffbf3e4b10caf7278e03a8c0282a5064c3b6b3777d6fc8ed753" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.760682 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7kzxp" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.772669 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b52c852-92ed-47f2-8e47-a9ac1e378698-scripts" (OuterVolumeSpecName: "scripts") pod "6b52c852-92ed-47f2-8e47-a9ac1e378698" (UID: "6b52c852-92ed-47f2-8e47-a9ac1e378698"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.778780 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wbwn4" event={"ID":"cbe09550-cc72-4fe9-af45-b39fbcac540d","Type":"ContainerDied","Data":"85728848ee55c4a068dc3edb62bcd0dfa0363666e541a0b4ce8801210725d820"} Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.778861 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85728848ee55c4a068dc3edb62bcd0dfa0363666e541a0b4ce8801210725d820" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.778995 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-wbwn4" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.786230 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-mj9pc" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.786205 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-mj9pc" event={"ID":"276feef5-736d-4d00-a9af-e81eaf2a0285","Type":"ContainerDied","Data":"6b787dd6f7139cc6c338e83681dd76c71327c3c40b4fe4d5404a7ae362465c60"} Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.786346 4751 scope.go:117] "RemoveContainer" containerID="057d49ea4b08ade87c3d87bdb552c83904d430fa181aecafdc11fb3f9da399bb" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.791969 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-l4676" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.792436 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-l4676" event={"ID":"6b52c852-92ed-47f2-8e47-a9ac1e378698","Type":"ContainerDied","Data":"a681677b3b4ecb206685fb244b1d6346ed40766228d6b26c74bc7b7ff37acb94"} Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.792497 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a681677b3b4ecb206685fb244b1d6346ed40766228d6b26c74bc7b7ff37acb94" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.799517 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.810081 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.810242 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 
03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.810254 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.821758 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrfh2\" (UniqueName: \"kubernetes.io/projected/de5612b5-9ee0-4da5-84b0-402ce8b1a163-kube-api-access-jrfh2\") pod \"de5612b5-9ee0-4da5-84b0-402ce8b1a163\" (UID: \"de5612b5-9ee0-4da5-84b0-402ce8b1a163\") " Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.821813 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de5612b5-9ee0-4da5-84b0-402ce8b1a163-scripts\") pod \"de5612b5-9ee0-4da5-84b0-402ce8b1a163\" (UID: \"de5612b5-9ee0-4da5-84b0-402ce8b1a163\") " Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.821889 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de5612b5-9ee0-4da5-84b0-402ce8b1a163-config-data\") pod \"de5612b5-9ee0-4da5-84b0-402ce8b1a163\" (UID: \"de5612b5-9ee0-4da5-84b0-402ce8b1a163\") " Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.821911 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de5612b5-9ee0-4da5-84b0-402ce8b1a163-combined-ca-bundle\") pod \"de5612b5-9ee0-4da5-84b0-402ce8b1a163\" (UID: \"de5612b5-9ee0-4da5-84b0-402ce8b1a163\") " Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.821991 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/de5612b5-9ee0-4da5-84b0-402ce8b1a163-fernet-keys\") pod \"de5612b5-9ee0-4da5-84b0-402ce8b1a163\" (UID: \"de5612b5-9ee0-4da5-84b0-402ce8b1a163\") " Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.822070 
4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/de5612b5-9ee0-4da5-84b0-402ce8b1a163-credential-keys\") pod \"de5612b5-9ee0-4da5-84b0-402ce8b1a163\" (UID: \"de5612b5-9ee0-4da5-84b0-402ce8b1a163\") " Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.822644 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b52c852-92ed-47f2-8e47-a9ac1e378698-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.822665 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57q8w\" (UniqueName: \"kubernetes.io/projected/6b52c852-92ed-47f2-8e47-a9ac1e378698-kube-api-access-57q8w\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.822677 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hcvs\" (UniqueName: \"kubernetes.io/projected/276feef5-736d-4d00-a9af-e81eaf2a0285-kube-api-access-7hcvs\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.822690 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w52mq\" (UniqueName: \"kubernetes.io/projected/cbe09550-cc72-4fe9-af45-b39fbcac540d-kube-api-access-w52mq\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.822703 4751 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cbe09550-cc72-4fe9-af45-b39fbcac540d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.822715 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b52c852-92ed-47f2-8e47-a9ac1e378698-logs\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.822684 4751 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b52c852-92ed-47f2-8e47-a9ac1e378698-config-data" (OuterVolumeSpecName: "config-data") pod "6b52c852-92ed-47f2-8e47-a9ac1e378698" (UID: "6b52c852-92ed-47f2-8e47-a9ac1e378698"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.826197 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de5612b5-9ee0-4da5-84b0-402ce8b1a163-kube-api-access-jrfh2" (OuterVolumeSpecName: "kube-api-access-jrfh2") pod "de5612b5-9ee0-4da5-84b0-402ce8b1a163" (UID: "de5612b5-9ee0-4da5-84b0-402ce8b1a163"). InnerVolumeSpecName "kube-api-access-jrfh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.835456 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de5612b5-9ee0-4da5-84b0-402ce8b1a163-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "de5612b5-9ee0-4da5-84b0-402ce8b1a163" (UID: "de5612b5-9ee0-4da5-84b0-402ce8b1a163"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.841247 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de5612b5-9ee0-4da5-84b0-402ce8b1a163-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "de5612b5-9ee0-4da5-84b0-402ce8b1a163" (UID: "de5612b5-9ee0-4da5-84b0-402ce8b1a163"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.861535 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de5612b5-9ee0-4da5-84b0-402ce8b1a163-scripts" (OuterVolumeSpecName: "scripts") pod "de5612b5-9ee0-4da5-84b0-402ce8b1a163" (UID: "de5612b5-9ee0-4da5-84b0-402ce8b1a163"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.868290 4751 scope.go:117] "RemoveContainer" containerID="009618b36cec9e7aac50b9da194717c12033e1fab63046c3c865e34643526b68" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.924101 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de5612b5-9ee0-4da5-84b0-402ce8b1a163-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.924128 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b52c852-92ed-47f2-8e47-a9ac1e378698-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.924137 4751 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/de5612b5-9ee0-4da5-84b0-402ce8b1a163-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.924147 4751 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/de5612b5-9ee0-4da5-84b0-402ce8b1a163-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.924156 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrfh2\" (UniqueName: \"kubernetes.io/projected/de5612b5-9ee0-4da5-84b0-402ce8b1a163-kube-api-access-jrfh2\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:13 crc 
kubenswrapper[4751]: I1203 14:36:13.958954 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe09550-cc72-4fe9-af45-b39fbcac540d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cbe09550-cc72-4fe9-af45-b39fbcac540d" (UID: "cbe09550-cc72-4fe9-af45-b39fbcac540d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.977482 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.978863 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b52c852-92ed-47f2-8e47-a9ac1e378698-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b52c852-92ed-47f2-8e47-a9ac1e378698" (UID: "6b52c852-92ed-47f2-8e47-a9ac1e378698"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:13 crc kubenswrapper[4751]: I1203 14:36:13.984982 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.029532 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b52c852-92ed-47f2-8e47-a9ac1e378698-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.029559 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe09550-cc72-4fe9-af45-b39fbcac540d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.053545 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/276feef5-736d-4d00-a9af-e81eaf2a0285-config" (OuterVolumeSpecName: "config") pod "276feef5-736d-4d00-a9af-e81eaf2a0285" (UID: "276feef5-736d-4d00-a9af-e81eaf2a0285"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.131083 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/276feef5-736d-4d00-a9af-e81eaf2a0285-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.135290 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de5612b5-9ee0-4da5-84b0-402ce8b1a163-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de5612b5-9ee0-4da5-84b0-402ce8b1a163" (UID: "de5612b5-9ee0-4da5-84b0-402ce8b1a163"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.138584 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/276feef5-736d-4d00-a9af-e81eaf2a0285-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "276feef5-736d-4d00-a9af-e81eaf2a0285" (UID: "276feef5-736d-4d00-a9af-e81eaf2a0285"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.142838 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/276feef5-736d-4d00-a9af-e81eaf2a0285-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "276feef5-736d-4d00-a9af-e81eaf2a0285" (UID: "276feef5-736d-4d00-a9af-e81eaf2a0285"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.152516 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de5612b5-9ee0-4da5-84b0-402ce8b1a163-config-data" (OuterVolumeSpecName: "config-data") pod "de5612b5-9ee0-4da5-84b0-402ce8b1a163" (UID: "de5612b5-9ee0-4da5-84b0-402ce8b1a163"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.155590 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/276feef5-736d-4d00-a9af-e81eaf2a0285-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "276feef5-736d-4d00-a9af-e81eaf2a0285" (UID: "276feef5-736d-4d00-a9af-e81eaf2a0285"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.157440 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/276feef5-736d-4d00-a9af-e81eaf2a0285-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "276feef5-736d-4d00-a9af-e81eaf2a0285" (UID: "276feef5-736d-4d00-a9af-e81eaf2a0285"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.233460 4751 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/276feef5-736d-4d00-a9af-e81eaf2a0285-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.233765 4751 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/276feef5-736d-4d00-a9af-e81eaf2a0285-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.233851 4751 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/276feef5-736d-4d00-a9af-e81eaf2a0285-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.233931 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de5612b5-9ee0-4da5-84b0-402ce8b1a163-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.234013 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de5612b5-9ee0-4da5-84b0-402ce8b1a163-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.234087 4751 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/276feef5-736d-4d00-a9af-e81eaf2a0285-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.430072 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-mj9pc"] Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.450669 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-mj9pc"] Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.730514 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-64fb975db4-zhnvr" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.839252 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8svh4" event={"ID":"ebf27f9b-b0e5-4b2a-af11-a6b6251374bd","Type":"ContainerStarted","Data":"2cd6e68e561871d96e82407b49eaa74d41888981969007b2a4ea1772ae66f465"} Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.901037 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e2515a0-a5d9-4c4a-b686-1d011708c96d","Type":"ContainerStarted","Data":"14f04502b0f362737565c4c435a8553a96ea535887b091113225455159452a02"} Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.904849 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dkx6d" event={"ID":"4241fb0b-9af8-430d-a244-84a16d0679e2","Type":"ContainerStarted","Data":"4952abca2b7320f189e31ad0dd6a539cd7c0fca8cba3cfa6e05e2b7d46a9c878"} Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.959687 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-cfc86c59b-x4m2l"] Dec 03 14:36:14 crc kubenswrapper[4751]: E1203 14:36:14.960394 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbe09550-cc72-4fe9-af45-b39fbcac540d" containerName="barbican-db-sync" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.960416 
4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbe09550-cc72-4fe9-af45-b39fbcac540d" containerName="barbican-db-sync" Dec 03 14:36:14 crc kubenswrapper[4751]: E1203 14:36:14.960517 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de5612b5-9ee0-4da5-84b0-402ce8b1a163" containerName="keystone-bootstrap" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.960526 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="de5612b5-9ee0-4da5-84b0-402ce8b1a163" containerName="keystone-bootstrap" Dec 03 14:36:14 crc kubenswrapper[4751]: E1203 14:36:14.960609 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="276feef5-736d-4d00-a9af-e81eaf2a0285" containerName="dnsmasq-dns" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.960617 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="276feef5-736d-4d00-a9af-e81eaf2a0285" containerName="dnsmasq-dns" Dec 03 14:36:14 crc kubenswrapper[4751]: E1203 14:36:14.960630 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="276feef5-736d-4d00-a9af-e81eaf2a0285" containerName="init" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.960637 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="276feef5-736d-4d00-a9af-e81eaf2a0285" containerName="init" Dec 03 14:36:14 crc kubenswrapper[4751]: E1203 14:36:14.960668 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b52c852-92ed-47f2-8e47-a9ac1e378698" containerName="placement-db-sync" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.960676 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b52c852-92ed-47f2-8e47-a9ac1e378698" containerName="placement-db-sync" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.962468 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8svh4" podStartSLOduration=8.903257327 podStartE2EDuration="13.962445504s" podCreationTimestamp="2025-12-03 14:36:01 
+0000 UTC" firstStartedPulling="2025-12-03 14:36:08.532974746 +0000 UTC m=+1375.521329963" lastFinishedPulling="2025-12-03 14:36:13.592162923 +0000 UTC m=+1380.580518140" observedRunningTime="2025-12-03 14:36:14.891219812 +0000 UTC m=+1381.879575049" watchObservedRunningTime="2025-12-03 14:36:14.962445504 +0000 UTC m=+1381.950800711" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.967189 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbe09550-cc72-4fe9-af45-b39fbcac540d" containerName="barbican-db-sync" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.967237 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="276feef5-736d-4d00-a9af-e81eaf2a0285" containerName="dnsmasq-dns" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.967272 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b52c852-92ed-47f2-8e47-a9ac1e378698" containerName="placement-db-sync" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.967285 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="de5612b5-9ee0-4da5-84b0-402ce8b1a163" containerName="keystone-bootstrap" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.975663 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-cfc86c59b-x4m2l" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.988859 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.989166 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-4pjzg" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.989683 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.994011 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 03 14:36:14 crc kubenswrapper[4751]: I1203 14:36:14.994485 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.015888 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-cfc86c59b-x4m2l"] Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.028035 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dkx6d" podStartSLOduration=5.995177302 podStartE2EDuration="11.028015661s" podCreationTimestamp="2025-12-03 14:36:04 +0000 UTC" firstStartedPulling="2025-12-03 14:36:08.531682361 +0000 UTC m=+1375.520037578" lastFinishedPulling="2025-12-03 14:36:13.56452072 +0000 UTC m=+1380.552875937" observedRunningTime="2025-12-03 14:36:14.944207386 +0000 UTC m=+1381.932562603" watchObservedRunningTime="2025-12-03 14:36:15.028015661 +0000 UTC m=+1382.016370878" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.055800 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfdc9703-a6a9-4a1d-81f4-852aa9167a17-public-tls-certs\") pod 
\"placement-cfc86c59b-x4m2l\" (UID: \"bfdc9703-a6a9-4a1d-81f4-852aa9167a17\") " pod="openstack/placement-cfc86c59b-x4m2l" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.055899 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfdc9703-a6a9-4a1d-81f4-852aa9167a17-combined-ca-bundle\") pod \"placement-cfc86c59b-x4m2l\" (UID: \"bfdc9703-a6a9-4a1d-81f4-852aa9167a17\") " pod="openstack/placement-cfc86c59b-x4m2l" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.055970 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk4h4\" (UniqueName: \"kubernetes.io/projected/bfdc9703-a6a9-4a1d-81f4-852aa9167a17-kube-api-access-wk4h4\") pod \"placement-cfc86c59b-x4m2l\" (UID: \"bfdc9703-a6a9-4a1d-81f4-852aa9167a17\") " pod="openstack/placement-cfc86c59b-x4m2l" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.055986 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfdc9703-a6a9-4a1d-81f4-852aa9167a17-internal-tls-certs\") pod \"placement-cfc86c59b-x4m2l\" (UID: \"bfdc9703-a6a9-4a1d-81f4-852aa9167a17\") " pod="openstack/placement-cfc86c59b-x4m2l" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.056007 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfdc9703-a6a9-4a1d-81f4-852aa9167a17-scripts\") pod \"placement-cfc86c59b-x4m2l\" (UID: \"bfdc9703-a6a9-4a1d-81f4-852aa9167a17\") " pod="openstack/placement-cfc86c59b-x4m2l" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.056105 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bfdc9703-a6a9-4a1d-81f4-852aa9167a17-config-data\") pod \"placement-cfc86c59b-x4m2l\" (UID: \"bfdc9703-a6a9-4a1d-81f4-852aa9167a17\") " pod="openstack/placement-cfc86c59b-x4m2l" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.056164 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfdc9703-a6a9-4a1d-81f4-852aa9167a17-logs\") pod \"placement-cfc86c59b-x4m2l\" (UID: \"bfdc9703-a6a9-4a1d-81f4-852aa9167a17\") " pod="openstack/placement-cfc86c59b-x4m2l" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.057730 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5dcd655495-dj2gs"] Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.059713 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5dcd655495-dj2gs" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.064383 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.065291 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.065518 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-c66qs" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.107671 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7f9cb9cd-blmhg"] Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.109051 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7f9cb9cd-blmhg" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.118242 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.118460 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.118569 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.118673 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2bmt6" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.118838 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.120103 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.142723 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5dcd655495-dj2gs"] Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.157282 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qz6f\" (UniqueName: \"kubernetes.io/projected/491c0713-5024-484d-921d-387200cb08b2-kube-api-access-8qz6f\") pod \"keystone-7f9cb9cd-blmhg\" (UID: \"491c0713-5024-484d-921d-387200cb08b2\") " pod="openstack/keystone-7f9cb9cd-blmhg" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.157358 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfdc9703-a6a9-4a1d-81f4-852aa9167a17-config-data\") pod \"placement-cfc86c59b-x4m2l\" (UID: \"bfdc9703-a6a9-4a1d-81f4-852aa9167a17\") " 
pod="openstack/placement-cfc86c59b-x4m2l" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.157386 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/491c0713-5024-484d-921d-387200cb08b2-internal-tls-certs\") pod \"keystone-7f9cb9cd-blmhg\" (UID: \"491c0713-5024-484d-921d-387200cb08b2\") " pod="openstack/keystone-7f9cb9cd-blmhg" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.157427 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f-combined-ca-bundle\") pod \"barbican-worker-5dcd655495-dj2gs\" (UID: \"4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f\") " pod="openstack/barbican-worker-5dcd655495-dj2gs" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.157460 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw2sq\" (UniqueName: \"kubernetes.io/projected/4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f-kube-api-access-qw2sq\") pod \"barbican-worker-5dcd655495-dj2gs\" (UID: \"4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f\") " pod="openstack/barbican-worker-5dcd655495-dj2gs" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.157477 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfdc9703-a6a9-4a1d-81f4-852aa9167a17-logs\") pod \"placement-cfc86c59b-x4m2l\" (UID: \"bfdc9703-a6a9-4a1d-81f4-852aa9167a17\") " pod="openstack/placement-cfc86c59b-x4m2l" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.157491 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f-config-data-custom\") pod \"barbican-worker-5dcd655495-dj2gs\" (UID: 
\"4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f\") " pod="openstack/barbican-worker-5dcd655495-dj2gs" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.157513 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f-config-data\") pod \"barbican-worker-5dcd655495-dj2gs\" (UID: \"4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f\") " pod="openstack/barbican-worker-5dcd655495-dj2gs" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.157538 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/491c0713-5024-484d-921d-387200cb08b2-public-tls-certs\") pod \"keystone-7f9cb9cd-blmhg\" (UID: \"491c0713-5024-484d-921d-387200cb08b2\") " pod="openstack/keystone-7f9cb9cd-blmhg" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.157560 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/491c0713-5024-484d-921d-387200cb08b2-fernet-keys\") pod \"keystone-7f9cb9cd-blmhg\" (UID: \"491c0713-5024-484d-921d-387200cb08b2\") " pod="openstack/keystone-7f9cb9cd-blmhg" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.157585 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfdc9703-a6a9-4a1d-81f4-852aa9167a17-public-tls-certs\") pod \"placement-cfc86c59b-x4m2l\" (UID: \"bfdc9703-a6a9-4a1d-81f4-852aa9167a17\") " pod="openstack/placement-cfc86c59b-x4m2l" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.157605 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfdc9703-a6a9-4a1d-81f4-852aa9167a17-combined-ca-bundle\") pod \"placement-cfc86c59b-x4m2l\" (UID: 
\"bfdc9703-a6a9-4a1d-81f4-852aa9167a17\") " pod="openstack/placement-cfc86c59b-x4m2l" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.157651 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/491c0713-5024-484d-921d-387200cb08b2-scripts\") pod \"keystone-7f9cb9cd-blmhg\" (UID: \"491c0713-5024-484d-921d-387200cb08b2\") " pod="openstack/keystone-7f9cb9cd-blmhg" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.157689 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk4h4\" (UniqueName: \"kubernetes.io/projected/bfdc9703-a6a9-4a1d-81f4-852aa9167a17-kube-api-access-wk4h4\") pod \"placement-cfc86c59b-x4m2l\" (UID: \"bfdc9703-a6a9-4a1d-81f4-852aa9167a17\") " pod="openstack/placement-cfc86c59b-x4m2l" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.157710 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfdc9703-a6a9-4a1d-81f4-852aa9167a17-internal-tls-certs\") pod \"placement-cfc86c59b-x4m2l\" (UID: \"bfdc9703-a6a9-4a1d-81f4-852aa9167a17\") " pod="openstack/placement-cfc86c59b-x4m2l" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.157734 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfdc9703-a6a9-4a1d-81f4-852aa9167a17-scripts\") pod \"placement-cfc86c59b-x4m2l\" (UID: \"bfdc9703-a6a9-4a1d-81f4-852aa9167a17\") " pod="openstack/placement-cfc86c59b-x4m2l" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.157756 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f-logs\") pod \"barbican-worker-5dcd655495-dj2gs\" (UID: \"4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f\") " 
pod="openstack/barbican-worker-5dcd655495-dj2gs" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.157773 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/491c0713-5024-484d-921d-387200cb08b2-config-data\") pod \"keystone-7f9cb9cd-blmhg\" (UID: \"491c0713-5024-484d-921d-387200cb08b2\") " pod="openstack/keystone-7f9cb9cd-blmhg" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.157795 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/491c0713-5024-484d-921d-387200cb08b2-credential-keys\") pod \"keystone-7f9cb9cd-blmhg\" (UID: \"491c0713-5024-484d-921d-387200cb08b2\") " pod="openstack/keystone-7f9cb9cd-blmhg" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.157816 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/491c0713-5024-484d-921d-387200cb08b2-combined-ca-bundle\") pod \"keystone-7f9cb9cd-blmhg\" (UID: \"491c0713-5024-484d-921d-387200cb08b2\") " pod="openstack/keystone-7f9cb9cd-blmhg" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.161951 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfdc9703-a6a9-4a1d-81f4-852aa9167a17-logs\") pod \"placement-cfc86c59b-x4m2l\" (UID: \"bfdc9703-a6a9-4a1d-81f4-852aa9167a17\") " pod="openstack/placement-cfc86c59b-x4m2l" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.171043 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfdc9703-a6a9-4a1d-81f4-852aa9167a17-scripts\") pod \"placement-cfc86c59b-x4m2l\" (UID: \"bfdc9703-a6a9-4a1d-81f4-852aa9167a17\") " pod="openstack/placement-cfc86c59b-x4m2l" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 
14:36:15.176284 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7f9cb9cd-blmhg"] Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.188936 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfdc9703-a6a9-4a1d-81f4-852aa9167a17-public-tls-certs\") pod \"placement-cfc86c59b-x4m2l\" (UID: \"bfdc9703-a6a9-4a1d-81f4-852aa9167a17\") " pod="openstack/placement-cfc86c59b-x4m2l" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.190056 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfdc9703-a6a9-4a1d-81f4-852aa9167a17-config-data\") pod \"placement-cfc86c59b-x4m2l\" (UID: \"bfdc9703-a6a9-4a1d-81f4-852aa9167a17\") " pod="openstack/placement-cfc86c59b-x4m2l" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.190838 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfdc9703-a6a9-4a1d-81f4-852aa9167a17-combined-ca-bundle\") pod \"placement-cfc86c59b-x4m2l\" (UID: \"bfdc9703-a6a9-4a1d-81f4-852aa9167a17\") " pod="openstack/placement-cfc86c59b-x4m2l" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.205602 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk4h4\" (UniqueName: \"kubernetes.io/projected/bfdc9703-a6a9-4a1d-81f4-852aa9167a17-kube-api-access-wk4h4\") pod \"placement-cfc86c59b-x4m2l\" (UID: \"bfdc9703-a6a9-4a1d-81f4-852aa9167a17\") " pod="openstack/placement-cfc86c59b-x4m2l" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.211482 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfdc9703-a6a9-4a1d-81f4-852aa9167a17-internal-tls-certs\") pod \"placement-cfc86c59b-x4m2l\" (UID: \"bfdc9703-a6a9-4a1d-81f4-852aa9167a17\") " 
pod="openstack/placement-cfc86c59b-x4m2l" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.242472 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-c469d65bd-rdqn6"] Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.244124 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-c469d65bd-rdqn6" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.247418 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.263486 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw2sq\" (UniqueName: \"kubernetes.io/projected/4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f-kube-api-access-qw2sq\") pod \"barbican-worker-5dcd655495-dj2gs\" (UID: \"4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f\") " pod="openstack/barbican-worker-5dcd655495-dj2gs" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.263553 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f-config-data-custom\") pod \"barbican-worker-5dcd655495-dj2gs\" (UID: \"4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f\") " pod="openstack/barbican-worker-5dcd655495-dj2gs" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.264399 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f-config-data\") pod \"barbican-worker-5dcd655495-dj2gs\" (UID: \"4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f\") " pod="openstack/barbican-worker-5dcd655495-dj2gs" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.264475 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/491c0713-5024-484d-921d-387200cb08b2-public-tls-certs\") pod \"keystone-7f9cb9cd-blmhg\" (UID: \"491c0713-5024-484d-921d-387200cb08b2\") " pod="openstack/keystone-7f9cb9cd-blmhg" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.264508 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/491c0713-5024-484d-921d-387200cb08b2-fernet-keys\") pod \"keystone-7f9cb9cd-blmhg\" (UID: \"491c0713-5024-484d-921d-387200cb08b2\") " pod="openstack/keystone-7f9cb9cd-blmhg" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.264585 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/491c0713-5024-484d-921d-387200cb08b2-scripts\") pod \"keystone-7f9cb9cd-blmhg\" (UID: \"491c0713-5024-484d-921d-387200cb08b2\") " pod="openstack/keystone-7f9cb9cd-blmhg" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.265586 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f-logs\") pod \"barbican-worker-5dcd655495-dj2gs\" (UID: \"4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f\") " pod="openstack/barbican-worker-5dcd655495-dj2gs" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.265607 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/491c0713-5024-484d-921d-387200cb08b2-config-data\") pod \"keystone-7f9cb9cd-blmhg\" (UID: \"491c0713-5024-484d-921d-387200cb08b2\") " pod="openstack/keystone-7f9cb9cd-blmhg" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.265638 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/491c0713-5024-484d-921d-387200cb08b2-credential-keys\") pod \"keystone-7f9cb9cd-blmhg\" (UID: 
\"491c0713-5024-484d-921d-387200cb08b2\") " pod="openstack/keystone-7f9cb9cd-blmhg" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.265666 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/491c0713-5024-484d-921d-387200cb08b2-combined-ca-bundle\") pod \"keystone-7f9cb9cd-blmhg\" (UID: \"491c0713-5024-484d-921d-387200cb08b2\") " pod="openstack/keystone-7f9cb9cd-blmhg" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.265700 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qz6f\" (UniqueName: \"kubernetes.io/projected/491c0713-5024-484d-921d-387200cb08b2-kube-api-access-8qz6f\") pod \"keystone-7f9cb9cd-blmhg\" (UID: \"491c0713-5024-484d-921d-387200cb08b2\") " pod="openstack/keystone-7f9cb9cd-blmhg" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.265721 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/491c0713-5024-484d-921d-387200cb08b2-internal-tls-certs\") pod \"keystone-7f9cb9cd-blmhg\" (UID: \"491c0713-5024-484d-921d-387200cb08b2\") " pod="openstack/keystone-7f9cb9cd-blmhg" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.265757 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f-combined-ca-bundle\") pod \"barbican-worker-5dcd655495-dj2gs\" (UID: \"4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f\") " pod="openstack/barbican-worker-5dcd655495-dj2gs" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.271654 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f-config-data\") pod \"barbican-worker-5dcd655495-dj2gs\" (UID: \"4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f\") " 
pod="openstack/barbican-worker-5dcd655495-dj2gs" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.272131 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f-logs\") pod \"barbican-worker-5dcd655495-dj2gs\" (UID: \"4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f\") " pod="openstack/barbican-worker-5dcd655495-dj2gs" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.284439 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/491c0713-5024-484d-921d-387200cb08b2-scripts\") pod \"keystone-7f9cb9cd-blmhg\" (UID: \"491c0713-5024-484d-921d-387200cb08b2\") " pod="openstack/keystone-7f9cb9cd-blmhg" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.284769 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f-combined-ca-bundle\") pod \"barbican-worker-5dcd655495-dj2gs\" (UID: \"4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f\") " pod="openstack/barbican-worker-5dcd655495-dj2gs" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.284877 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/491c0713-5024-484d-921d-387200cb08b2-combined-ca-bundle\") pod \"keystone-7f9cb9cd-blmhg\" (UID: \"491c0713-5024-484d-921d-387200cb08b2\") " pod="openstack/keystone-7f9cb9cd-blmhg" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.287653 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f-config-data-custom\") pod \"barbican-worker-5dcd655495-dj2gs\" (UID: \"4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f\") " pod="openstack/barbican-worker-5dcd655495-dj2gs" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.297440 
4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/491c0713-5024-484d-921d-387200cb08b2-fernet-keys\") pod \"keystone-7f9cb9cd-blmhg\" (UID: \"491c0713-5024-484d-921d-387200cb08b2\") " pod="openstack/keystone-7f9cb9cd-blmhg" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.298091 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-c469d65bd-rdqn6"] Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.304443 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/491c0713-5024-484d-921d-387200cb08b2-credential-keys\") pod \"keystone-7f9cb9cd-blmhg\" (UID: \"491c0713-5024-484d-921d-387200cb08b2\") " pod="openstack/keystone-7f9cb9cd-blmhg" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.304843 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/491c0713-5024-484d-921d-387200cb08b2-config-data\") pod \"keystone-7f9cb9cd-blmhg\" (UID: \"491c0713-5024-484d-921d-387200cb08b2\") " pod="openstack/keystone-7f9cb9cd-blmhg" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.305153 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/491c0713-5024-484d-921d-387200cb08b2-internal-tls-certs\") pod \"keystone-7f9cb9cd-blmhg\" (UID: \"491c0713-5024-484d-921d-387200cb08b2\") " pod="openstack/keystone-7f9cb9cd-blmhg" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.305272 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/491c0713-5024-484d-921d-387200cb08b2-public-tls-certs\") pod \"keystone-7f9cb9cd-blmhg\" (UID: \"491c0713-5024-484d-921d-387200cb08b2\") " pod="openstack/keystone-7f9cb9cd-blmhg" Dec 03 14:36:15 crc kubenswrapper[4751]: 
I1203 14:36:15.309858 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qz6f\" (UniqueName: \"kubernetes.io/projected/491c0713-5024-484d-921d-387200cb08b2-kube-api-access-8qz6f\") pod \"keystone-7f9cb9cd-blmhg\" (UID: \"491c0713-5024-484d-921d-387200cb08b2\") " pod="openstack/keystone-7f9cb9cd-blmhg" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.314963 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw2sq\" (UniqueName: \"kubernetes.io/projected/4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f-kube-api-access-qw2sq\") pod \"barbican-worker-5dcd655495-dj2gs\" (UID: \"4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f\") " pod="openstack/barbican-worker-5dcd655495-dj2gs" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.315447 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-cfc86c59b-x4m2l" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.367733 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k76vc\" (UniqueName: \"kubernetes.io/projected/58307992-3054-4b05-b7c6-f768c2a1e849-kube-api-access-k76vc\") pod \"barbican-keystone-listener-c469d65bd-rdqn6\" (UID: \"58307992-3054-4b05-b7c6-f768c2a1e849\") " pod="openstack/barbican-keystone-listener-c469d65bd-rdqn6" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.367944 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58307992-3054-4b05-b7c6-f768c2a1e849-logs\") pod \"barbican-keystone-listener-c469d65bd-rdqn6\" (UID: \"58307992-3054-4b05-b7c6-f768c2a1e849\") " pod="openstack/barbican-keystone-listener-c469d65bd-rdqn6" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.368507 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/58307992-3054-4b05-b7c6-f768c2a1e849-combined-ca-bundle\") pod \"barbican-keystone-listener-c469d65bd-rdqn6\" (UID: \"58307992-3054-4b05-b7c6-f768c2a1e849\") " pod="openstack/barbican-keystone-listener-c469d65bd-rdqn6" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.368551 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58307992-3054-4b05-b7c6-f768c2a1e849-config-data\") pod \"barbican-keystone-listener-c469d65bd-rdqn6\" (UID: \"58307992-3054-4b05-b7c6-f768c2a1e849\") " pod="openstack/barbican-keystone-listener-c469d65bd-rdqn6" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.368603 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58307992-3054-4b05-b7c6-f768c2a1e849-config-data-custom\") pod \"barbican-keystone-listener-c469d65bd-rdqn6\" (UID: \"58307992-3054-4b05-b7c6-f768c2a1e849\") " pod="openstack/barbican-keystone-listener-c469d65bd-rdqn6" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.379916 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="276feef5-736d-4d00-a9af-e81eaf2a0285" path="/var/lib/kubelet/pods/276feef5-736d-4d00-a9af-e81eaf2a0285/volumes" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.394015 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-qkd8q"] Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.395785 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-qkd8q" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.397147 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5dcd655495-dj2gs" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.419626 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-qkd8q"] Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.442658 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7f9cb9cd-blmhg" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.472924 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k76vc\" (UniqueName: \"kubernetes.io/projected/58307992-3054-4b05-b7c6-f768c2a1e849-kube-api-access-k76vc\") pod \"barbican-keystone-listener-c469d65bd-rdqn6\" (UID: \"58307992-3054-4b05-b7c6-f768c2a1e849\") " pod="openstack/barbican-keystone-listener-c469d65bd-rdqn6" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.473010 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6418aa65-88fe-4683-93d6-06b632a0bcd8-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-qkd8q\" (UID: \"6418aa65-88fe-4683-93d6-06b632a0bcd8\") " pod="openstack/dnsmasq-dns-848cf88cfc-qkd8q" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.473045 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58307992-3054-4b05-b7c6-f768c2a1e849-logs\") pod \"barbican-keystone-listener-c469d65bd-rdqn6\" (UID: \"58307992-3054-4b05-b7c6-f768c2a1e849\") " pod="openstack/barbican-keystone-listener-c469d65bd-rdqn6" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.473101 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6418aa65-88fe-4683-93d6-06b632a0bcd8-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-qkd8q\" (UID: 
\"6418aa65-88fe-4683-93d6-06b632a0bcd8\") " pod="openstack/dnsmasq-dns-848cf88cfc-qkd8q" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.473130 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hzqn\" (UniqueName: \"kubernetes.io/projected/6418aa65-88fe-4683-93d6-06b632a0bcd8-kube-api-access-2hzqn\") pod \"dnsmasq-dns-848cf88cfc-qkd8q\" (UID: \"6418aa65-88fe-4683-93d6-06b632a0bcd8\") " pod="openstack/dnsmasq-dns-848cf88cfc-qkd8q" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.473167 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6418aa65-88fe-4683-93d6-06b632a0bcd8-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-qkd8q\" (UID: \"6418aa65-88fe-4683-93d6-06b632a0bcd8\") " pod="openstack/dnsmasq-dns-848cf88cfc-qkd8q" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.473197 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58307992-3054-4b05-b7c6-f768c2a1e849-combined-ca-bundle\") pod \"barbican-keystone-listener-c469d65bd-rdqn6\" (UID: \"58307992-3054-4b05-b7c6-f768c2a1e849\") " pod="openstack/barbican-keystone-listener-c469d65bd-rdqn6" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.473221 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58307992-3054-4b05-b7c6-f768c2a1e849-config-data\") pod \"barbican-keystone-listener-c469d65bd-rdqn6\" (UID: \"58307992-3054-4b05-b7c6-f768c2a1e849\") " pod="openstack/barbican-keystone-listener-c469d65bd-rdqn6" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.473238 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/58307992-3054-4b05-b7c6-f768c2a1e849-config-data-custom\") pod \"barbican-keystone-listener-c469d65bd-rdqn6\" (UID: \"58307992-3054-4b05-b7c6-f768c2a1e849\") " pod="openstack/barbican-keystone-listener-c469d65bd-rdqn6" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.473267 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6418aa65-88fe-4683-93d6-06b632a0bcd8-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-qkd8q\" (UID: \"6418aa65-88fe-4683-93d6-06b632a0bcd8\") " pod="openstack/dnsmasq-dns-848cf88cfc-qkd8q" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.473289 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6418aa65-88fe-4683-93d6-06b632a0bcd8-config\") pod \"dnsmasq-dns-848cf88cfc-qkd8q\" (UID: \"6418aa65-88fe-4683-93d6-06b632a0bcd8\") " pod="openstack/dnsmasq-dns-848cf88cfc-qkd8q" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.477037 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58307992-3054-4b05-b7c6-f768c2a1e849-logs\") pod \"barbican-keystone-listener-c469d65bd-rdqn6\" (UID: \"58307992-3054-4b05-b7c6-f768c2a1e849\") " pod="openstack/barbican-keystone-listener-c469d65bd-rdqn6" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.479929 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58307992-3054-4b05-b7c6-f768c2a1e849-config-data-custom\") pod \"barbican-keystone-listener-c469d65bd-rdqn6\" (UID: \"58307992-3054-4b05-b7c6-f768c2a1e849\") " pod="openstack/barbican-keystone-listener-c469d65bd-rdqn6" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.483491 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/58307992-3054-4b05-b7c6-f768c2a1e849-combined-ca-bundle\") pod \"barbican-keystone-listener-c469d65bd-rdqn6\" (UID: \"58307992-3054-4b05-b7c6-f768c2a1e849\") " pod="openstack/barbican-keystone-listener-c469d65bd-rdqn6" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.486318 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58307992-3054-4b05-b7c6-f768c2a1e849-config-data\") pod \"barbican-keystone-listener-c469d65bd-rdqn6\" (UID: \"58307992-3054-4b05-b7c6-f768c2a1e849\") " pod="openstack/barbican-keystone-listener-c469d65bd-rdqn6" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.500484 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-55f7778fd-nbkr2"] Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.503258 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-55f7778fd-nbkr2" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.512793 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k76vc\" (UniqueName: \"kubernetes.io/projected/58307992-3054-4b05-b7c6-f768c2a1e849-kube-api-access-k76vc\") pod \"barbican-keystone-listener-c469d65bd-rdqn6\" (UID: \"58307992-3054-4b05-b7c6-f768c2a1e849\") " pod="openstack/barbican-keystone-listener-c469d65bd-rdqn6" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.514121 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.562116 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55f7778fd-nbkr2"] Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.577862 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/6418aa65-88fe-4683-93d6-06b632a0bcd8-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-qkd8q\" (UID: \"6418aa65-88fe-4683-93d6-06b632a0bcd8\") " pod="openstack/dnsmasq-dns-848cf88cfc-qkd8q" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.578000 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b931e90-9037-4b52-92c1-9c1d4d3fbba4-combined-ca-bundle\") pod \"barbican-api-55f7778fd-nbkr2\" (UID: \"4b931e90-9037-4b52-92c1-9c1d4d3fbba4\") " pod="openstack/barbican-api-55f7778fd-nbkr2" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.578034 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6418aa65-88fe-4683-93d6-06b632a0bcd8-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-qkd8q\" (UID: \"6418aa65-88fe-4683-93d6-06b632a0bcd8\") " pod="openstack/dnsmasq-dns-848cf88cfc-qkd8q" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.584925 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6418aa65-88fe-4683-93d6-06b632a0bcd8-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-qkd8q\" (UID: \"6418aa65-88fe-4683-93d6-06b632a0bcd8\") " pod="openstack/dnsmasq-dns-848cf88cfc-qkd8q" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.585410 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6418aa65-88fe-4683-93d6-06b632a0bcd8-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-qkd8q\" (UID: \"6418aa65-88fe-4683-93d6-06b632a0bcd8\") " pod="openstack/dnsmasq-dns-848cf88cfc-qkd8q" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.578058 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hzqn\" (UniqueName: 
\"kubernetes.io/projected/6418aa65-88fe-4683-93d6-06b632a0bcd8-kube-api-access-2hzqn\") pod \"dnsmasq-dns-848cf88cfc-qkd8q\" (UID: \"6418aa65-88fe-4683-93d6-06b632a0bcd8\") " pod="openstack/dnsmasq-dns-848cf88cfc-qkd8q" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.597698 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b931e90-9037-4b52-92c1-9c1d4d3fbba4-config-data-custom\") pod \"barbican-api-55f7778fd-nbkr2\" (UID: \"4b931e90-9037-4b52-92c1-9c1d4d3fbba4\") " pod="openstack/barbican-api-55f7778fd-nbkr2" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.597787 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6418aa65-88fe-4683-93d6-06b632a0bcd8-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-qkd8q\" (UID: \"6418aa65-88fe-4683-93d6-06b632a0bcd8\") " pod="openstack/dnsmasq-dns-848cf88cfc-qkd8q" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.597816 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b931e90-9037-4b52-92c1-9c1d4d3fbba4-config-data\") pod \"barbican-api-55f7778fd-nbkr2\" (UID: \"4b931e90-9037-4b52-92c1-9c1d4d3fbba4\") " pod="openstack/barbican-api-55f7778fd-nbkr2" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.597883 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnskl\" (UniqueName: \"kubernetes.io/projected/4b931e90-9037-4b52-92c1-9c1d4d3fbba4-kube-api-access-jnskl\") pod \"barbican-api-55f7778fd-nbkr2\" (UID: \"4b931e90-9037-4b52-92c1-9c1d4d3fbba4\") " pod="openstack/barbican-api-55f7778fd-nbkr2" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.597964 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/6418aa65-88fe-4683-93d6-06b632a0bcd8-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-qkd8q\" (UID: \"6418aa65-88fe-4683-93d6-06b632a0bcd8\") " pod="openstack/dnsmasq-dns-848cf88cfc-qkd8q" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.598005 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6418aa65-88fe-4683-93d6-06b632a0bcd8-config\") pod \"dnsmasq-dns-848cf88cfc-qkd8q\" (UID: \"6418aa65-88fe-4683-93d6-06b632a0bcd8\") " pod="openstack/dnsmasq-dns-848cf88cfc-qkd8q" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.598090 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b931e90-9037-4b52-92c1-9c1d4d3fbba4-logs\") pod \"barbican-api-55f7778fd-nbkr2\" (UID: \"4b931e90-9037-4b52-92c1-9c1d4d3fbba4\") " pod="openstack/barbican-api-55f7778fd-nbkr2" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.599082 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6418aa65-88fe-4683-93d6-06b632a0bcd8-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-qkd8q\" (UID: \"6418aa65-88fe-4683-93d6-06b632a0bcd8\") " pod="openstack/dnsmasq-dns-848cf88cfc-qkd8q" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.599812 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6418aa65-88fe-4683-93d6-06b632a0bcd8-config\") pod \"dnsmasq-dns-848cf88cfc-qkd8q\" (UID: \"6418aa65-88fe-4683-93d6-06b632a0bcd8\") " pod="openstack/dnsmasq-dns-848cf88cfc-qkd8q" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.619590 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6418aa65-88fe-4683-93d6-06b632a0bcd8-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-qkd8q\" 
(UID: \"6418aa65-88fe-4683-93d6-06b632a0bcd8\") " pod="openstack/dnsmasq-dns-848cf88cfc-qkd8q" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.684023 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hzqn\" (UniqueName: \"kubernetes.io/projected/6418aa65-88fe-4683-93d6-06b632a0bcd8-kube-api-access-2hzqn\") pod \"dnsmasq-dns-848cf88cfc-qkd8q\" (UID: \"6418aa65-88fe-4683-93d6-06b632a0bcd8\") " pod="openstack/dnsmasq-dns-848cf88cfc-qkd8q" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.706083 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnskl\" (UniqueName: \"kubernetes.io/projected/4b931e90-9037-4b52-92c1-9c1d4d3fbba4-kube-api-access-jnskl\") pod \"barbican-api-55f7778fd-nbkr2\" (UID: \"4b931e90-9037-4b52-92c1-9c1d4d3fbba4\") " pod="openstack/barbican-api-55f7778fd-nbkr2" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.706424 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b931e90-9037-4b52-92c1-9c1d4d3fbba4-logs\") pod \"barbican-api-55f7778fd-nbkr2\" (UID: \"4b931e90-9037-4b52-92c1-9c1d4d3fbba4\") " pod="openstack/barbican-api-55f7778fd-nbkr2" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.706744 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b931e90-9037-4b52-92c1-9c1d4d3fbba4-combined-ca-bundle\") pod \"barbican-api-55f7778fd-nbkr2\" (UID: \"4b931e90-9037-4b52-92c1-9c1d4d3fbba4\") " pod="openstack/barbican-api-55f7778fd-nbkr2" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.706813 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b931e90-9037-4b52-92c1-9c1d4d3fbba4-config-data-custom\") pod \"barbican-api-55f7778fd-nbkr2\" (UID: 
\"4b931e90-9037-4b52-92c1-9c1d4d3fbba4\") " pod="openstack/barbican-api-55f7778fd-nbkr2" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.706884 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b931e90-9037-4b52-92c1-9c1d4d3fbba4-config-data\") pod \"barbican-api-55f7778fd-nbkr2\" (UID: \"4b931e90-9037-4b52-92c1-9c1d4d3fbba4\") " pod="openstack/barbican-api-55f7778fd-nbkr2" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.707129 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b931e90-9037-4b52-92c1-9c1d4d3fbba4-logs\") pod \"barbican-api-55f7778fd-nbkr2\" (UID: \"4b931e90-9037-4b52-92c1-9c1d4d3fbba4\") " pod="openstack/barbican-api-55f7778fd-nbkr2" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.714799 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b931e90-9037-4b52-92c1-9c1d4d3fbba4-combined-ca-bundle\") pod \"barbican-api-55f7778fd-nbkr2\" (UID: \"4b931e90-9037-4b52-92c1-9c1d4d3fbba4\") " pod="openstack/barbican-api-55f7778fd-nbkr2" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.722186 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-c469d65bd-rdqn6" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.733791 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b931e90-9037-4b52-92c1-9c1d4d3fbba4-config-data-custom\") pod \"barbican-api-55f7778fd-nbkr2\" (UID: \"4b931e90-9037-4b52-92c1-9c1d4d3fbba4\") " pod="openstack/barbican-api-55f7778fd-nbkr2" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.737351 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b931e90-9037-4b52-92c1-9c1d4d3fbba4-config-data\") pod \"barbican-api-55f7778fd-nbkr2\" (UID: \"4b931e90-9037-4b52-92c1-9c1d4d3fbba4\") " pod="openstack/barbican-api-55f7778fd-nbkr2" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.736060 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnskl\" (UniqueName: \"kubernetes.io/projected/4b931e90-9037-4b52-92c1-9c1d4d3fbba4-kube-api-access-jnskl\") pod \"barbican-api-55f7778fd-nbkr2\" (UID: \"4b931e90-9037-4b52-92c1-9c1d4d3fbba4\") " pod="openstack/barbican-api-55f7778fd-nbkr2" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.766736 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-qkd8q" Dec 03 14:36:15 crc kubenswrapper[4751]: I1203 14:36:15.789254 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-55f7778fd-nbkr2" Dec 03 14:36:16 crc kubenswrapper[4751]: I1203 14:36:16.016236 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-cfc86c59b-x4m2l"] Dec 03 14:36:16 crc kubenswrapper[4751]: I1203 14:36:16.254067 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5dcd655495-dj2gs"] Dec 03 14:36:16 crc kubenswrapper[4751]: W1203 14:36:16.265798 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cb2ff1e_e8e7_4af2_b8fa_073de9b9613f.slice/crio-52e04fbe97f421e9851e16e55997fe2ad59ebfc9d2be581f929666f3662087e1 WatchSource:0}: Error finding container 52e04fbe97f421e9851e16e55997fe2ad59ebfc9d2be581f929666f3662087e1: Status 404 returned error can't find the container with id 52e04fbe97f421e9851e16e55997fe2ad59ebfc9d2be581f929666f3662087e1 Dec 03 14:36:16 crc kubenswrapper[4751]: I1203 14:36:16.410107 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7f9cb9cd-blmhg"] Dec 03 14:36:16 crc kubenswrapper[4751]: W1203 14:36:16.419034 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod491c0713_5024_484d_921d_387200cb08b2.slice/crio-03ac71619bead736371e3d60a244355906303c54df4bb9e71e5f4b65ec0b53cc WatchSource:0}: Error finding container 03ac71619bead736371e3d60a244355906303c54df4bb9e71e5f4b65ec0b53cc: Status 404 returned error can't find the container with id 03ac71619bead736371e3d60a244355906303c54df4bb9e71e5f4b65ec0b53cc Dec 03 14:36:16 crc kubenswrapper[4751]: I1203 14:36:16.525805 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-c469d65bd-rdqn6"] Dec 03 14:36:16 crc kubenswrapper[4751]: W1203 14:36:16.677017 4751 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b931e90_9037_4b52_92c1_9c1d4d3fbba4.slice/crio-d0386b5b1cddd1418a734b8b721557b2e285618925d375a6474523cdecf63d70 WatchSource:0}: Error finding container d0386b5b1cddd1418a734b8b721557b2e285618925d375a6474523cdecf63d70: Status 404 returned error can't find the container with id d0386b5b1cddd1418a734b8b721557b2e285618925d375a6474523cdecf63d70 Dec 03 14:36:16 crc kubenswrapper[4751]: I1203 14:36:16.677455 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55f7778fd-nbkr2"] Dec 03 14:36:16 crc kubenswrapper[4751]: I1203 14:36:16.693066 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-qkd8q"] Dec 03 14:36:16 crc kubenswrapper[4751]: W1203 14:36:16.772924 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6418aa65_88fe_4683_93d6_06b632a0bcd8.slice/crio-efa98643aad6f5e00a37e25204b6337be84e1fcb5fed6339eab0425bcdd0f2dc WatchSource:0}: Error finding container efa98643aad6f5e00a37e25204b6337be84e1fcb5fed6339eab0425bcdd0f2dc: Status 404 returned error can't find the container with id efa98643aad6f5e00a37e25204b6337be84e1fcb5fed6339eab0425bcdd0f2dc Dec 03 14:36:16 crc kubenswrapper[4751]: I1203 14:36:16.878716 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7bc6669df7-xxpxz" Dec 03 14:36:16 crc kubenswrapper[4751]: I1203 14:36:16.944891 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-64fb975db4-zhnvr"] Dec 03 14:36:16 crc kubenswrapper[4751]: I1203 14:36:16.947707 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-64fb975db4-zhnvr" podUID="90ae9890-60ca-4467-910c-a0d7459b189d" containerName="neutron-api" containerID="cri-o://f9df3666dcd520f70624df73a6db9a57b8a82ef464068a5628d967755ab6d597" gracePeriod=30 Dec 03 14:36:16 crc 
kubenswrapper[4751]: I1203 14:36:16.948421 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-64fb975db4-zhnvr" podUID="90ae9890-60ca-4467-910c-a0d7459b189d" containerName="neutron-httpd" containerID="cri-o://cc583d54ccd93238db50746e8b35597e2cdc5954daec6159a5ee2eea6c733687" gracePeriod=30 Dec 03 14:36:17 crc kubenswrapper[4751]: I1203 14:36:17.037470 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-cfc86c59b-x4m2l" event={"ID":"bfdc9703-a6a9-4a1d-81f4-852aa9167a17","Type":"ContainerStarted","Data":"c5e5e2436b13240160e0901b3c0ce4c94fcdaec0626b40e043bd0d6d271d8bc2"} Dec 03 14:36:17 crc kubenswrapper[4751]: I1203 14:36:17.037505 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-cfc86c59b-x4m2l" event={"ID":"bfdc9703-a6a9-4a1d-81f4-852aa9167a17","Type":"ContainerStarted","Data":"fe6138f3df9188b3f787ad398b7c738ab6315b74ec02869f944a925e3994bce8"} Dec 03 14:36:17 crc kubenswrapper[4751]: I1203 14:36:17.041557 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-c469d65bd-rdqn6" event={"ID":"58307992-3054-4b05-b7c6-f768c2a1e849","Type":"ContainerStarted","Data":"ad68686f50f75896f1b96fcf5e3ac104e61847c4fff477367458dea7b8c54ab8"} Dec 03 14:36:17 crc kubenswrapper[4751]: I1203 14:36:17.042671 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55f7778fd-nbkr2" event={"ID":"4b931e90-9037-4b52-92c1-9c1d4d3fbba4","Type":"ContainerStarted","Data":"d0386b5b1cddd1418a734b8b721557b2e285618925d375a6474523cdecf63d70"} Dec 03 14:36:17 crc kubenswrapper[4751]: I1203 14:36:17.048277 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-qkd8q" event={"ID":"6418aa65-88fe-4683-93d6-06b632a0bcd8","Type":"ContainerStarted","Data":"efa98643aad6f5e00a37e25204b6337be84e1fcb5fed6339eab0425bcdd0f2dc"} Dec 03 14:36:17 crc kubenswrapper[4751]: I1203 14:36:17.050265 4751 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5dcd655495-dj2gs" event={"ID":"4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f","Type":"ContainerStarted","Data":"52e04fbe97f421e9851e16e55997fe2ad59ebfc9d2be581f929666f3662087e1"} Dec 03 14:36:17 crc kubenswrapper[4751]: I1203 14:36:17.052376 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7f9cb9cd-blmhg" event={"ID":"491c0713-5024-484d-921d-387200cb08b2","Type":"ContainerStarted","Data":"03ac71619bead736371e3d60a244355906303c54df4bb9e71e5f4b65ec0b53cc"} Dec 03 14:36:17 crc kubenswrapper[4751]: I1203 14:36:17.053643 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7f9cb9cd-blmhg" Dec 03 14:36:17 crc kubenswrapper[4751]: I1203 14:36:17.086060 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7f9cb9cd-blmhg" podStartSLOduration=3.086038366 podStartE2EDuration="3.086038366s" podCreationTimestamp="2025-12-03 14:36:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:36:17.076631729 +0000 UTC m=+1384.064986956" watchObservedRunningTime="2025-12-03 14:36:17.086038366 +0000 UTC m=+1384.074393583" Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.023746 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.025194 4751 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.030957 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.085759 4751 generic.go:334] "Generic (PLEG): container finished" podID="90ae9890-60ca-4467-910c-a0d7459b189d" 
containerID="cc583d54ccd93238db50746e8b35597e2cdc5954daec6159a5ee2eea6c733687" exitCode=0 Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.085921 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64fb975db4-zhnvr" event={"ID":"90ae9890-60ca-4467-910c-a0d7459b189d","Type":"ContainerDied","Data":"cc583d54ccd93238db50746e8b35597e2cdc5954daec6159a5ee2eea6c733687"} Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.088422 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7f9cb9cd-blmhg" event={"ID":"491c0713-5024-484d-921d-387200cb08b2","Type":"ContainerStarted","Data":"6b4fde27a0e577a67e98d7209443099dc1e24201242e8894d121d57626fe422d"} Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.109115 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-cfc86c59b-x4m2l" event={"ID":"bfdc9703-a6a9-4a1d-81f4-852aa9167a17","Type":"ContainerStarted","Data":"1bb33975df7b9e1585756061db13ca21fa16f19cc6fdc511cdb03580a3f12018"} Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.109318 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-cfc86c59b-x4m2l" Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.124081 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55f7778fd-nbkr2" event={"ID":"4b931e90-9037-4b52-92c1-9c1d4d3fbba4","Type":"ContainerStarted","Data":"ffe2d23574e0cafd247c435b33bc2a3a942d895a052dce1be407cd41facd979e"} Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.132970 4751 generic.go:334] "Generic (PLEG): container finished" podID="6418aa65-88fe-4683-93d6-06b632a0bcd8" containerID="9450cc0236c02fe60f3759d3b841fb1b5252adb92ebe0e8a5fc66444fa571938" exitCode=0 Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.136638 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-cfc86c59b-x4m2l" podStartSLOduration=4.136607801 
podStartE2EDuration="4.136607801s" podCreationTimestamp="2025-12-03 14:36:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:36:18.129199139 +0000 UTC m=+1385.117554356" watchObservedRunningTime="2025-12-03 14:36:18.136607801 +0000 UTC m=+1385.124963038" Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.137425 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-qkd8q" event={"ID":"6418aa65-88fe-4683-93d6-06b632a0bcd8","Type":"ContainerDied","Data":"9450cc0236c02fe60f3759d3b841fb1b5252adb92ebe0e8a5fc66444fa571938"} Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.518087 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.518542 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.645453 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-b567465d6-ch8tf"] Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.653200 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-b567465d6-ch8tf" Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.670170 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.670802 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.679237 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-b567465d6-ch8tf"] Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.741587 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1440ca21-e220-4178-b44c-06672479bc7c-internal-tls-certs\") pod \"barbican-api-b567465d6-ch8tf\" (UID: \"1440ca21-e220-4178-b44c-06672479bc7c\") " pod="openstack/barbican-api-b567465d6-ch8tf" Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.741662 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1440ca21-e220-4178-b44c-06672479bc7c-combined-ca-bundle\") pod \"barbican-api-b567465d6-ch8tf\" (UID: \"1440ca21-e220-4178-b44c-06672479bc7c\") " pod="openstack/barbican-api-b567465d6-ch8tf" Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.741708 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1440ca21-e220-4178-b44c-06672479bc7c-public-tls-certs\") pod \"barbican-api-b567465d6-ch8tf\" (UID: \"1440ca21-e220-4178-b44c-06672479bc7c\") " pod="openstack/barbican-api-b567465d6-ch8tf" Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.741775 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/1440ca21-e220-4178-b44c-06672479bc7c-config-data\") pod \"barbican-api-b567465d6-ch8tf\" (UID: \"1440ca21-e220-4178-b44c-06672479bc7c\") " pod="openstack/barbican-api-b567465d6-ch8tf" Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.741896 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5w9m\" (UniqueName: \"kubernetes.io/projected/1440ca21-e220-4178-b44c-06672479bc7c-kube-api-access-n5w9m\") pod \"barbican-api-b567465d6-ch8tf\" (UID: \"1440ca21-e220-4178-b44c-06672479bc7c\") " pod="openstack/barbican-api-b567465d6-ch8tf" Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.741933 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1440ca21-e220-4178-b44c-06672479bc7c-config-data-custom\") pod \"barbican-api-b567465d6-ch8tf\" (UID: \"1440ca21-e220-4178-b44c-06672479bc7c\") " pod="openstack/barbican-api-b567465d6-ch8tf" Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.741969 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1440ca21-e220-4178-b44c-06672479bc7c-logs\") pod \"barbican-api-b567465d6-ch8tf\" (UID: \"1440ca21-e220-4178-b44c-06672479bc7c\") " pod="openstack/barbican-api-b567465d6-ch8tf" Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.843437 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1440ca21-e220-4178-b44c-06672479bc7c-logs\") pod \"barbican-api-b567465d6-ch8tf\" (UID: \"1440ca21-e220-4178-b44c-06672479bc7c\") " pod="openstack/barbican-api-b567465d6-ch8tf" Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.843578 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1440ca21-e220-4178-b44c-06672479bc7c-internal-tls-certs\") pod \"barbican-api-b567465d6-ch8tf\" (UID: \"1440ca21-e220-4178-b44c-06672479bc7c\") " pod="openstack/barbican-api-b567465d6-ch8tf" Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.843613 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1440ca21-e220-4178-b44c-06672479bc7c-combined-ca-bundle\") pod \"barbican-api-b567465d6-ch8tf\" (UID: \"1440ca21-e220-4178-b44c-06672479bc7c\") " pod="openstack/barbican-api-b567465d6-ch8tf" Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.843635 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1440ca21-e220-4178-b44c-06672479bc7c-public-tls-certs\") pod \"barbican-api-b567465d6-ch8tf\" (UID: \"1440ca21-e220-4178-b44c-06672479bc7c\") " pod="openstack/barbican-api-b567465d6-ch8tf" Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.843695 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1440ca21-e220-4178-b44c-06672479bc7c-config-data\") pod \"barbican-api-b567465d6-ch8tf\" (UID: \"1440ca21-e220-4178-b44c-06672479bc7c\") " pod="openstack/barbican-api-b567465d6-ch8tf" Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.843773 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5w9m\" (UniqueName: \"kubernetes.io/projected/1440ca21-e220-4178-b44c-06672479bc7c-kube-api-access-n5w9m\") pod \"barbican-api-b567465d6-ch8tf\" (UID: \"1440ca21-e220-4178-b44c-06672479bc7c\") " pod="openstack/barbican-api-b567465d6-ch8tf" Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.843807 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/1440ca21-e220-4178-b44c-06672479bc7c-config-data-custom\") pod \"barbican-api-b567465d6-ch8tf\" (UID: \"1440ca21-e220-4178-b44c-06672479bc7c\") " pod="openstack/barbican-api-b567465d6-ch8tf" Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.845779 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1440ca21-e220-4178-b44c-06672479bc7c-logs\") pod \"barbican-api-b567465d6-ch8tf\" (UID: \"1440ca21-e220-4178-b44c-06672479bc7c\") " pod="openstack/barbican-api-b567465d6-ch8tf" Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.853985 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1440ca21-e220-4178-b44c-06672479bc7c-config-data-custom\") pod \"barbican-api-b567465d6-ch8tf\" (UID: \"1440ca21-e220-4178-b44c-06672479bc7c\") " pod="openstack/barbican-api-b567465d6-ch8tf" Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.855007 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1440ca21-e220-4178-b44c-06672479bc7c-internal-tls-certs\") pod \"barbican-api-b567465d6-ch8tf\" (UID: \"1440ca21-e220-4178-b44c-06672479bc7c\") " pod="openstack/barbican-api-b567465d6-ch8tf" Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.857004 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1440ca21-e220-4178-b44c-06672479bc7c-public-tls-certs\") pod \"barbican-api-b567465d6-ch8tf\" (UID: \"1440ca21-e220-4178-b44c-06672479bc7c\") " pod="openstack/barbican-api-b567465d6-ch8tf" Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.867159 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1440ca21-e220-4178-b44c-06672479bc7c-config-data\") pod 
\"barbican-api-b567465d6-ch8tf\" (UID: \"1440ca21-e220-4178-b44c-06672479bc7c\") " pod="openstack/barbican-api-b567465d6-ch8tf" Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.876069 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1440ca21-e220-4178-b44c-06672479bc7c-combined-ca-bundle\") pod \"barbican-api-b567465d6-ch8tf\" (UID: \"1440ca21-e220-4178-b44c-06672479bc7c\") " pod="openstack/barbican-api-b567465d6-ch8tf" Dec 03 14:36:18 crc kubenswrapper[4751]: I1203 14:36:18.877021 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5w9m\" (UniqueName: \"kubernetes.io/projected/1440ca21-e220-4178-b44c-06672479bc7c-kube-api-access-n5w9m\") pod \"barbican-api-b567465d6-ch8tf\" (UID: \"1440ca21-e220-4178-b44c-06672479bc7c\") " pod="openstack/barbican-api-b567465d6-ch8tf" Dec 03 14:36:19 crc kubenswrapper[4751]: I1203 14:36:19.009100 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-b567465d6-ch8tf" Dec 03 14:36:19 crc kubenswrapper[4751]: I1203 14:36:19.145286 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55f7778fd-nbkr2" event={"ID":"4b931e90-9037-4b52-92c1-9c1d4d3fbba4","Type":"ContainerStarted","Data":"e5aaf58d350a58f3a1d8a7ed63551afb0ad32cfe850c379ec9ce8a14810b1cea"} Dec 03 14:36:19 crc kubenswrapper[4751]: I1203 14:36:19.145364 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-55f7778fd-nbkr2" Dec 03 14:36:19 crc kubenswrapper[4751]: I1203 14:36:19.145386 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-55f7778fd-nbkr2" Dec 03 14:36:19 crc kubenswrapper[4751]: I1203 14:36:19.148290 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-qkd8q" event={"ID":"6418aa65-88fe-4683-93d6-06b632a0bcd8","Type":"ContainerStarted","Data":"42d19883f61256bbad932362c00ae3e1bce917a7df937f6cf634a6bffdf5f542"} Dec 03 14:36:19 crc kubenswrapper[4751]: I1203 14:36:19.149511 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-848cf88cfc-qkd8q" Dec 03 14:36:19 crc kubenswrapper[4751]: I1203 14:36:19.153207 4751 generic.go:334] "Generic (PLEG): container finished" podID="4df7d14f-4a52-43be-9877-c5df9c015cc7" containerID="f74eebf05afb03acb40dab92e5dd6d14e9c474658599c36667cf2b6599a9e351" exitCode=0 Dec 03 14:36:19 crc kubenswrapper[4751]: I1203 14:36:19.153760 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-bgs8x" event={"ID":"4df7d14f-4a52-43be-9877-c5df9c015cc7","Type":"ContainerDied","Data":"f74eebf05afb03acb40dab92e5dd6d14e9c474658599c36667cf2b6599a9e351"} Dec 03 14:36:19 crc kubenswrapper[4751]: I1203 14:36:19.153852 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-cfc86c59b-x4m2l" Dec 03 14:36:19 crc 
kubenswrapper[4751]: I1203 14:36:19.166667 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-55f7778fd-nbkr2" podStartSLOduration=4.166644097 podStartE2EDuration="4.166644097s" podCreationTimestamp="2025-12-03 14:36:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:36:19.161139357 +0000 UTC m=+1386.149494584" watchObservedRunningTime="2025-12-03 14:36:19.166644097 +0000 UTC m=+1386.154999314" Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.167839 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-c469d65bd-rdqn6" event={"ID":"58307992-3054-4b05-b7c6-f768c2a1e849","Type":"ContainerStarted","Data":"e0c80ac044d10c83854a49ae0e4a45730d6b201657ff892e3b579b6e373e7757"} Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.170412 4751 generic.go:334] "Generic (PLEG): container finished" podID="4784bf8d-4315-4097-b729-1f21940a17bc" containerID="1c63976e3db62586993d795217316bd30f339b8441e6b64054a5db80e8519f5c" exitCode=0 Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.170470 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-5qnpr" event={"ID":"4784bf8d-4315-4097-b729-1f21940a17bc","Type":"ContainerDied","Data":"1c63976e3db62586993d795217316bd30f339b8441e6b64054a5db80e8519f5c"} Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.172954 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5dcd655495-dj2gs" event={"ID":"4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f","Type":"ContainerStarted","Data":"72f69cd8aed902d767f91c9a63ced091e92f7e84a9d1c4f226424ec1fefec9e1"} Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.190167 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-848cf88cfc-qkd8q" podStartSLOduration=5.190147015 
podStartE2EDuration="5.190147015s" podCreationTimestamp="2025-12-03 14:36:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:36:19.198035873 +0000 UTC m=+1386.186391100" watchObservedRunningTime="2025-12-03 14:36:20.190147015 +0000 UTC m=+1387.178502232" Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.319742 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-b567465d6-ch8tf"] Dec 03 14:36:20 crc kubenswrapper[4751]: W1203 14:36:20.337931 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1440ca21_e220_4178_b44c_06672479bc7c.slice/crio-13644db3bfd3d9c10584f73d089ffc66d23a90b9d0749c3227a0027c38a27c7a WatchSource:0}: Error finding container 13644db3bfd3d9c10584f73d089ffc66d23a90b9d0749c3227a0027c38a27c7a: Status 404 returned error can't find the container with id 13644db3bfd3d9c10584f73d089ffc66d23a90b9d0749c3227a0027c38a27c7a Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.480882 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tmdtc"] Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.484292 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tmdtc" Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.502254 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tmdtc"] Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.556408 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-bgs8x" Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.581935 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3506627c-9636-4af8-ad7f-78db94f0ff11-utilities\") pod \"community-operators-tmdtc\" (UID: \"3506627c-9636-4af8-ad7f-78db94f0ff11\") " pod="openshift-marketplace/community-operators-tmdtc" Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.582070 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq72k\" (UniqueName: \"kubernetes.io/projected/3506627c-9636-4af8-ad7f-78db94f0ff11-kube-api-access-mq72k\") pod \"community-operators-tmdtc\" (UID: \"3506627c-9636-4af8-ad7f-78db94f0ff11\") " pod="openshift-marketplace/community-operators-tmdtc" Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.582137 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3506627c-9636-4af8-ad7f-78db94f0ff11-catalog-content\") pod \"community-operators-tmdtc\" (UID: \"3506627c-9636-4af8-ad7f-78db94f0ff11\") " pod="openshift-marketplace/community-operators-tmdtc" Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.683874 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4df7d14f-4a52-43be-9877-c5df9c015cc7-scripts\") pod \"4df7d14f-4a52-43be-9877-c5df9c015cc7\" (UID: \"4df7d14f-4a52-43be-9877-c5df9c015cc7\") " Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.684008 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4df7d14f-4a52-43be-9877-c5df9c015cc7-config-data\") pod \"4df7d14f-4a52-43be-9877-c5df9c015cc7\" (UID: \"4df7d14f-4a52-43be-9877-c5df9c015cc7\") " Dec 
03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.684097 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcplz\" (UniqueName: \"kubernetes.io/projected/4df7d14f-4a52-43be-9877-c5df9c015cc7-kube-api-access-zcplz\") pod \"4df7d14f-4a52-43be-9877-c5df9c015cc7\" (UID: \"4df7d14f-4a52-43be-9877-c5df9c015cc7\") " Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.684133 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/4df7d14f-4a52-43be-9877-c5df9c015cc7-certs\") pod \"4df7d14f-4a52-43be-9877-c5df9c015cc7\" (UID: \"4df7d14f-4a52-43be-9877-c5df9c015cc7\") " Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.684158 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4df7d14f-4a52-43be-9877-c5df9c015cc7-combined-ca-bundle\") pod \"4df7d14f-4a52-43be-9877-c5df9c015cc7\" (UID: \"4df7d14f-4a52-43be-9877-c5df9c015cc7\") " Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.687046 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq72k\" (UniqueName: \"kubernetes.io/projected/3506627c-9636-4af8-ad7f-78db94f0ff11-kube-api-access-mq72k\") pod \"community-operators-tmdtc\" (UID: \"3506627c-9636-4af8-ad7f-78db94f0ff11\") " pod="openshift-marketplace/community-operators-tmdtc" Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.687160 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3506627c-9636-4af8-ad7f-78db94f0ff11-catalog-content\") pod \"community-operators-tmdtc\" (UID: \"3506627c-9636-4af8-ad7f-78db94f0ff11\") " pod="openshift-marketplace/community-operators-tmdtc" Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.687273 4751 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3506627c-9636-4af8-ad7f-78db94f0ff11-utilities\") pod \"community-operators-tmdtc\" (UID: \"3506627c-9636-4af8-ad7f-78db94f0ff11\") " pod="openshift-marketplace/community-operators-tmdtc" Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.687695 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3506627c-9636-4af8-ad7f-78db94f0ff11-catalog-content\") pod \"community-operators-tmdtc\" (UID: \"3506627c-9636-4af8-ad7f-78db94f0ff11\") " pod="openshift-marketplace/community-operators-tmdtc" Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.687938 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3506627c-9636-4af8-ad7f-78db94f0ff11-utilities\") pod \"community-operators-tmdtc\" (UID: \"3506627c-9636-4af8-ad7f-78db94f0ff11\") " pod="openshift-marketplace/community-operators-tmdtc" Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.690969 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4df7d14f-4a52-43be-9877-c5df9c015cc7-scripts" (OuterVolumeSpecName: "scripts") pod "4df7d14f-4a52-43be-9877-c5df9c015cc7" (UID: "4df7d14f-4a52-43be-9877-c5df9c015cc7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.691538 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4df7d14f-4a52-43be-9877-c5df9c015cc7-kube-api-access-zcplz" (OuterVolumeSpecName: "kube-api-access-zcplz") pod "4df7d14f-4a52-43be-9877-c5df9c015cc7" (UID: "4df7d14f-4a52-43be-9877-c5df9c015cc7"). InnerVolumeSpecName "kube-api-access-zcplz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.692652 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4df7d14f-4a52-43be-9877-c5df9c015cc7-certs" (OuterVolumeSpecName: "certs") pod "4df7d14f-4a52-43be-9877-c5df9c015cc7" (UID: "4df7d14f-4a52-43be-9877-c5df9c015cc7"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.711721 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq72k\" (UniqueName: \"kubernetes.io/projected/3506627c-9636-4af8-ad7f-78db94f0ff11-kube-api-access-mq72k\") pod \"community-operators-tmdtc\" (UID: \"3506627c-9636-4af8-ad7f-78db94f0ff11\") " pod="openshift-marketplace/community-operators-tmdtc" Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.730076 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4df7d14f-4a52-43be-9877-c5df9c015cc7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4df7d14f-4a52-43be-9877-c5df9c015cc7" (UID: "4df7d14f-4a52-43be-9877-c5df9c015cc7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.733777 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4df7d14f-4a52-43be-9877-c5df9c015cc7-config-data" (OuterVolumeSpecName: "config-data") pod "4df7d14f-4a52-43be-9877-c5df9c015cc7" (UID: "4df7d14f-4a52-43be-9877-c5df9c015cc7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.788982 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4df7d14f-4a52-43be-9877-c5df9c015cc7-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.789228 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcplz\" (UniqueName: \"kubernetes.io/projected/4df7d14f-4a52-43be-9877-c5df9c015cc7-kube-api-access-zcplz\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.789313 4751 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/4df7d14f-4a52-43be-9877-c5df9c015cc7-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.789440 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4df7d14f-4a52-43be-9877-c5df9c015cc7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.789521 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4df7d14f-4a52-43be-9877-c5df9c015cc7-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:20 crc kubenswrapper[4751]: I1203 14:36:20.837434 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tmdtc" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.237936 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-bgs8x" event={"ID":"4df7d14f-4a52-43be-9877-c5df9c015cc7","Type":"ContainerDied","Data":"8eb90a04b5fe0caa0613ada1569a95c8003a835945534bb50c59e00e8ec1f479"} Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.238240 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8eb90a04b5fe0caa0613ada1569a95c8003a835945534bb50c59e00e8ec1f479" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.238306 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-bgs8x" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.246026 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b567465d6-ch8tf" event={"ID":"1440ca21-e220-4178-b44c-06672479bc7c","Type":"ContainerStarted","Data":"0166b79fc1bdb9db67d728ec7d411ece75182f5839d93c4f2cec71b96d262480"} Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.246078 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b567465d6-ch8tf" event={"ID":"1440ca21-e220-4178-b44c-06672479bc7c","Type":"ContainerStarted","Data":"f9f1c5107a7fd9373fde7f6f16974788d342901f3182857471ea657b50656e41"} Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.246087 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b567465d6-ch8tf" event={"ID":"1440ca21-e220-4178-b44c-06672479bc7c","Type":"ContainerStarted","Data":"13644db3bfd3d9c10584f73d089ffc66d23a90b9d0749c3227a0027c38a27c7a"} Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.247228 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-b567465d6-ch8tf" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.247251 4751 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-b567465d6-ch8tf" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.271513 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-c469d65bd-rdqn6" event={"ID":"58307992-3054-4b05-b7c6-f768c2a1e849","Type":"ContainerStarted","Data":"f290361418a357d0705ba8dc391de2ed4ec5fa4d8eda3beea02c8cf03d68e92b"} Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.280855 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5dcd655495-dj2gs" event={"ID":"4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f","Type":"ContainerStarted","Data":"df5819e0a2799bcff6036b59331b0f082a84819e0ffbfe070cae188e5ec41a7b"} Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.337981 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5dcd655495-dj2gs" podStartSLOduration=3.835470143 podStartE2EDuration="7.33795629s" podCreationTimestamp="2025-12-03 14:36:14 +0000 UTC" firstStartedPulling="2025-12-03 14:36:16.268000009 +0000 UTC m=+1383.256355226" lastFinishedPulling="2025-12-03 14:36:19.770486156 +0000 UTC m=+1386.758841373" observedRunningTime="2025-12-03 14:36:21.307624913 +0000 UTC m=+1388.295980160" watchObservedRunningTime="2025-12-03 14:36:21.33795629 +0000 UTC m=+1388.326311507" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.340239 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-b567465d6-ch8tf" podStartSLOduration=3.3402259020000002 podStartE2EDuration="3.340225902s" podCreationTimestamp="2025-12-03 14:36:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:36:21.285665365 +0000 UTC m=+1388.274020582" watchObservedRunningTime="2025-12-03 14:36:21.340225902 +0000 UTC m=+1388.328581119" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 
14:36:21.371393 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-n9kjc"] Dec 03 14:36:21 crc kubenswrapper[4751]: E1203 14:36:21.371901 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4df7d14f-4a52-43be-9877-c5df9c015cc7" containerName="cloudkitty-db-sync" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.371913 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="4df7d14f-4a52-43be-9877-c5df9c015cc7" containerName="cloudkitty-db-sync" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.372108 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="4df7d14f-4a52-43be-9877-c5df9c015cc7" containerName="cloudkitty-db-sync" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.372872 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-n9kjc" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.375079 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-c469d65bd-rdqn6" podStartSLOduration=3.172269493 podStartE2EDuration="6.375055741s" podCreationTimestamp="2025-12-03 14:36:15 +0000 UTC" firstStartedPulling="2025-12-03 14:36:16.574074732 +0000 UTC m=+1383.562429949" lastFinishedPulling="2025-12-03 14:36:19.77686098 +0000 UTC m=+1386.765216197" observedRunningTime="2025-12-03 14:36:21.328017669 +0000 UTC m=+1388.316372886" watchObservedRunningTime="2025-12-03 14:36:21.375055741 +0000 UTC m=+1388.363410958" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.376269 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-mdxtz" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.376552 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.376743 4751 reflector.go:368] Caches populated 
for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.377079 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.377282 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.406907 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-n9kjc"] Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.501963 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tmdtc"] Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.507688 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8729b9a9-11fa-432d-bc45-09172fc6bbc7-certs\") pod \"cloudkitty-storageinit-n9kjc\" (UID: \"8729b9a9-11fa-432d-bc45-09172fc6bbc7\") " pod="openstack/cloudkitty-storageinit-n9kjc" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.507740 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn6rw\" (UniqueName: \"kubernetes.io/projected/8729b9a9-11fa-432d-bc45-09172fc6bbc7-kube-api-access-nn6rw\") pod \"cloudkitty-storageinit-n9kjc\" (UID: \"8729b9a9-11fa-432d-bc45-09172fc6bbc7\") " pod="openstack/cloudkitty-storageinit-n9kjc" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.507784 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8729b9a9-11fa-432d-bc45-09172fc6bbc7-config-data\") pod \"cloudkitty-storageinit-n9kjc\" (UID: \"8729b9a9-11fa-432d-bc45-09172fc6bbc7\") " pod="openstack/cloudkitty-storageinit-n9kjc" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 
14:36:21.507870 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8729b9a9-11fa-432d-bc45-09172fc6bbc7-scripts\") pod \"cloudkitty-storageinit-n9kjc\" (UID: \"8729b9a9-11fa-432d-bc45-09172fc6bbc7\") " pod="openstack/cloudkitty-storageinit-n9kjc" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.508570 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8729b9a9-11fa-432d-bc45-09172fc6bbc7-combined-ca-bundle\") pod \"cloudkitty-storageinit-n9kjc\" (UID: \"8729b9a9-11fa-432d-bc45-09172fc6bbc7\") " pod="openstack/cloudkitty-storageinit-n9kjc" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.516271 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8svh4" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.516320 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8svh4" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.571526 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8svh4" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.610912 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8729b9a9-11fa-432d-bc45-09172fc6bbc7-combined-ca-bundle\") pod \"cloudkitty-storageinit-n9kjc\" (UID: \"8729b9a9-11fa-432d-bc45-09172fc6bbc7\") " pod="openstack/cloudkitty-storageinit-n9kjc" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.611043 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8729b9a9-11fa-432d-bc45-09172fc6bbc7-certs\") pod \"cloudkitty-storageinit-n9kjc\" (UID: 
\"8729b9a9-11fa-432d-bc45-09172fc6bbc7\") " pod="openstack/cloudkitty-storageinit-n9kjc" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.611064 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn6rw\" (UniqueName: \"kubernetes.io/projected/8729b9a9-11fa-432d-bc45-09172fc6bbc7-kube-api-access-nn6rw\") pod \"cloudkitty-storageinit-n9kjc\" (UID: \"8729b9a9-11fa-432d-bc45-09172fc6bbc7\") " pod="openstack/cloudkitty-storageinit-n9kjc" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.611098 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8729b9a9-11fa-432d-bc45-09172fc6bbc7-config-data\") pod \"cloudkitty-storageinit-n9kjc\" (UID: \"8729b9a9-11fa-432d-bc45-09172fc6bbc7\") " pod="openstack/cloudkitty-storageinit-n9kjc" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.611159 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8729b9a9-11fa-432d-bc45-09172fc6bbc7-scripts\") pod \"cloudkitty-storageinit-n9kjc\" (UID: \"8729b9a9-11fa-432d-bc45-09172fc6bbc7\") " pod="openstack/cloudkitty-storageinit-n9kjc" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.629258 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8729b9a9-11fa-432d-bc45-09172fc6bbc7-scripts\") pod \"cloudkitty-storageinit-n9kjc\" (UID: \"8729b9a9-11fa-432d-bc45-09172fc6bbc7\") " pod="openstack/cloudkitty-storageinit-n9kjc" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.629443 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8729b9a9-11fa-432d-bc45-09172fc6bbc7-certs\") pod \"cloudkitty-storageinit-n9kjc\" (UID: \"8729b9a9-11fa-432d-bc45-09172fc6bbc7\") " pod="openstack/cloudkitty-storageinit-n9kjc" Dec 03 14:36:21 crc 
kubenswrapper[4751]: I1203 14:36:21.629675 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8729b9a9-11fa-432d-bc45-09172fc6bbc7-config-data\") pod \"cloudkitty-storageinit-n9kjc\" (UID: \"8729b9a9-11fa-432d-bc45-09172fc6bbc7\") " pod="openstack/cloudkitty-storageinit-n9kjc" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.632920 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8729b9a9-11fa-432d-bc45-09172fc6bbc7-combined-ca-bundle\") pod \"cloudkitty-storageinit-n9kjc\" (UID: \"8729b9a9-11fa-432d-bc45-09172fc6bbc7\") " pod="openstack/cloudkitty-storageinit-n9kjc" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.640872 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn6rw\" (UniqueName: \"kubernetes.io/projected/8729b9a9-11fa-432d-bc45-09172fc6bbc7-kube-api-access-nn6rw\") pod \"cloudkitty-storageinit-n9kjc\" (UID: \"8729b9a9-11fa-432d-bc45-09172fc6bbc7\") " pod="openstack/cloudkitty-storageinit-n9kjc" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.700977 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-n9kjc" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.760235 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-5qnpr" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.932274 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjdz6\" (UniqueName: \"kubernetes.io/projected/4784bf8d-4315-4097-b729-1f21940a17bc-kube-api-access-cjdz6\") pod \"4784bf8d-4315-4097-b729-1f21940a17bc\" (UID: \"4784bf8d-4315-4097-b729-1f21940a17bc\") " Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.932729 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4784bf8d-4315-4097-b729-1f21940a17bc-config-data\") pod \"4784bf8d-4315-4097-b729-1f21940a17bc\" (UID: \"4784bf8d-4315-4097-b729-1f21940a17bc\") " Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.932774 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4784bf8d-4315-4097-b729-1f21940a17bc-db-sync-config-data\") pod \"4784bf8d-4315-4097-b729-1f21940a17bc\" (UID: \"4784bf8d-4315-4097-b729-1f21940a17bc\") " Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.932855 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4784bf8d-4315-4097-b729-1f21940a17bc-combined-ca-bundle\") pod \"4784bf8d-4315-4097-b729-1f21940a17bc\" (UID: \"4784bf8d-4315-4097-b729-1f21940a17bc\") " Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.932986 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4784bf8d-4315-4097-b729-1f21940a17bc-etc-machine-id\") pod \"4784bf8d-4315-4097-b729-1f21940a17bc\" (UID: \"4784bf8d-4315-4097-b729-1f21940a17bc\") " Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.933037 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/4784bf8d-4315-4097-b729-1f21940a17bc-scripts\") pod \"4784bf8d-4315-4097-b729-1f21940a17bc\" (UID: \"4784bf8d-4315-4097-b729-1f21940a17bc\") " Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.934883 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4784bf8d-4315-4097-b729-1f21940a17bc-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4784bf8d-4315-4097-b729-1f21940a17bc" (UID: "4784bf8d-4315-4097-b729-1f21940a17bc"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.940975 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4784bf8d-4315-4097-b729-1f21940a17bc-scripts" (OuterVolumeSpecName: "scripts") pod "4784bf8d-4315-4097-b729-1f21940a17bc" (UID: "4784bf8d-4315-4097-b729-1f21940a17bc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.941163 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4784bf8d-4315-4097-b729-1f21940a17bc-kube-api-access-cjdz6" (OuterVolumeSpecName: "kube-api-access-cjdz6") pod "4784bf8d-4315-4097-b729-1f21940a17bc" (UID: "4784bf8d-4315-4097-b729-1f21940a17bc"). InnerVolumeSpecName "kube-api-access-cjdz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.964680 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4784bf8d-4315-4097-b729-1f21940a17bc-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4784bf8d-4315-4097-b729-1f21940a17bc" (UID: "4784bf8d-4315-4097-b729-1f21940a17bc"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:21 crc kubenswrapper[4751]: I1203 14:36:21.987379 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4784bf8d-4315-4097-b729-1f21940a17bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4784bf8d-4315-4097-b729-1f21940a17bc" (UID: "4784bf8d-4315-4097-b729-1f21940a17bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.030849 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4784bf8d-4315-4097-b729-1f21940a17bc-config-data" (OuterVolumeSpecName: "config-data") pod "4784bf8d-4315-4097-b729-1f21940a17bc" (UID: "4784bf8d-4315-4097-b729-1f21940a17bc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.039367 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4784bf8d-4315-4097-b729-1f21940a17bc-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.039396 4751 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4784bf8d-4315-4097-b729-1f21940a17bc-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.039406 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4784bf8d-4315-4097-b729-1f21940a17bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.039415 4751 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4784bf8d-4315-4097-b729-1f21940a17bc-etc-machine-id\") on node \"crc\" DevicePath 
\"\"" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.039423 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4784bf8d-4315-4097-b729-1f21940a17bc-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.039431 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjdz6\" (UniqueName: \"kubernetes.io/projected/4784bf8d-4315-4097-b729-1f21940a17bc-kube-api-access-cjdz6\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.177031 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-64fb975db4-zhnvr" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.291228 4751 generic.go:334] "Generic (PLEG): container finished" podID="90ae9890-60ca-4467-910c-a0d7459b189d" containerID="f9df3666dcd520f70624df73a6db9a57b8a82ef464068a5628d967755ab6d597" exitCode=0 Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.291363 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64fb975db4-zhnvr" event={"ID":"90ae9890-60ca-4467-910c-a0d7459b189d","Type":"ContainerDied","Data":"f9df3666dcd520f70624df73a6db9a57b8a82ef464068a5628d967755ab6d597"} Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.291398 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64fb975db4-zhnvr" event={"ID":"90ae9890-60ca-4467-910c-a0d7459b189d","Type":"ContainerDied","Data":"3ca3081b173f0212f9e4cd608bcaad8bbedf8591db454971c5d63f45ab290861"} Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.291446 4751 scope.go:117] "RemoveContainer" containerID="cc583d54ccd93238db50746e8b35597e2cdc5954daec6159a5ee2eea6c733687" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.291457 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-64fb975db4-zhnvr" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.300020 4751 generic.go:334] "Generic (PLEG): container finished" podID="3506627c-9636-4af8-ad7f-78db94f0ff11" containerID="425686ede2060d692eea11dfccf88d2d53fac4f7e72f40faed2fd89f9f25824f" exitCode=0 Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.300173 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmdtc" event={"ID":"3506627c-9636-4af8-ad7f-78db94f0ff11","Type":"ContainerDied","Data":"425686ede2060d692eea11dfccf88d2d53fac4f7e72f40faed2fd89f9f25824f"} Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.300207 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmdtc" event={"ID":"3506627c-9636-4af8-ad7f-78db94f0ff11","Type":"ContainerStarted","Data":"cbca48a20bd8feb1a6a8ace706988711b04270d2f425fb47789347257cd7cc36"} Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.302841 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-5qnpr" event={"ID":"4784bf8d-4315-4097-b729-1f21940a17bc","Type":"ContainerDied","Data":"14e3dbc94365a434c8cb783c952ce106cab41956f910790139b2c65f32a8a218"} Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.302859 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14e3dbc94365a434c8cb783c952ce106cab41956f910790139b2c65f32a8a218" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.302953 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-5qnpr" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.341154 4751 scope.go:117] "RemoveContainer" containerID="f9df3666dcd520f70624df73a6db9a57b8a82ef464068a5628d967755ab6d597" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.345021 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/90ae9890-60ca-4467-910c-a0d7459b189d-httpd-config\") pod \"90ae9890-60ca-4467-910c-a0d7459b189d\" (UID: \"90ae9890-60ca-4467-910c-a0d7459b189d\") " Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.345066 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62jsr\" (UniqueName: \"kubernetes.io/projected/90ae9890-60ca-4467-910c-a0d7459b189d-kube-api-access-62jsr\") pod \"90ae9890-60ca-4467-910c-a0d7459b189d\" (UID: \"90ae9890-60ca-4467-910c-a0d7459b189d\") " Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.345213 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/90ae9890-60ca-4467-910c-a0d7459b189d-ovndb-tls-certs\") pod \"90ae9890-60ca-4467-910c-a0d7459b189d\" (UID: \"90ae9890-60ca-4467-910c-a0d7459b189d\") " Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.345271 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/90ae9890-60ca-4467-910c-a0d7459b189d-config\") pod \"90ae9890-60ca-4467-910c-a0d7459b189d\" (UID: \"90ae9890-60ca-4467-910c-a0d7459b189d\") " Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.345349 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90ae9890-60ca-4467-910c-a0d7459b189d-combined-ca-bundle\") pod \"90ae9890-60ca-4467-910c-a0d7459b189d\" (UID: 
\"90ae9890-60ca-4467-910c-a0d7459b189d\") " Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.348885 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-n9kjc"] Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.361126 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90ae9890-60ca-4467-910c-a0d7459b189d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "90ae9890-60ca-4467-910c-a0d7459b189d" (UID: "90ae9890-60ca-4467-910c-a0d7459b189d"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.401369 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90ae9890-60ca-4467-910c-a0d7459b189d-kube-api-access-62jsr" (OuterVolumeSpecName: "kube-api-access-62jsr") pod "90ae9890-60ca-4467-910c-a0d7459b189d" (UID: "90ae9890-60ca-4467-910c-a0d7459b189d"). InnerVolumeSpecName "kube-api-access-62jsr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.401501 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8svh4" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.439945 4751 scope.go:117] "RemoveContainer" containerID="cc583d54ccd93238db50746e8b35597e2cdc5954daec6159a5ee2eea6c733687" Dec 03 14:36:22 crc kubenswrapper[4751]: E1203 14:36:22.441040 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc583d54ccd93238db50746e8b35597e2cdc5954daec6159a5ee2eea6c733687\": container with ID starting with cc583d54ccd93238db50746e8b35597e2cdc5954daec6159a5ee2eea6c733687 not found: ID does not exist" containerID="cc583d54ccd93238db50746e8b35597e2cdc5954daec6159a5ee2eea6c733687" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.441073 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc583d54ccd93238db50746e8b35597e2cdc5954daec6159a5ee2eea6c733687"} err="failed to get container status \"cc583d54ccd93238db50746e8b35597e2cdc5954daec6159a5ee2eea6c733687\": rpc error: code = NotFound desc = could not find container \"cc583d54ccd93238db50746e8b35597e2cdc5954daec6159a5ee2eea6c733687\": container with ID starting with cc583d54ccd93238db50746e8b35597e2cdc5954daec6159a5ee2eea6c733687 not found: ID does not exist" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.441091 4751 scope.go:117] "RemoveContainer" containerID="f9df3666dcd520f70624df73a6db9a57b8a82ef464068a5628d967755ab6d597" Dec 03 14:36:22 crc kubenswrapper[4751]: E1203 14:36:22.448878 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9df3666dcd520f70624df73a6db9a57b8a82ef464068a5628d967755ab6d597\": container with ID starting with 
f9df3666dcd520f70624df73a6db9a57b8a82ef464068a5628d967755ab6d597 not found: ID does not exist" containerID="f9df3666dcd520f70624df73a6db9a57b8a82ef464068a5628d967755ab6d597" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.448959 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9df3666dcd520f70624df73a6db9a57b8a82ef464068a5628d967755ab6d597"} err="failed to get container status \"f9df3666dcd520f70624df73a6db9a57b8a82ef464068a5628d967755ab6d597\": rpc error: code = NotFound desc = could not find container \"f9df3666dcd520f70624df73a6db9a57b8a82ef464068a5628d967755ab6d597\": container with ID starting with f9df3666dcd520f70624df73a6db9a57b8a82ef464068a5628d967755ab6d597 not found: ID does not exist" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.458864 4751 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/90ae9890-60ca-4467-910c-a0d7459b189d-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.459044 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62jsr\" (UniqueName: \"kubernetes.io/projected/90ae9890-60ca-4467-910c-a0d7459b189d-kube-api-access-62jsr\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.481865 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 14:36:22 crc kubenswrapper[4751]: E1203 14:36:22.484543 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90ae9890-60ca-4467-910c-a0d7459b189d" containerName="neutron-httpd" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.484588 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="90ae9890-60ca-4467-910c-a0d7459b189d" containerName="neutron-httpd" Dec 03 14:36:22 crc kubenswrapper[4751]: E1203 14:36:22.484678 4751 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="90ae9890-60ca-4467-910c-a0d7459b189d" containerName="neutron-api" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.484690 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="90ae9890-60ca-4467-910c-a0d7459b189d" containerName="neutron-api" Dec 03 14:36:22 crc kubenswrapper[4751]: E1203 14:36:22.484717 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4784bf8d-4315-4097-b729-1f21940a17bc" containerName="cinder-db-sync" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.484724 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="4784bf8d-4315-4097-b729-1f21940a17bc" containerName="cinder-db-sync" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.492054 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="90ae9890-60ca-4467-910c-a0d7459b189d" containerName="neutron-api" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.492123 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="90ae9890-60ca-4467-910c-a0d7459b189d" containerName="neutron-httpd" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.492181 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="4784bf8d-4315-4097-b729-1f21940a17bc" containerName="cinder-db-sync" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.497439 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.509982 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-lwkf5" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.510387 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.510561 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.524698 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.581796 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90ae9890-60ca-4467-910c-a0d7459b189d-config" (OuterVolumeSpecName: "config") pod "90ae9890-60ca-4467-910c-a0d7459b189d" (UID: "90ae9890-60ca-4467-910c-a0d7459b189d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.583221 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.606483 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90ae9890-60ca-4467-910c-a0d7459b189d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90ae9890-60ca-4467-910c-a0d7459b189d" (UID: "90ae9890-60ca-4467-910c-a0d7459b189d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.620528 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-qkd8q"] Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.620963 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-848cf88cfc-qkd8q" podUID="6418aa65-88fe-4683-93d6-06b632a0bcd8" containerName="dnsmasq-dns" containerID="cri-o://42d19883f61256bbad932362c00ae3e1bce917a7df937f6cf634a6bffdf5f542" gracePeriod=10 Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.668710 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx7fr\" (UniqueName: \"kubernetes.io/projected/f903006b-ff42-40bb-9572-6e907afa1f0e-kube-api-access-wx7fr\") pod \"cinder-scheduler-0\" (UID: \"f903006b-ff42-40bb-9572-6e907afa1f0e\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.668754 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f903006b-ff42-40bb-9572-6e907afa1f0e-config-data\") pod \"cinder-scheduler-0\" (UID: \"f903006b-ff42-40bb-9572-6e907afa1f0e\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.669148 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f903006b-ff42-40bb-9572-6e907afa1f0e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f903006b-ff42-40bb-9572-6e907afa1f0e\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.669294 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f903006b-ff42-40bb-9572-6e907afa1f0e-scripts\") pod \"cinder-scheduler-0\" (UID: \"f903006b-ff42-40bb-9572-6e907afa1f0e\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.669348 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f903006b-ff42-40bb-9572-6e907afa1f0e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f903006b-ff42-40bb-9572-6e907afa1f0e\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.669398 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f903006b-ff42-40bb-9572-6e907afa1f0e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f903006b-ff42-40bb-9572-6e907afa1f0e\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.669528 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90ae9890-60ca-4467-910c-a0d7459b189d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.669543 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/90ae9890-60ca-4467-910c-a0d7459b189d-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.676908 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90ae9890-60ca-4467-910c-a0d7459b189d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "90ae9890-60ca-4467-910c-a0d7459b189d" (UID: "90ae9890-60ca-4467-910c-a0d7459b189d"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.689515 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-j8j2p"] Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.691222 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-j8j2p" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.717798 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-j8j2p"] Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.729863 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.736229 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.739034 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.747402 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.771472 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f903006b-ff42-40bb-9572-6e907afa1f0e-scripts\") pod \"cinder-scheduler-0\" (UID: \"f903006b-ff42-40bb-9572-6e907afa1f0e\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.771584 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f903006b-ff42-40bb-9572-6e907afa1f0e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f903006b-ff42-40bb-9572-6e907afa1f0e\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.771688 4751 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f903006b-ff42-40bb-9572-6e907afa1f0e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f903006b-ff42-40bb-9572-6e907afa1f0e\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.771766 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f903006b-ff42-40bb-9572-6e907afa1f0e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f903006b-ff42-40bb-9572-6e907afa1f0e\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.771829 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx7fr\" (UniqueName: \"kubernetes.io/projected/f903006b-ff42-40bb-9572-6e907afa1f0e-kube-api-access-wx7fr\") pod \"cinder-scheduler-0\" (UID: \"f903006b-ff42-40bb-9572-6e907afa1f0e\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.771865 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f903006b-ff42-40bb-9572-6e907afa1f0e-config-data\") pod \"cinder-scheduler-0\" (UID: \"f903006b-ff42-40bb-9572-6e907afa1f0e\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.772020 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f903006b-ff42-40bb-9572-6e907afa1f0e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f903006b-ff42-40bb-9572-6e907afa1f0e\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.772187 4751 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/90ae9890-60ca-4467-910c-a0d7459b189d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.778214 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f903006b-ff42-40bb-9572-6e907afa1f0e-config-data\") pod \"cinder-scheduler-0\" (UID: \"f903006b-ff42-40bb-9572-6e907afa1f0e\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.782987 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f903006b-ff42-40bb-9572-6e907afa1f0e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f903006b-ff42-40bb-9572-6e907afa1f0e\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.784486 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f903006b-ff42-40bb-9572-6e907afa1f0e-scripts\") pod \"cinder-scheduler-0\" (UID: \"f903006b-ff42-40bb-9572-6e907afa1f0e\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.799014 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx7fr\" (UniqueName: \"kubernetes.io/projected/f903006b-ff42-40bb-9572-6e907afa1f0e-kube-api-access-wx7fr\") pod \"cinder-scheduler-0\" (UID: \"f903006b-ff42-40bb-9572-6e907afa1f0e\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.815271 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f903006b-ff42-40bb-9572-6e907afa1f0e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f903006b-ff42-40bb-9572-6e907afa1f0e\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.876078 4751 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2249aa73-cbf1-4a6b-a893-d2242c236c6d-logs\") pod \"cinder-api-0\" (UID: \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\") " pod="openstack/cinder-api-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.876420 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1846d92-9079-4edd-9c41-fdfba9354a2a-dns-svc\") pod \"dnsmasq-dns-6578955fd5-j8j2p\" (UID: \"b1846d92-9079-4edd-9c41-fdfba9354a2a\") " pod="openstack/dnsmasq-dns-6578955fd5-j8j2p" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.876463 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1846d92-9079-4edd-9c41-fdfba9354a2a-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-j8j2p\" (UID: \"b1846d92-9079-4edd-9c41-fdfba9354a2a\") " pod="openstack/dnsmasq-dns-6578955fd5-j8j2p" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.876499 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2249aa73-cbf1-4a6b-a893-d2242c236c6d-config-data\") pod \"cinder-api-0\" (UID: \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\") " pod="openstack/cinder-api-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.876837 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2249aa73-cbf1-4a6b-a893-d2242c236c6d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\") " pod="openstack/cinder-api-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.876884 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b1846d92-9079-4edd-9c41-fdfba9354a2a-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-j8j2p\" (UID: \"b1846d92-9079-4edd-9c41-fdfba9354a2a\") " pod="openstack/dnsmasq-dns-6578955fd5-j8j2p" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.876907 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2249aa73-cbf1-4a6b-a893-d2242c236c6d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\") " pod="openstack/cinder-api-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.876968 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1846d92-9079-4edd-9c41-fdfba9354a2a-config\") pod \"dnsmasq-dns-6578955fd5-j8j2p\" (UID: \"b1846d92-9079-4edd-9c41-fdfba9354a2a\") " pod="openstack/dnsmasq-dns-6578955fd5-j8j2p" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.877010 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft5lv\" (UniqueName: \"kubernetes.io/projected/b1846d92-9079-4edd-9c41-fdfba9354a2a-kube-api-access-ft5lv\") pod \"dnsmasq-dns-6578955fd5-j8j2p\" (UID: \"b1846d92-9079-4edd-9c41-fdfba9354a2a\") " pod="openstack/dnsmasq-dns-6578955fd5-j8j2p" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.877053 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1846d92-9079-4edd-9c41-fdfba9354a2a-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-j8j2p\" (UID: \"b1846d92-9079-4edd-9c41-fdfba9354a2a\") " pod="openstack/dnsmasq-dns-6578955fd5-j8j2p" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.877102 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs2c8\" (UniqueName: \"kubernetes.io/projected/2249aa73-cbf1-4a6b-a893-d2242c236c6d-kube-api-access-fs2c8\") pod \"cinder-api-0\" (UID: \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\") " pod="openstack/cinder-api-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.877288 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2249aa73-cbf1-4a6b-a893-d2242c236c6d-config-data-custom\") pod \"cinder-api-0\" (UID: \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\") " pod="openstack/cinder-api-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.877364 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2249aa73-cbf1-4a6b-a893-d2242c236c6d-scripts\") pod \"cinder-api-0\" (UID: \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\") " pod="openstack/cinder-api-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.980303 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1846d92-9079-4edd-9c41-fdfba9354a2a-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-j8j2p\" (UID: \"b1846d92-9079-4edd-9c41-fdfba9354a2a\") " pod="openstack/dnsmasq-dns-6578955fd5-j8j2p" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.980382 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2249aa73-cbf1-4a6b-a893-d2242c236c6d-config-data\") pod \"cinder-api-0\" (UID: \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\") " pod="openstack/cinder-api-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.980463 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/2249aa73-cbf1-4a6b-a893-d2242c236c6d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\") " pod="openstack/cinder-api-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.980493 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b1846d92-9079-4edd-9c41-fdfba9354a2a-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-j8j2p\" (UID: \"b1846d92-9079-4edd-9c41-fdfba9354a2a\") " pod="openstack/dnsmasq-dns-6578955fd5-j8j2p" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.980510 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2249aa73-cbf1-4a6b-a893-d2242c236c6d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\") " pod="openstack/cinder-api-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.980547 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1846d92-9079-4edd-9c41-fdfba9354a2a-config\") pod \"dnsmasq-dns-6578955fd5-j8j2p\" (UID: \"b1846d92-9079-4edd-9c41-fdfba9354a2a\") " pod="openstack/dnsmasq-dns-6578955fd5-j8j2p" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.980574 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft5lv\" (UniqueName: \"kubernetes.io/projected/b1846d92-9079-4edd-9c41-fdfba9354a2a-kube-api-access-ft5lv\") pod \"dnsmasq-dns-6578955fd5-j8j2p\" (UID: \"b1846d92-9079-4edd-9c41-fdfba9354a2a\") " pod="openstack/dnsmasq-dns-6578955fd5-j8j2p" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.980600 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1846d92-9079-4edd-9c41-fdfba9354a2a-ovsdbserver-sb\") pod 
\"dnsmasq-dns-6578955fd5-j8j2p\" (UID: \"b1846d92-9079-4edd-9c41-fdfba9354a2a\") " pod="openstack/dnsmasq-dns-6578955fd5-j8j2p" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.980629 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs2c8\" (UniqueName: \"kubernetes.io/projected/2249aa73-cbf1-4a6b-a893-d2242c236c6d-kube-api-access-fs2c8\") pod \"cinder-api-0\" (UID: \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\") " pod="openstack/cinder-api-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.980697 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2249aa73-cbf1-4a6b-a893-d2242c236c6d-config-data-custom\") pod \"cinder-api-0\" (UID: \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\") " pod="openstack/cinder-api-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.980728 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2249aa73-cbf1-4a6b-a893-d2242c236c6d-scripts\") pod \"cinder-api-0\" (UID: \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\") " pod="openstack/cinder-api-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.980746 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2249aa73-cbf1-4a6b-a893-d2242c236c6d-logs\") pod \"cinder-api-0\" (UID: \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\") " pod="openstack/cinder-api-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.980760 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1846d92-9079-4edd-9c41-fdfba9354a2a-dns-svc\") pod \"dnsmasq-dns-6578955fd5-j8j2p\" (UID: \"b1846d92-9079-4edd-9c41-fdfba9354a2a\") " pod="openstack/dnsmasq-dns-6578955fd5-j8j2p" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.981385 4751 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1846d92-9079-4edd-9c41-fdfba9354a2a-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-j8j2p\" (UID: \"b1846d92-9079-4edd-9c41-fdfba9354a2a\") " pod="openstack/dnsmasq-dns-6578955fd5-j8j2p" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.981470 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1846d92-9079-4edd-9c41-fdfba9354a2a-dns-svc\") pod \"dnsmasq-dns-6578955fd5-j8j2p\" (UID: \"b1846d92-9079-4edd-9c41-fdfba9354a2a\") " pod="openstack/dnsmasq-dns-6578955fd5-j8j2p" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.981981 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1846d92-9079-4edd-9c41-fdfba9354a2a-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-j8j2p\" (UID: \"b1846d92-9079-4edd-9c41-fdfba9354a2a\") " pod="openstack/dnsmasq-dns-6578955fd5-j8j2p" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.982721 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2249aa73-cbf1-4a6b-a893-d2242c236c6d-logs\") pod \"cinder-api-0\" (UID: \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\") " pod="openstack/cinder-api-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.982784 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2249aa73-cbf1-4a6b-a893-d2242c236c6d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\") " pod="openstack/cinder-api-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.985232 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b1846d92-9079-4edd-9c41-fdfba9354a2a-dns-swift-storage-0\") 
pod \"dnsmasq-dns-6578955fd5-j8j2p\" (UID: \"b1846d92-9079-4edd-9c41-fdfba9354a2a\") " pod="openstack/dnsmasq-dns-6578955fd5-j8j2p" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.986401 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1846d92-9079-4edd-9c41-fdfba9354a2a-config\") pod \"dnsmasq-dns-6578955fd5-j8j2p\" (UID: \"b1846d92-9079-4edd-9c41-fdfba9354a2a\") " pod="openstack/dnsmasq-dns-6578955fd5-j8j2p" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.987982 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2249aa73-cbf1-4a6b-a893-d2242c236c6d-scripts\") pod \"cinder-api-0\" (UID: \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\") " pod="openstack/cinder-api-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.988246 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2249aa73-cbf1-4a6b-a893-d2242c236c6d-config-data\") pod \"cinder-api-0\" (UID: \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\") " pod="openstack/cinder-api-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.997106 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2249aa73-cbf1-4a6b-a893-d2242c236c6d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\") " pod="openstack/cinder-api-0" Dec 03 14:36:22 crc kubenswrapper[4751]: I1203 14:36:22.997526 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2249aa73-cbf1-4a6b-a893-d2242c236c6d-config-data-custom\") pod \"cinder-api-0\" (UID: \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\") " pod="openstack/cinder-api-0" Dec 03 14:36:23 crc kubenswrapper[4751]: I1203 14:36:22.998603 4751 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-ft5lv\" (UniqueName: \"kubernetes.io/projected/b1846d92-9079-4edd-9c41-fdfba9354a2a-kube-api-access-ft5lv\") pod \"dnsmasq-dns-6578955fd5-j8j2p\" (UID: \"b1846d92-9079-4edd-9c41-fdfba9354a2a\") " pod="openstack/dnsmasq-dns-6578955fd5-j8j2p" Dec 03 14:36:23 crc kubenswrapper[4751]: I1203 14:36:22.998688 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs2c8\" (UniqueName: \"kubernetes.io/projected/2249aa73-cbf1-4a6b-a893-d2242c236c6d-kube-api-access-fs2c8\") pod \"cinder-api-0\" (UID: \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\") " pod="openstack/cinder-api-0" Dec 03 14:36:23 crc kubenswrapper[4751]: I1203 14:36:23.099274 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 14:36:23 crc kubenswrapper[4751]: I1203 14:36:23.233238 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-j8j2p" Dec 03 14:36:23 crc kubenswrapper[4751]: I1203 14:36:23.236595 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-64fb975db4-zhnvr"] Dec 03 14:36:23 crc kubenswrapper[4751]: I1203 14:36:23.247790 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 03 14:36:23 crc kubenswrapper[4751]: I1203 14:36:23.250957 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-64fb975db4-zhnvr"] Dec 03 14:36:23 crc kubenswrapper[4751]: I1203 14:36:23.360176 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90ae9890-60ca-4467-910c-a0d7459b189d" path="/var/lib/kubelet/pods/90ae9890-60ca-4467-910c-a0d7459b189d/volumes" Dec 03 14:36:23 crc kubenswrapper[4751]: I1203 14:36:23.381236 4751 generic.go:334] "Generic (PLEG): container finished" podID="6418aa65-88fe-4683-93d6-06b632a0bcd8" containerID="42d19883f61256bbad932362c00ae3e1bce917a7df937f6cf634a6bffdf5f542" exitCode=0 Dec 03 14:36:23 crc kubenswrapper[4751]: I1203 14:36:23.381309 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-qkd8q" event={"ID":"6418aa65-88fe-4683-93d6-06b632a0bcd8","Type":"ContainerDied","Data":"42d19883f61256bbad932362c00ae3e1bce917a7df937f6cf634a6bffdf5f542"} Dec 03 14:36:23 crc kubenswrapper[4751]: I1203 14:36:23.384561 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-n9kjc" event={"ID":"8729b9a9-11fa-432d-bc45-09172fc6bbc7","Type":"ContainerStarted","Data":"4e9e928ddfecf73d9d2071ffd53ed479c47069cbba3c473650e74130322b7ba9"} Dec 03 14:36:23 crc kubenswrapper[4751]: I1203 14:36:23.657307 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 14:36:23 crc kubenswrapper[4751]: I1203 14:36:23.850650 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-qkd8q" Dec 03 14:36:23 crc kubenswrapper[4751]: W1203 14:36:23.888930 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2249aa73_cbf1_4a6b_a893_d2242c236c6d.slice/crio-bc9a85e2ca81acd206b0152f0bfd261d4e136529a541e8f35475d63be0a0d8de WatchSource:0}: Error finding container bc9a85e2ca81acd206b0152f0bfd261d4e136529a541e8f35475d63be0a0d8de: Status 404 returned error can't find the container with id bc9a85e2ca81acd206b0152f0bfd261d4e136529a541e8f35475d63be0a0d8de Dec 03 14:36:23 crc kubenswrapper[4751]: I1203 14:36:23.910473 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 14:36:23 crc kubenswrapper[4751]: I1203 14:36:23.924923 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-j8j2p"] Dec 03 14:36:24 crc kubenswrapper[4751]: I1203 14:36:24.025933 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6418aa65-88fe-4683-93d6-06b632a0bcd8-dns-swift-storage-0\") pod \"6418aa65-88fe-4683-93d6-06b632a0bcd8\" (UID: \"6418aa65-88fe-4683-93d6-06b632a0bcd8\") " Dec 03 14:36:24 crc kubenswrapper[4751]: I1203 14:36:24.026285 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6418aa65-88fe-4683-93d6-06b632a0bcd8-ovsdbserver-sb\") pod \"6418aa65-88fe-4683-93d6-06b632a0bcd8\" (UID: \"6418aa65-88fe-4683-93d6-06b632a0bcd8\") " Dec 03 14:36:24 crc kubenswrapper[4751]: I1203 14:36:24.026448 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6418aa65-88fe-4683-93d6-06b632a0bcd8-config\") pod \"6418aa65-88fe-4683-93d6-06b632a0bcd8\" (UID: \"6418aa65-88fe-4683-93d6-06b632a0bcd8\") " Dec 03 14:36:24 crc 
kubenswrapper[4751]: I1203 14:36:24.026500 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hzqn\" (UniqueName: \"kubernetes.io/projected/6418aa65-88fe-4683-93d6-06b632a0bcd8-kube-api-access-2hzqn\") pod \"6418aa65-88fe-4683-93d6-06b632a0bcd8\" (UID: \"6418aa65-88fe-4683-93d6-06b632a0bcd8\") " Dec 03 14:36:24 crc kubenswrapper[4751]: I1203 14:36:24.026554 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6418aa65-88fe-4683-93d6-06b632a0bcd8-dns-svc\") pod \"6418aa65-88fe-4683-93d6-06b632a0bcd8\" (UID: \"6418aa65-88fe-4683-93d6-06b632a0bcd8\") " Dec 03 14:36:24 crc kubenswrapper[4751]: I1203 14:36:24.026616 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6418aa65-88fe-4683-93d6-06b632a0bcd8-ovsdbserver-nb\") pod \"6418aa65-88fe-4683-93d6-06b632a0bcd8\" (UID: \"6418aa65-88fe-4683-93d6-06b632a0bcd8\") " Dec 03 14:36:24 crc kubenswrapper[4751]: I1203 14:36:24.031012 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6418aa65-88fe-4683-93d6-06b632a0bcd8-kube-api-access-2hzqn" (OuterVolumeSpecName: "kube-api-access-2hzqn") pod "6418aa65-88fe-4683-93d6-06b632a0bcd8" (UID: "6418aa65-88fe-4683-93d6-06b632a0bcd8"). InnerVolumeSpecName "kube-api-access-2hzqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:36:24 crc kubenswrapper[4751]: I1203 14:36:24.082358 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6418aa65-88fe-4683-93d6-06b632a0bcd8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6418aa65-88fe-4683-93d6-06b632a0bcd8" (UID: "6418aa65-88fe-4683-93d6-06b632a0bcd8"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:36:24 crc kubenswrapper[4751]: I1203 14:36:24.106427 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6418aa65-88fe-4683-93d6-06b632a0bcd8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6418aa65-88fe-4683-93d6-06b632a0bcd8" (UID: "6418aa65-88fe-4683-93d6-06b632a0bcd8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:36:24 crc kubenswrapper[4751]: I1203 14:36:24.108579 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6418aa65-88fe-4683-93d6-06b632a0bcd8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6418aa65-88fe-4683-93d6-06b632a0bcd8" (UID: "6418aa65-88fe-4683-93d6-06b632a0bcd8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:36:24 crc kubenswrapper[4751]: I1203 14:36:24.114193 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6418aa65-88fe-4683-93d6-06b632a0bcd8-config" (OuterVolumeSpecName: "config") pod "6418aa65-88fe-4683-93d6-06b632a0bcd8" (UID: "6418aa65-88fe-4683-93d6-06b632a0bcd8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:36:24 crc kubenswrapper[4751]: I1203 14:36:24.120540 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6418aa65-88fe-4683-93d6-06b632a0bcd8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6418aa65-88fe-4683-93d6-06b632a0bcd8" (UID: "6418aa65-88fe-4683-93d6-06b632a0bcd8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:36:24 crc kubenswrapper[4751]: I1203 14:36:24.129698 4751 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6418aa65-88fe-4683-93d6-06b632a0bcd8-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:24 crc kubenswrapper[4751]: I1203 14:36:24.129737 4751 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6418aa65-88fe-4683-93d6-06b632a0bcd8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:24 crc kubenswrapper[4751]: I1203 14:36:24.129753 4751 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6418aa65-88fe-4683-93d6-06b632a0bcd8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:24 crc kubenswrapper[4751]: I1203 14:36:24.129765 4751 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6418aa65-88fe-4683-93d6-06b632a0bcd8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:24 crc kubenswrapper[4751]: I1203 14:36:24.129778 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6418aa65-88fe-4683-93d6-06b632a0bcd8-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:24 crc kubenswrapper[4751]: I1203 14:36:24.129791 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hzqn\" (UniqueName: \"kubernetes.io/projected/6418aa65-88fe-4683-93d6-06b632a0bcd8-kube-api-access-2hzqn\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:24 crc kubenswrapper[4751]: I1203 14:36:24.418228 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-qkd8q" event={"ID":"6418aa65-88fe-4683-93d6-06b632a0bcd8","Type":"ContainerDied","Data":"efa98643aad6f5e00a37e25204b6337be84e1fcb5fed6339eab0425bcdd0f2dc"} Dec 03 14:36:24 crc 
kubenswrapper[4751]: I1203 14:36:24.418630 4751 scope.go:117] "RemoveContainer" containerID="42d19883f61256bbad932362c00ae3e1bce917a7df937f6cf634a6bffdf5f542" Dec 03 14:36:24 crc kubenswrapper[4751]: I1203 14:36:24.418533 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-qkd8q" Dec 03 14:36:24 crc kubenswrapper[4751]: I1203 14:36:24.423833 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f903006b-ff42-40bb-9572-6e907afa1f0e","Type":"ContainerStarted","Data":"32a8cc6022da47b15ab8e064c6bbdb22f839808a21fc154a18c0bbfa202e8e10"} Dec 03 14:36:24 crc kubenswrapper[4751]: I1203 14:36:24.425921 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2249aa73-cbf1-4a6b-a893-d2242c236c6d","Type":"ContainerStarted","Data":"bc9a85e2ca81acd206b0152f0bfd261d4e136529a541e8f35475d63be0a0d8de"} Dec 03 14:36:24 crc kubenswrapper[4751]: I1203 14:36:24.427225 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-j8j2p" event={"ID":"b1846d92-9079-4edd-9c41-fdfba9354a2a","Type":"ContainerStarted","Data":"8405fb6e7868a07f7fdb87d891a13004adeea73c99d6cd47872e0eb329a8eadb"} Dec 03 14:36:24 crc kubenswrapper[4751]: I1203 14:36:24.463307 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-qkd8q"] Dec 03 14:36:24 crc kubenswrapper[4751]: I1203 14:36:24.473020 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-qkd8q"] Dec 03 14:36:24 crc kubenswrapper[4751]: I1203 14:36:24.530156 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dkx6d" Dec 03 14:36:24 crc kubenswrapper[4751]: I1203 14:36:24.531456 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dkx6d" Dec 03 14:36:24 crc 
kubenswrapper[4751]: I1203 14:36:24.595544 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dkx6d" Dec 03 14:36:25 crc kubenswrapper[4751]: I1203 14:36:25.065506 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8svh4"] Dec 03 14:36:25 crc kubenswrapper[4751]: I1203 14:36:25.066030 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8svh4" podUID="ebf27f9b-b0e5-4b2a-af11-a6b6251374bd" containerName="registry-server" containerID="cri-o://2cd6e68e561871d96e82407b49eaa74d41888981969007b2a4ea1772ae66f465" gracePeriod=2 Dec 03 14:36:25 crc kubenswrapper[4751]: I1203 14:36:25.131170 4751 scope.go:117] "RemoveContainer" containerID="9450cc0236c02fe60f3759d3b841fb1b5252adb92ebe0e8a5fc66444fa571938" Dec 03 14:36:25 crc kubenswrapper[4751]: I1203 14:36:25.344984 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6418aa65-88fe-4683-93d6-06b632a0bcd8" path="/var/lib/kubelet/pods/6418aa65-88fe-4683-93d6-06b632a0bcd8/volumes" Dec 03 14:36:25 crc kubenswrapper[4751]: I1203 14:36:25.419437 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 03 14:36:25 crc kubenswrapper[4751]: I1203 14:36:25.534015 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dkx6d" Dec 03 14:36:26 crc kubenswrapper[4751]: I1203 14:36:26.471236 4751 generic.go:334] "Generic (PLEG): container finished" podID="b1846d92-9079-4edd-9c41-fdfba9354a2a" containerID="3425e72cc4c0ecc76b0e13930d437b6b47454c0065b39adaef7423c4f7d766d1" exitCode=0 Dec 03 14:36:26 crc kubenswrapper[4751]: I1203 14:36:26.471404 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-j8j2p" 
event={"ID":"b1846d92-9079-4edd-9c41-fdfba9354a2a","Type":"ContainerDied","Data":"3425e72cc4c0ecc76b0e13930d437b6b47454c0065b39adaef7423c4f7d766d1"} Dec 03 14:36:26 crc kubenswrapper[4751]: I1203 14:36:26.473046 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-n9kjc" event={"ID":"8729b9a9-11fa-432d-bc45-09172fc6bbc7","Type":"ContainerStarted","Data":"401be6a3772e336862f5c1b68804af0800810cfd0fa47538da48aad7379ec0b4"} Dec 03 14:36:26 crc kubenswrapper[4751]: I1203 14:36:26.477997 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2249aa73-cbf1-4a6b-a893-d2242c236c6d","Type":"ContainerStarted","Data":"795b78469f6354cd4a26cbbe12c95fdda79be13ad1553f124e3eb1441894a03b"} Dec 03 14:36:26 crc kubenswrapper[4751]: I1203 14:36:26.493299 4751 generic.go:334] "Generic (PLEG): container finished" podID="ebf27f9b-b0e5-4b2a-af11-a6b6251374bd" containerID="2cd6e68e561871d96e82407b49eaa74d41888981969007b2a4ea1772ae66f465" exitCode=0 Dec 03 14:36:26 crc kubenswrapper[4751]: I1203 14:36:26.493706 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8svh4" event={"ID":"ebf27f9b-b0e5-4b2a-af11-a6b6251374bd","Type":"ContainerDied","Data":"2cd6e68e561871d96e82407b49eaa74d41888981969007b2a4ea1772ae66f465"} Dec 03 14:36:26 crc kubenswrapper[4751]: I1203 14:36:26.525122 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-n9kjc" podStartSLOduration=5.525102706 podStartE2EDuration="5.525102706s" podCreationTimestamp="2025-12-03 14:36:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:36:26.512823331 +0000 UTC m=+1393.501178538" watchObservedRunningTime="2025-12-03 14:36:26.525102706 +0000 UTC m=+1393.513457913" Dec 03 14:36:27 crc kubenswrapper[4751]: I1203 14:36:27.731836 4751 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-55f7778fd-nbkr2" Dec 03 14:36:27 crc kubenswrapper[4751]: I1203 14:36:27.857382 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-55f7778fd-nbkr2" Dec 03 14:36:29 crc kubenswrapper[4751]: I1203 14:36:29.869168 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dkx6d"] Dec 03 14:36:29 crc kubenswrapper[4751]: I1203 14:36:29.869603 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dkx6d" podUID="4241fb0b-9af8-430d-a244-84a16d0679e2" containerName="registry-server" containerID="cri-o://4952abca2b7320f189e31ad0dd6a539cd7c0fca8cba3cfa6e05e2b7d46a9c878" gracePeriod=2 Dec 03 14:36:30 crc kubenswrapper[4751]: I1203 14:36:30.546030 4751 generic.go:334] "Generic (PLEG): container finished" podID="4241fb0b-9af8-430d-a244-84a16d0679e2" containerID="4952abca2b7320f189e31ad0dd6a539cd7c0fca8cba3cfa6e05e2b7d46a9c878" exitCode=0 Dec 03 14:36:30 crc kubenswrapper[4751]: I1203 14:36:30.546106 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dkx6d" event={"ID":"4241fb0b-9af8-430d-a244-84a16d0679e2","Type":"ContainerDied","Data":"4952abca2b7320f189e31ad0dd6a539cd7c0fca8cba3cfa6e05e2b7d46a9c878"} Dec 03 14:36:30 crc kubenswrapper[4751]: I1203 14:36:30.548214 4751 generic.go:334] "Generic (PLEG): container finished" podID="8729b9a9-11fa-432d-bc45-09172fc6bbc7" containerID="401be6a3772e336862f5c1b68804af0800810cfd0fa47538da48aad7379ec0b4" exitCode=0 Dec 03 14:36:30 crc kubenswrapper[4751]: I1203 14:36:30.548240 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-n9kjc" event={"ID":"8729b9a9-11fa-432d-bc45-09172fc6bbc7","Type":"ContainerDied","Data":"401be6a3772e336862f5c1b68804af0800810cfd0fa47538da48aad7379ec0b4"} Dec 03 14:36:30 crc 
kubenswrapper[4751]: I1203 14:36:30.559534 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-b567465d6-ch8tf" Dec 03 14:36:30 crc kubenswrapper[4751]: I1203 14:36:30.576589 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-b567465d6-ch8tf" Dec 03 14:36:30 crc kubenswrapper[4751]: I1203 14:36:30.646410 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-55f7778fd-nbkr2"] Dec 03 14:36:30 crc kubenswrapper[4751]: I1203 14:36:30.646665 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-55f7778fd-nbkr2" podUID="4b931e90-9037-4b52-92c1-9c1d4d3fbba4" containerName="barbican-api-log" containerID="cri-o://ffe2d23574e0cafd247c435b33bc2a3a942d895a052dce1be407cd41facd979e" gracePeriod=30 Dec 03 14:36:30 crc kubenswrapper[4751]: I1203 14:36:30.647002 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-55f7778fd-nbkr2" podUID="4b931e90-9037-4b52-92c1-9c1d4d3fbba4" containerName="barbican-api" containerID="cri-o://e5aaf58d350a58f3a1d8a7ed63551afb0ad32cfe850c379ec9ce8a14810b1cea" gracePeriod=30 Dec 03 14:36:30 crc kubenswrapper[4751]: I1203 14:36:30.955386 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8svh4" Dec 03 14:36:31 crc kubenswrapper[4751]: I1203 14:36:31.120165 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebf27f9b-b0e5-4b2a-af11-a6b6251374bd-catalog-content\") pod \"ebf27f9b-b0e5-4b2a-af11-a6b6251374bd\" (UID: \"ebf27f9b-b0e5-4b2a-af11-a6b6251374bd\") " Dec 03 14:36:31 crc kubenswrapper[4751]: I1203 14:36:31.120279 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebf27f9b-b0e5-4b2a-af11-a6b6251374bd-utilities\") pod \"ebf27f9b-b0e5-4b2a-af11-a6b6251374bd\" (UID: \"ebf27f9b-b0e5-4b2a-af11-a6b6251374bd\") " Dec 03 14:36:31 crc kubenswrapper[4751]: I1203 14:36:31.120423 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tf8dd\" (UniqueName: \"kubernetes.io/projected/ebf27f9b-b0e5-4b2a-af11-a6b6251374bd-kube-api-access-tf8dd\") pod \"ebf27f9b-b0e5-4b2a-af11-a6b6251374bd\" (UID: \"ebf27f9b-b0e5-4b2a-af11-a6b6251374bd\") " Dec 03 14:36:31 crc kubenswrapper[4751]: I1203 14:36:31.125774 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebf27f9b-b0e5-4b2a-af11-a6b6251374bd-kube-api-access-tf8dd" (OuterVolumeSpecName: "kube-api-access-tf8dd") pod "ebf27f9b-b0e5-4b2a-af11-a6b6251374bd" (UID: "ebf27f9b-b0e5-4b2a-af11-a6b6251374bd"). InnerVolumeSpecName "kube-api-access-tf8dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:36:31 crc kubenswrapper[4751]: I1203 14:36:31.125958 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebf27f9b-b0e5-4b2a-af11-a6b6251374bd-utilities" (OuterVolumeSpecName: "utilities") pod "ebf27f9b-b0e5-4b2a-af11-a6b6251374bd" (UID: "ebf27f9b-b0e5-4b2a-af11-a6b6251374bd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:36:31 crc kubenswrapper[4751]: I1203 14:36:31.149098 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebf27f9b-b0e5-4b2a-af11-a6b6251374bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ebf27f9b-b0e5-4b2a-af11-a6b6251374bd" (UID: "ebf27f9b-b0e5-4b2a-af11-a6b6251374bd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:36:31 crc kubenswrapper[4751]: I1203 14:36:31.223230 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebf27f9b-b0e5-4b2a-af11-a6b6251374bd-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:31 crc kubenswrapper[4751]: I1203 14:36:31.223260 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tf8dd\" (UniqueName: \"kubernetes.io/projected/ebf27f9b-b0e5-4b2a-af11-a6b6251374bd-kube-api-access-tf8dd\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:31 crc kubenswrapper[4751]: I1203 14:36:31.223271 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebf27f9b-b0e5-4b2a-af11-a6b6251374bd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:31 crc kubenswrapper[4751]: I1203 14:36:31.562065 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8svh4" event={"ID":"ebf27f9b-b0e5-4b2a-af11-a6b6251374bd","Type":"ContainerDied","Data":"bb25b274ede8ad8f40274483c4d892e45061d70c127a4b14d220849396d7aece"} Dec 03 14:36:31 crc kubenswrapper[4751]: I1203 14:36:31.562615 4751 scope.go:117] "RemoveContainer" containerID="2cd6e68e561871d96e82407b49eaa74d41888981969007b2a4ea1772ae66f465" Dec 03 14:36:31 crc kubenswrapper[4751]: I1203 14:36:31.562583 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8svh4" Dec 03 14:36:31 crc kubenswrapper[4751]: I1203 14:36:31.567764 4751 generic.go:334] "Generic (PLEG): container finished" podID="4b931e90-9037-4b52-92c1-9c1d4d3fbba4" containerID="ffe2d23574e0cafd247c435b33bc2a3a942d895a052dce1be407cd41facd979e" exitCode=143 Dec 03 14:36:31 crc kubenswrapper[4751]: I1203 14:36:31.567889 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55f7778fd-nbkr2" event={"ID":"4b931e90-9037-4b52-92c1-9c1d4d3fbba4","Type":"ContainerDied","Data":"ffe2d23574e0cafd247c435b33bc2a3a942d895a052dce1be407cd41facd979e"} Dec 03 14:36:31 crc kubenswrapper[4751]: I1203 14:36:31.596189 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8svh4"] Dec 03 14:36:31 crc kubenswrapper[4751]: I1203 14:36:31.609978 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8svh4"] Dec 03 14:36:31 crc kubenswrapper[4751]: I1203 14:36:31.764160 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dkx6d" Dec 03 14:36:31 crc kubenswrapper[4751]: I1203 14:36:31.919251 4751 scope.go:117] "RemoveContainer" containerID="a27d8a74d017a8dd04a9d7f039b9182d4448b513b259336e7897cfd618ee8bda" Dec 03 14:36:31 crc kubenswrapper[4751]: I1203 14:36:31.936802 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqssn\" (UniqueName: \"kubernetes.io/projected/4241fb0b-9af8-430d-a244-84a16d0679e2-kube-api-access-zqssn\") pod \"4241fb0b-9af8-430d-a244-84a16d0679e2\" (UID: \"4241fb0b-9af8-430d-a244-84a16d0679e2\") " Dec 03 14:36:31 crc kubenswrapper[4751]: I1203 14:36:31.936954 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4241fb0b-9af8-430d-a244-84a16d0679e2-utilities\") pod \"4241fb0b-9af8-430d-a244-84a16d0679e2\" (UID: \"4241fb0b-9af8-430d-a244-84a16d0679e2\") " Dec 03 14:36:31 crc kubenswrapper[4751]: I1203 14:36:31.937014 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4241fb0b-9af8-430d-a244-84a16d0679e2-catalog-content\") pod \"4241fb0b-9af8-430d-a244-84a16d0679e2\" (UID: \"4241fb0b-9af8-430d-a244-84a16d0679e2\") " Dec 03 14:36:31 crc kubenswrapper[4751]: I1203 14:36:31.937810 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4241fb0b-9af8-430d-a244-84a16d0679e2-utilities" (OuterVolumeSpecName: "utilities") pod "4241fb0b-9af8-430d-a244-84a16d0679e2" (UID: "4241fb0b-9af8-430d-a244-84a16d0679e2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:36:31 crc kubenswrapper[4751]: I1203 14:36:31.942185 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4241fb0b-9af8-430d-a244-84a16d0679e2-kube-api-access-zqssn" (OuterVolumeSpecName: "kube-api-access-zqssn") pod "4241fb0b-9af8-430d-a244-84a16d0679e2" (UID: "4241fb0b-9af8-430d-a244-84a16d0679e2"). InnerVolumeSpecName "kube-api-access-zqssn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:36:31 crc kubenswrapper[4751]: I1203 14:36:31.987104 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4241fb0b-9af8-430d-a244-84a16d0679e2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4241fb0b-9af8-430d-a244-84a16d0679e2" (UID: "4241fb0b-9af8-430d-a244-84a16d0679e2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.038727 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4241fb0b-9af8-430d-a244-84a16d0679e2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.038754 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqssn\" (UniqueName: \"kubernetes.io/projected/4241fb0b-9af8-430d-a244-84a16d0679e2-kube-api-access-zqssn\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.038766 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4241fb0b-9af8-430d-a244-84a16d0679e2-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.040679 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-n9kjc" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.055166 4751 scope.go:117] "RemoveContainer" containerID="d66a018b9aa11ce7875e010546612fdc17aab858f72aa637ba7683da83b8a777" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.143938 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8729b9a9-11fa-432d-bc45-09172fc6bbc7-config-data\") pod \"8729b9a9-11fa-432d-bc45-09172fc6bbc7\" (UID: \"8729b9a9-11fa-432d-bc45-09172fc6bbc7\") " Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.143984 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn6rw\" (UniqueName: \"kubernetes.io/projected/8729b9a9-11fa-432d-bc45-09172fc6bbc7-kube-api-access-nn6rw\") pod \"8729b9a9-11fa-432d-bc45-09172fc6bbc7\" (UID: \"8729b9a9-11fa-432d-bc45-09172fc6bbc7\") " Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.144004 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8729b9a9-11fa-432d-bc45-09172fc6bbc7-combined-ca-bundle\") pod \"8729b9a9-11fa-432d-bc45-09172fc6bbc7\" (UID: \"8729b9a9-11fa-432d-bc45-09172fc6bbc7\") " Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.144245 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8729b9a9-11fa-432d-bc45-09172fc6bbc7-scripts\") pod \"8729b9a9-11fa-432d-bc45-09172fc6bbc7\" (UID: \"8729b9a9-11fa-432d-bc45-09172fc6bbc7\") " Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.144268 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8729b9a9-11fa-432d-bc45-09172fc6bbc7-certs\") pod \"8729b9a9-11fa-432d-bc45-09172fc6bbc7\" (UID: \"8729b9a9-11fa-432d-bc45-09172fc6bbc7\") " Dec 
03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.152220 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8729b9a9-11fa-432d-bc45-09172fc6bbc7-scripts" (OuterVolumeSpecName: "scripts") pod "8729b9a9-11fa-432d-bc45-09172fc6bbc7" (UID: "8729b9a9-11fa-432d-bc45-09172fc6bbc7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.158785 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8729b9a9-11fa-432d-bc45-09172fc6bbc7-kube-api-access-nn6rw" (OuterVolumeSpecName: "kube-api-access-nn6rw") pod "8729b9a9-11fa-432d-bc45-09172fc6bbc7" (UID: "8729b9a9-11fa-432d-bc45-09172fc6bbc7"). InnerVolumeSpecName "kube-api-access-nn6rw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.159444 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8729b9a9-11fa-432d-bc45-09172fc6bbc7-certs" (OuterVolumeSpecName: "certs") pod "8729b9a9-11fa-432d-bc45-09172fc6bbc7" (UID: "8729b9a9-11fa-432d-bc45-09172fc6bbc7"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.188480 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8729b9a9-11fa-432d-bc45-09172fc6bbc7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8729b9a9-11fa-432d-bc45-09172fc6bbc7" (UID: "8729b9a9-11fa-432d-bc45-09172fc6bbc7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.200498 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8729b9a9-11fa-432d-bc45-09172fc6bbc7-config-data" (OuterVolumeSpecName: "config-data") pod "8729b9a9-11fa-432d-bc45-09172fc6bbc7" (UID: "8729b9a9-11fa-432d-bc45-09172fc6bbc7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.246670 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8729b9a9-11fa-432d-bc45-09172fc6bbc7-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.247006 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn6rw\" (UniqueName: \"kubernetes.io/projected/8729b9a9-11fa-432d-bc45-09172fc6bbc7-kube-api-access-nn6rw\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.247021 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8729b9a9-11fa-432d-bc45-09172fc6bbc7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.247031 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8729b9a9-11fa-432d-bc45-09172fc6bbc7-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.247038 4751 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8729b9a9-11fa-432d-bc45-09172fc6bbc7-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.591800 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-j8j2p" 
event={"ID":"b1846d92-9079-4edd-9c41-fdfba9354a2a","Type":"ContainerStarted","Data":"9fa0d9eeba071d185f96913a76b446e1994cdd1842793eeab9e0cf97d3a3c92e"} Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.592451 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-j8j2p" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.597260 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-n9kjc" event={"ID":"8729b9a9-11fa-432d-bc45-09172fc6bbc7","Type":"ContainerDied","Data":"4e9e928ddfecf73d9d2071ffd53ed479c47069cbba3c473650e74130322b7ba9"} Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.597312 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e9e928ddfecf73d9d2071ffd53ed479c47069cbba3c473650e74130322b7ba9" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.597496 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-n9kjc" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.612043 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2249aa73-cbf1-4a6b-a893-d2242c236c6d","Type":"ContainerStarted","Data":"d36384954ae7427b0fcbac8f30e7472232a5581f03dfc3949434110d85dc24df"} Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.612231 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2249aa73-cbf1-4a6b-a893-d2242c236c6d" containerName="cinder-api-log" containerID="cri-o://795b78469f6354cd4a26cbbe12c95fdda79be13ad1553f124e3eb1441894a03b" gracePeriod=30 Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.612768 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.612838 4751 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/cinder-api-0" podUID="2249aa73-cbf1-4a6b-a893-d2242c236c6d" containerName="cinder-api" containerID="cri-o://d36384954ae7427b0fcbac8f30e7472232a5581f03dfc3949434110d85dc24df" gracePeriod=30 Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.622575 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-j8j2p" podStartSLOduration=10.622555983 podStartE2EDuration="10.622555983s" podCreationTimestamp="2025-12-03 14:36:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:36:32.613691362 +0000 UTC m=+1399.602046579" watchObservedRunningTime="2025-12-03 14:36:32.622555983 +0000 UTC m=+1399.610911200" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.627806 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dkx6d" event={"ID":"4241fb0b-9af8-430d-a244-84a16d0679e2","Type":"ContainerDied","Data":"4e86b206bfea513c6a48a03ed016075e965a0cd75c4eb14b3f1a5a4d2e9b71ee"} Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.627851 4751 scope.go:117] "RemoveContainer" containerID="4952abca2b7320f189e31ad0dd6a539cd7c0fca8cba3cfa6e05e2b7d46a9c878" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.627950 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dkx6d" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.649930 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=10.649913369 podStartE2EDuration="10.649913369s" podCreationTimestamp="2025-12-03 14:36:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:36:32.638837127 +0000 UTC m=+1399.627192354" watchObservedRunningTime="2025-12-03 14:36:32.649913369 +0000 UTC m=+1399.638268586" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.672437 4751 generic.go:334] "Generic (PLEG): container finished" podID="3506627c-9636-4af8-ad7f-78db94f0ff11" containerID="a62bce5905ebee14c320597fbf629e2c68e08d3757758c4a8a726b17934b1809" exitCode=0 Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.672780 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmdtc" event={"ID":"3506627c-9636-4af8-ad7f-78db94f0ff11","Type":"ContainerDied","Data":"a62bce5905ebee14c320597fbf629e2c68e08d3757758c4a8a726b17934b1809"} Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.705637 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dkx6d"] Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.710977 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e2515a0-a5d9-4c4a-b686-1d011708c96d","Type":"ContainerStarted","Data":"fe8ff0e26cd525fa86824f7598fcba8f68c5a9fe6e6cb265b596ba547f21cbdd"} Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.711142 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e2515a0-a5d9-4c4a-b686-1d011708c96d" containerName="ceilometer-central-agent" 
containerID="cri-o://76e8fcf8f19c4ff5ea6a00e922893fc2693c320e5ba8e08a5908d288add03d4f" gracePeriod=30 Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.711241 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.711279 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e2515a0-a5d9-4c4a-b686-1d011708c96d" containerName="proxy-httpd" containerID="cri-o://fe8ff0e26cd525fa86824f7598fcba8f68c5a9fe6e6cb265b596ba547f21cbdd" gracePeriod=30 Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.711497 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e2515a0-a5d9-4c4a-b686-1d011708c96d" containerName="ceilometer-notification-agent" containerID="cri-o://02f12f7ea14c7367523ffa5e2fa5161c5c6dd54d4b09459fdbdce4fb40524372" gracePeriod=30 Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.711321 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e2515a0-a5d9-4c4a-b686-1d011708c96d" containerName="sg-core" containerID="cri-o://14f04502b0f362737565c4c435a8553a96ea535887b091113225455159452a02" gracePeriod=30 Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.725775 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dkx6d"] Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.754361 4751 scope.go:117] "RemoveContainer" containerID="b8cf759ce2b56229818e53e6bdf44f78ad50e3cfc5557e1fa89f1d693f981415" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.793381 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Dec 03 14:36:32 crc kubenswrapper[4751]: E1203 14:36:32.794106 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf27f9b-b0e5-4b2a-af11-a6b6251374bd" 
containerName="extract-content" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.794119 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf27f9b-b0e5-4b2a-af11-a6b6251374bd" containerName="extract-content" Dec 03 14:36:32 crc kubenswrapper[4751]: E1203 14:36:32.794134 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4241fb0b-9af8-430d-a244-84a16d0679e2" containerName="extract-utilities" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.794141 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="4241fb0b-9af8-430d-a244-84a16d0679e2" containerName="extract-utilities" Dec 03 14:36:32 crc kubenswrapper[4751]: E1203 14:36:32.794160 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6418aa65-88fe-4683-93d6-06b632a0bcd8" containerName="dnsmasq-dns" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.794166 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="6418aa65-88fe-4683-93d6-06b632a0bcd8" containerName="dnsmasq-dns" Dec 03 14:36:32 crc kubenswrapper[4751]: E1203 14:36:32.794181 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6418aa65-88fe-4683-93d6-06b632a0bcd8" containerName="init" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.794187 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="6418aa65-88fe-4683-93d6-06b632a0bcd8" containerName="init" Dec 03 14:36:32 crc kubenswrapper[4751]: E1203 14:36:32.794201 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf27f9b-b0e5-4b2a-af11-a6b6251374bd" containerName="extract-utilities" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.794208 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf27f9b-b0e5-4b2a-af11-a6b6251374bd" containerName="extract-utilities" Dec 03 14:36:32 crc kubenswrapper[4751]: E1203 14:36:32.794220 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf27f9b-b0e5-4b2a-af11-a6b6251374bd" containerName="registry-server" Dec 
03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.794225 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf27f9b-b0e5-4b2a-af11-a6b6251374bd" containerName="registry-server" Dec 03 14:36:32 crc kubenswrapper[4751]: E1203 14:36:32.794233 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8729b9a9-11fa-432d-bc45-09172fc6bbc7" containerName="cloudkitty-storageinit" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.794240 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="8729b9a9-11fa-432d-bc45-09172fc6bbc7" containerName="cloudkitty-storageinit" Dec 03 14:36:32 crc kubenswrapper[4751]: E1203 14:36:32.794250 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4241fb0b-9af8-430d-a244-84a16d0679e2" containerName="registry-server" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.794256 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="4241fb0b-9af8-430d-a244-84a16d0679e2" containerName="registry-server" Dec 03 14:36:32 crc kubenswrapper[4751]: E1203 14:36:32.794279 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4241fb0b-9af8-430d-a244-84a16d0679e2" containerName="extract-content" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.794285 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="4241fb0b-9af8-430d-a244-84a16d0679e2" containerName="extract-content" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.794602 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="6418aa65-88fe-4683-93d6-06b632a0bcd8" containerName="dnsmasq-dns" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.794626 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="8729b9a9-11fa-432d-bc45-09172fc6bbc7" containerName="cloudkitty-storageinit" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.794637 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebf27f9b-b0e5-4b2a-af11-a6b6251374bd" 
containerName="registry-server" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.794644 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="4241fb0b-9af8-430d-a244-84a16d0679e2" containerName="registry-server" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.795405 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.815569 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.815781 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.815880 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.815898 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-mdxtz" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.816341 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.816458 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.832486 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.369823455 podStartE2EDuration="1m24.832463915s" podCreationTimestamp="2025-12-03 14:35:08 +0000 UTC" firstStartedPulling="2025-12-03 14:35:10.519638941 +0000 UTC m=+1317.507994158" lastFinishedPulling="2025-12-03 14:36:31.982279391 +0000 UTC m=+1398.970634618" observedRunningTime="2025-12-03 14:36:32.787986132 +0000 UTC m=+1399.776341349" watchObservedRunningTime="2025-12-03 
14:36:32.832463915 +0000 UTC m=+1399.820819142" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.923489 4751 scope.go:117] "RemoveContainer" containerID="e1837ff5c8ea3f9b5d5627900a3591bb075ffb12d0114d378fd2795158400fb6" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.972533 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3d433ac-6236-4b80-96f5-9047aed57c8f-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"c3d433ac-6236-4b80-96f5-9047aed57c8f\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.972610 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3d433ac-6236-4b80-96f5-9047aed57c8f-config-data\") pod \"cloudkitty-proc-0\" (UID: \"c3d433ac-6236-4b80-96f5-9047aed57c8f\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.972692 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3d433ac-6236-4b80-96f5-9047aed57c8f-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"c3d433ac-6236-4b80-96f5-9047aed57c8f\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.972718 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3d433ac-6236-4b80-96f5-9047aed57c8f-scripts\") pod \"cloudkitty-proc-0\" (UID: \"c3d433ac-6236-4b80-96f5-9047aed57c8f\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.972770 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x22k\" (UniqueName: 
\"kubernetes.io/projected/c3d433ac-6236-4b80-96f5-9047aed57c8f-kube-api-access-5x22k\") pod \"cloudkitty-proc-0\" (UID: \"c3d433ac-6236-4b80-96f5-9047aed57c8f\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:32 crc kubenswrapper[4751]: I1203 14:36:32.972807 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/c3d433ac-6236-4b80-96f5-9047aed57c8f-certs\") pod \"cloudkitty-proc-0\" (UID: \"c3d433ac-6236-4b80-96f5-9047aed57c8f\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.021469 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-j8j2p"] Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.029736 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58bd69657f-28zsl"] Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.033949 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58bd69657f-28zsl" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.053908 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58bd69657f-28zsl"] Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.077805 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3d433ac-6236-4b80-96f5-9047aed57c8f-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"c3d433ac-6236-4b80-96f5-9047aed57c8f\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.077861 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3d433ac-6236-4b80-96f5-9047aed57c8f-config-data\") pod \"cloudkitty-proc-0\" (UID: \"c3d433ac-6236-4b80-96f5-9047aed57c8f\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.077922 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3d433ac-6236-4b80-96f5-9047aed57c8f-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"c3d433ac-6236-4b80-96f5-9047aed57c8f\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.077943 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3d433ac-6236-4b80-96f5-9047aed57c8f-scripts\") pod \"cloudkitty-proc-0\" (UID: \"c3d433ac-6236-4b80-96f5-9047aed57c8f\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.077985 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x22k\" (UniqueName: \"kubernetes.io/projected/c3d433ac-6236-4b80-96f5-9047aed57c8f-kube-api-access-5x22k\") pod \"cloudkitty-proc-0\" (UID: 
\"c3d433ac-6236-4b80-96f5-9047aed57c8f\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.078001 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/c3d433ac-6236-4b80-96f5-9047aed57c8f-certs\") pod \"cloudkitty-proc-0\" (UID: \"c3d433ac-6236-4b80-96f5-9047aed57c8f\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.090164 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3d433ac-6236-4b80-96f5-9047aed57c8f-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"c3d433ac-6236-4b80-96f5-9047aed57c8f\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.091391 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3d433ac-6236-4b80-96f5-9047aed57c8f-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"c3d433ac-6236-4b80-96f5-9047aed57c8f\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.094428 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3d433ac-6236-4b80-96f5-9047aed57c8f-scripts\") pod \"cloudkitty-proc-0\" (UID: \"c3d433ac-6236-4b80-96f5-9047aed57c8f\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.095041 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/c3d433ac-6236-4b80-96f5-9047aed57c8f-certs\") pod \"cloudkitty-proc-0\" (UID: \"c3d433ac-6236-4b80-96f5-9047aed57c8f\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.102255 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/c3d433ac-6236-4b80-96f5-9047aed57c8f-config-data\") pod \"cloudkitty-proc-0\" (UID: \"c3d433ac-6236-4b80-96f5-9047aed57c8f\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.104670 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x22k\" (UniqueName: \"kubernetes.io/projected/c3d433ac-6236-4b80-96f5-9047aed57c8f-kube-api-access-5x22k\") pod \"cloudkitty-proc-0\" (UID: \"c3d433ac-6236-4b80-96f5-9047aed57c8f\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.166492 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.168821 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.173948 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.179451 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7f79d9f-538b-49fc-858d-169a04c6819e-ovsdbserver-sb\") pod \"dnsmasq-dns-58bd69657f-28zsl\" (UID: \"f7f79d9f-538b-49fc-858d-169a04c6819e\") " pod="openstack/dnsmasq-dns-58bd69657f-28zsl" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.179558 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7f79d9f-538b-49fc-858d-169a04c6819e-config\") pod \"dnsmasq-dns-58bd69657f-28zsl\" (UID: \"f7f79d9f-538b-49fc-858d-169a04c6819e\") " pod="openstack/dnsmasq-dns-58bd69657f-28zsl" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.179587 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7f79d9f-538b-49fc-858d-169a04c6819e-dns-svc\") pod \"dnsmasq-dns-58bd69657f-28zsl\" (UID: \"f7f79d9f-538b-49fc-858d-169a04c6819e\") " pod="openstack/dnsmasq-dns-58bd69657f-28zsl" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.179614 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7f79d9f-538b-49fc-858d-169a04c6819e-ovsdbserver-nb\") pod \"dnsmasq-dns-58bd69657f-28zsl\" (UID: \"f7f79d9f-538b-49fc-858d-169a04c6819e\") " pod="openstack/dnsmasq-dns-58bd69657f-28zsl" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.179633 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7f79d9f-538b-49fc-858d-169a04c6819e-dns-swift-storage-0\") pod \"dnsmasq-dns-58bd69657f-28zsl\" (UID: \"f7f79d9f-538b-49fc-858d-169a04c6819e\") " pod="openstack/dnsmasq-dns-58bd69657f-28zsl" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.179698 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brv2z\" (UniqueName: \"kubernetes.io/projected/f7f79d9f-538b-49fc-858d-169a04c6819e-kube-api-access-brv2z\") pod \"dnsmasq-dns-58bd69657f-28zsl\" (UID: \"f7f79d9f-538b-49fc-858d-169a04c6819e\") " pod="openstack/dnsmasq-dns-58bd69657f-28zsl" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.203735 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.276939 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.281465 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.281578 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brv2z\" (UniqueName: \"kubernetes.io/projected/f7f79d9f-538b-49fc-858d-169a04c6819e-kube-api-access-brv2z\") pod \"dnsmasq-dns-58bd69657f-28zsl\" (UID: \"f7f79d9f-538b-49fc-858d-169a04c6819e\") " pod="openstack/dnsmasq-dns-58bd69657f-28zsl" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.281618 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-certs\") pod \"cloudkitty-api-0\" (UID: \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.281659 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7f79d9f-538b-49fc-858d-169a04c6819e-ovsdbserver-sb\") pod \"dnsmasq-dns-58bd69657f-28zsl\" (UID: \"f7f79d9f-538b-49fc-858d-169a04c6819e\") " pod="openstack/dnsmasq-dns-58bd69657f-28zsl" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.281751 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-logs\") pod \"cloudkitty-api-0\" (UID: \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:33 crc 
kubenswrapper[4751]: I1203 14:36:33.281832 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.282011 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-config-data\") pod \"cloudkitty-api-0\" (UID: \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.282106 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6cr9\" (UniqueName: \"kubernetes.io/projected/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-kube-api-access-x6cr9\") pod \"cloudkitty-api-0\" (UID: \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.282137 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7f79d9f-538b-49fc-858d-169a04c6819e-config\") pod \"dnsmasq-dns-58bd69657f-28zsl\" (UID: \"f7f79d9f-538b-49fc-858d-169a04c6819e\") " pod="openstack/dnsmasq-dns-58bd69657f-28zsl" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.282186 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7f79d9f-538b-49fc-858d-169a04c6819e-dns-svc\") pod \"dnsmasq-dns-58bd69657f-28zsl\" (UID: \"f7f79d9f-538b-49fc-858d-169a04c6819e\") " pod="openstack/dnsmasq-dns-58bd69657f-28zsl" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.282243 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7f79d9f-538b-49fc-858d-169a04c6819e-ovsdbserver-nb\") pod \"dnsmasq-dns-58bd69657f-28zsl\" (UID: \"f7f79d9f-538b-49fc-858d-169a04c6819e\") " pod="openstack/dnsmasq-dns-58bd69657f-28zsl" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.282273 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-scripts\") pod \"cloudkitty-api-0\" (UID: \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.282303 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7f79d9f-538b-49fc-858d-169a04c6819e-dns-swift-storage-0\") pod \"dnsmasq-dns-58bd69657f-28zsl\" (UID: \"f7f79d9f-538b-49fc-858d-169a04c6819e\") " pod="openstack/dnsmasq-dns-58bd69657f-28zsl" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.282531 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7f79d9f-538b-49fc-858d-169a04c6819e-ovsdbserver-sb\") pod \"dnsmasq-dns-58bd69657f-28zsl\" (UID: \"f7f79d9f-538b-49fc-858d-169a04c6819e\") " pod="openstack/dnsmasq-dns-58bd69657f-28zsl" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.282965 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7f79d9f-538b-49fc-858d-169a04c6819e-dns-svc\") pod \"dnsmasq-dns-58bd69657f-28zsl\" (UID: \"f7f79d9f-538b-49fc-858d-169a04c6819e\") " pod="openstack/dnsmasq-dns-58bd69657f-28zsl" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.283026 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f7f79d9f-538b-49fc-858d-169a04c6819e-config\") pod \"dnsmasq-dns-58bd69657f-28zsl\" (UID: \"f7f79d9f-538b-49fc-858d-169a04c6819e\") " pod="openstack/dnsmasq-dns-58bd69657f-28zsl" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.283151 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7f79d9f-538b-49fc-858d-169a04c6819e-ovsdbserver-nb\") pod \"dnsmasq-dns-58bd69657f-28zsl\" (UID: \"f7f79d9f-538b-49fc-858d-169a04c6819e\") " pod="openstack/dnsmasq-dns-58bd69657f-28zsl" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.283271 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7f79d9f-538b-49fc-858d-169a04c6819e-dns-swift-storage-0\") pod \"dnsmasq-dns-58bd69657f-28zsl\" (UID: \"f7f79d9f-538b-49fc-858d-169a04c6819e\") " pod="openstack/dnsmasq-dns-58bd69657f-28zsl" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.300516 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brv2z\" (UniqueName: \"kubernetes.io/projected/f7f79d9f-538b-49fc-858d-169a04c6819e-kube-api-access-brv2z\") pod \"dnsmasq-dns-58bd69657f-28zsl\" (UID: \"f7f79d9f-538b-49fc-858d-169a04c6819e\") " pod="openstack/dnsmasq-dns-58bd69657f-28zsl" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.341173 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4241fb0b-9af8-430d-a244-84a16d0679e2" path="/var/lib/kubelet/pods/4241fb0b-9af8-430d-a244-84a16d0679e2/volumes" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.344983 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebf27f9b-b0e5-4b2a-af11-a6b6251374bd" path="/var/lib/kubelet/pods/ebf27f9b-b0e5-4b2a-af11-a6b6251374bd/volumes" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.358570 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58bd69657f-28zsl" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.384387 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-scripts\") pod \"cloudkitty-api-0\" (UID: \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.386239 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.386673 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-certs\") pod \"cloudkitty-api-0\" (UID: \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.386787 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-logs\") pod \"cloudkitty-api-0\" (UID: \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.386854 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.386940 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-config-data\") pod \"cloudkitty-api-0\" (UID: \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.386990 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6cr9\" (UniqueName: \"kubernetes.io/projected/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-kube-api-access-x6cr9\") pod \"cloudkitty-api-0\" (UID: \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.388119 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-logs\") pod \"cloudkitty-api-0\" (UID: \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.390884 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.394008 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-scripts\") pod \"cloudkitty-api-0\" (UID: \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.396062 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\") " pod="openstack/cloudkitty-api-0" Dec 03 
14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.398819 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-config-data\") pod \"cloudkitty-api-0\" (UID: \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.399862 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-certs\") pod \"cloudkitty-api-0\" (UID: \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.403969 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6cr9\" (UniqueName: \"kubernetes.io/projected/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-kube-api-access-x6cr9\") pod \"cloudkitty-api-0\" (UID: \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.518800 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.792049 4751 generic.go:334] "Generic (PLEG): container finished" podID="7e2515a0-a5d9-4c4a-b686-1d011708c96d" containerID="fe8ff0e26cd525fa86824f7598fcba8f68c5a9fe6e6cb265b596ba547f21cbdd" exitCode=0 Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.792282 4751 generic.go:334] "Generic (PLEG): container finished" podID="7e2515a0-a5d9-4c4a-b686-1d011708c96d" containerID="14f04502b0f362737565c4c435a8553a96ea535887b091113225455159452a02" exitCode=2 Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.792290 4751 generic.go:334] "Generic (PLEG): container finished" podID="7e2515a0-a5d9-4c4a-b686-1d011708c96d" containerID="02f12f7ea14c7367523ffa5e2fa5161c5c6dd54d4b09459fdbdce4fb40524372" exitCode=0 Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.792296 4751 generic.go:334] "Generic (PLEG): container finished" podID="7e2515a0-a5d9-4c4a-b686-1d011708c96d" containerID="76e8fcf8f19c4ff5ea6a00e922893fc2693c320e5ba8e08a5908d288add03d4f" exitCode=0 Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.792363 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e2515a0-a5d9-4c4a-b686-1d011708c96d","Type":"ContainerDied","Data":"fe8ff0e26cd525fa86824f7598fcba8f68c5a9fe6e6cb265b596ba547f21cbdd"} Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.792393 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e2515a0-a5d9-4c4a-b686-1d011708c96d","Type":"ContainerDied","Data":"14f04502b0f362737565c4c435a8553a96ea535887b091113225455159452a02"} Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.792405 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e2515a0-a5d9-4c4a-b686-1d011708c96d","Type":"ContainerDied","Data":"02f12f7ea14c7367523ffa5e2fa5161c5c6dd54d4b09459fdbdce4fb40524372"} Dec 03 14:36:33 crc 
kubenswrapper[4751]: I1203 14:36:33.792412 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e2515a0-a5d9-4c4a-b686-1d011708c96d","Type":"ContainerDied","Data":"76e8fcf8f19c4ff5ea6a00e922893fc2693c320e5ba8e08a5908d288add03d4f"} Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.796918 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f903006b-ff42-40bb-9572-6e907afa1f0e","Type":"ContainerStarted","Data":"d0dff38102c7a63859cc8440486910495f137186eb637a0d2a2a8a607f5bc292"} Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.810782 4751 generic.go:334] "Generic (PLEG): container finished" podID="2249aa73-cbf1-4a6b-a893-d2242c236c6d" containerID="795b78469f6354cd4a26cbbe12c95fdda79be13ad1553f124e3eb1441894a03b" exitCode=143 Dec 03 14:36:33 crc kubenswrapper[4751]: I1203 14:36:33.810858 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2249aa73-cbf1-4a6b-a893-d2242c236c6d","Type":"ContainerDied","Data":"795b78469f6354cd4a26cbbe12c95fdda79be13ad1553f124e3eb1441894a03b"} Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.080012 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-55f7778fd-nbkr2" podUID="4b931e90-9037-4b52-92c1-9c1d4d3fbba4" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.179:9311/healthcheck\": read tcp 10.217.0.2:43334->10.217.0.179:9311: read: connection reset by peer" Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.080154 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-55f7778fd-nbkr2" podUID="4b931e90-9037-4b52-92c1-9c1d4d3fbba4" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.179:9311/healthcheck\": read tcp 10.217.0.2:43336->10.217.0.179:9311: read: connection reset by peer" Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.254540 4751 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Dec 03 14:36:34 crc kubenswrapper[4751]: W1203 14:36:34.299308 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3d433ac_6236_4b80_96f5_9047aed57c8f.slice/crio-30f9ff8dceb8dab824c95ceb9995dde17ef13a972500cbeb61e87695f62007e2 WatchSource:0}: Error finding container 30f9ff8dceb8dab824c95ceb9995dde17ef13a972500cbeb61e87695f62007e2: Status 404 returned error can't find the container with id 30f9ff8dceb8dab824c95ceb9995dde17ef13a972500cbeb61e87695f62007e2 Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.404913 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.407090 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.525212 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e2515a0-a5d9-4c4a-b686-1d011708c96d-scripts\") pod \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\" (UID: \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\") " Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.525394 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e2515a0-a5d9-4c4a-b686-1d011708c96d-run-httpd\") pod \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\" (UID: \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\") " Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.525432 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e2515a0-a5d9-4c4a-b686-1d011708c96d-sg-core-conf-yaml\") pod \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\" (UID: \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\") " Dec 03 
14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.525554 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e2515a0-a5d9-4c4a-b686-1d011708c96d-combined-ca-bundle\") pod \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\" (UID: \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\") " Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.525601 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e2515a0-a5d9-4c4a-b686-1d011708c96d-log-httpd\") pod \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\" (UID: \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\") " Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.525636 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckrsx\" (UniqueName: \"kubernetes.io/projected/7e2515a0-a5d9-4c4a-b686-1d011708c96d-kube-api-access-ckrsx\") pod \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\" (UID: \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\") " Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.525735 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e2515a0-a5d9-4c4a-b686-1d011708c96d-config-data\") pod \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\" (UID: \"7e2515a0-a5d9-4c4a-b686-1d011708c96d\") " Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.527716 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e2515a0-a5d9-4c4a-b686-1d011708c96d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7e2515a0-a5d9-4c4a-b686-1d011708c96d" (UID: "7e2515a0-a5d9-4c4a-b686-1d011708c96d"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.528035 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e2515a0-a5d9-4c4a-b686-1d011708c96d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7e2515a0-a5d9-4c4a-b686-1d011708c96d" (UID: "7e2515a0-a5d9-4c4a-b686-1d011708c96d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.553480 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e2515a0-a5d9-4c4a-b686-1d011708c96d-kube-api-access-ckrsx" (OuterVolumeSpecName: "kube-api-access-ckrsx") pod "7e2515a0-a5d9-4c4a-b686-1d011708c96d" (UID: "7e2515a0-a5d9-4c4a-b686-1d011708c96d"). InnerVolumeSpecName "kube-api-access-ckrsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.580122 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e2515a0-a5d9-4c4a-b686-1d011708c96d-scripts" (OuterVolumeSpecName: "scripts") pod "7e2515a0-a5d9-4c4a-b686-1d011708c96d" (UID: "7e2515a0-a5d9-4c4a-b686-1d011708c96d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.585171 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e2515a0-a5d9-4c4a-b686-1d011708c96d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7e2515a0-a5d9-4c4a-b686-1d011708c96d" (UID: "7e2515a0-a5d9-4c4a-b686-1d011708c96d"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.607304 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58bd69657f-28zsl"] Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.629524 4751 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e2515a0-a5d9-4c4a-b686-1d011708c96d-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.629548 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckrsx\" (UniqueName: \"kubernetes.io/projected/7e2515a0-a5d9-4c4a-b686-1d011708c96d-kube-api-access-ckrsx\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.629562 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e2515a0-a5d9-4c4a-b686-1d011708c96d-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.629573 4751 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e2515a0-a5d9-4c4a-b686-1d011708c96d-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.629598 4751 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e2515a0-a5d9-4c4a-b686-1d011708c96d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:34 crc kubenswrapper[4751]: W1203 14:36:34.634489 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7f79d9f_538b_49fc_858d_169a04c6819e.slice/crio-d171b742d978f32332d8ef45ac8390aff21d3d44174bd25f46b897ab5b1ee7c7 WatchSource:0}: Error finding container d171b742d978f32332d8ef45ac8390aff21d3d44174bd25f46b897ab5b1ee7c7: Status 404 returned error can't find the 
container with id d171b742d978f32332d8ef45ac8390aff21d3d44174bd25f46b897ab5b1ee7c7 Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.829605 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e2515a0-a5d9-4c4a-b686-1d011708c96d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e2515a0-a5d9-4c4a-b686-1d011708c96d" (UID: "7e2515a0-a5d9-4c4a-b686-1d011708c96d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.838665 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-55f7778fd-nbkr2" Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.853492 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e2515a0-a5d9-4c4a-b686-1d011708c96d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.872880 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"e5525ea2-aea2-4432-9ba6-db64b2b2f3da","Type":"ContainerStarted","Data":"005633cfa5919c5f9ac3650de6e2ce8a06eac7b5cd0cecfb58ae492a0c366e01"} Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.872927 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"e5525ea2-aea2-4432-9ba6-db64b2b2f3da","Type":"ContainerStarted","Data":"647721f8ec01ae8dc8457ce7ef851e0fc21d0e23405d0a7acfefab35c76085ef"} Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.878088 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"c3d433ac-6236-4b80-96f5-9047aed57c8f","Type":"ContainerStarted","Data":"30f9ff8dceb8dab824c95ceb9995dde17ef13a972500cbeb61e87695f62007e2"} Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.887087 4751 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-58bd69657f-28zsl" event={"ID":"f7f79d9f-538b-49fc-858d-169a04c6819e","Type":"ContainerStarted","Data":"d171b742d978f32332d8ef45ac8390aff21d3d44174bd25f46b897ab5b1ee7c7"} Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.895573 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e2515a0-a5d9-4c4a-b686-1d011708c96d-config-data" (OuterVolumeSpecName: "config-data") pod "7e2515a0-a5d9-4c4a-b686-1d011708c96d" (UID: "7e2515a0-a5d9-4c4a-b686-1d011708c96d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.897583 4751 generic.go:334] "Generic (PLEG): container finished" podID="4b931e90-9037-4b52-92c1-9c1d4d3fbba4" containerID="e5aaf58d350a58f3a1d8a7ed63551afb0ad32cfe850c379ec9ce8a14810b1cea" exitCode=0 Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.897654 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55f7778fd-nbkr2" event={"ID":"4b931e90-9037-4b52-92c1-9c1d4d3fbba4","Type":"ContainerDied","Data":"e5aaf58d350a58f3a1d8a7ed63551afb0ad32cfe850c379ec9ce8a14810b1cea"} Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.897682 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55f7778fd-nbkr2" event={"ID":"4b931e90-9037-4b52-92c1-9c1d4d3fbba4","Type":"ContainerDied","Data":"d0386b5b1cddd1418a734b8b721557b2e285618925d375a6474523cdecf63d70"} Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.897704 4751 scope.go:117] "RemoveContainer" containerID="e5aaf58d350a58f3a1d8a7ed63551afb0ad32cfe850c379ec9ce8a14810b1cea" Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.898693 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-55f7778fd-nbkr2" Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.906725 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f903006b-ff42-40bb-9572-6e907afa1f0e","Type":"ContainerStarted","Data":"e70bdc7d7024497428956ee659ef11f0380f37bdc30818352b8694fdc59494fb"} Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.937533 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmdtc" event={"ID":"3506627c-9636-4af8-ad7f-78db94f0ff11","Type":"ContainerStarted","Data":"a0c455f1e5001615f3ae562adf2c50dfedc85450fca2d21952f52a269a4af59c"} Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.942448 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.609360163 podStartE2EDuration="12.942426835s" podCreationTimestamp="2025-12-03 14:36:22 +0000 UTC" firstStartedPulling="2025-12-03 14:36:23.650685689 +0000 UTC m=+1390.639040906" lastFinishedPulling="2025-12-03 14:36:31.983752361 +0000 UTC m=+1398.972107578" observedRunningTime="2025-12-03 14:36:34.938687123 +0000 UTC m=+1401.927042340" watchObservedRunningTime="2025-12-03 14:36:34.942426835 +0000 UTC m=+1401.930782052" Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.943297 4751 scope.go:117] "RemoveContainer" containerID="ffe2d23574e0cafd247c435b33bc2a3a942d895a052dce1be407cd41facd979e" Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.954772 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b931e90-9037-4b52-92c1-9c1d4d3fbba4-config-data\") pod \"4b931e90-9037-4b52-92c1-9c1d4d3fbba4\" (UID: \"4b931e90-9037-4b52-92c1-9c1d4d3fbba4\") " Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.954839 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-jnskl\" (UniqueName: \"kubernetes.io/projected/4b931e90-9037-4b52-92c1-9c1d4d3fbba4-kube-api-access-jnskl\") pod \"4b931e90-9037-4b52-92c1-9c1d4d3fbba4\" (UID: \"4b931e90-9037-4b52-92c1-9c1d4d3fbba4\") " Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.955010 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b931e90-9037-4b52-92c1-9c1d4d3fbba4-config-data-custom\") pod \"4b931e90-9037-4b52-92c1-9c1d4d3fbba4\" (UID: \"4b931e90-9037-4b52-92c1-9c1d4d3fbba4\") " Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.955828 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b931e90-9037-4b52-92c1-9c1d4d3fbba4-combined-ca-bundle\") pod \"4b931e90-9037-4b52-92c1-9c1d4d3fbba4\" (UID: \"4b931e90-9037-4b52-92c1-9c1d4d3fbba4\") " Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.956155 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b931e90-9037-4b52-92c1-9c1d4d3fbba4-logs\") pod \"4b931e90-9037-4b52-92c1-9c1d4d3fbba4\" (UID: \"4b931e90-9037-4b52-92c1-9c1d4d3fbba4\") " Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.957430 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e2515a0-a5d9-4c4a-b686-1d011708c96d-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.959719 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b931e90-9037-4b52-92c1-9c1d4d3fbba4-logs" (OuterVolumeSpecName: "logs") pod "4b931e90-9037-4b52-92c1-9c1d4d3fbba4" (UID: "4b931e90-9037-4b52-92c1-9c1d4d3fbba4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.963478 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b931e90-9037-4b52-92c1-9c1d4d3fbba4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4b931e90-9037-4b52-92c1-9c1d4d3fbba4" (UID: "4b931e90-9037-4b52-92c1-9c1d4d3fbba4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.970433 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b931e90-9037-4b52-92c1-9c1d4d3fbba4-kube-api-access-jnskl" (OuterVolumeSpecName: "kube-api-access-jnskl") pod "4b931e90-9037-4b52-92c1-9c1d4d3fbba4" (UID: "4b931e90-9037-4b52-92c1-9c1d4d3fbba4"). InnerVolumeSpecName "kube-api-access-jnskl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.988594 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-j8j2p" podUID="b1846d92-9079-4edd-9c41-fdfba9354a2a" containerName="dnsmasq-dns" containerID="cri-o://9fa0d9eeba071d185f96913a76b446e1994cdd1842793eeab9e0cf97d3a3c92e" gracePeriod=10 Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.988874 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:36:34 crc kubenswrapper[4751]: I1203 14:36:34.991586 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e2515a0-a5d9-4c4a-b686-1d011708c96d","Type":"ContainerDied","Data":"fbdc647375981f62a924f17bba262778186a81edfed0b16f453ca52d691a6bb9"} Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.009577 4751 scope.go:117] "RemoveContainer" containerID="e5aaf58d350a58f3a1d8a7ed63551afb0ad32cfe850c379ec9ce8a14810b1cea" Dec 03 14:36:35 crc kubenswrapper[4751]: E1203 14:36:35.010001 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5aaf58d350a58f3a1d8a7ed63551afb0ad32cfe850c379ec9ce8a14810b1cea\": container with ID starting with e5aaf58d350a58f3a1d8a7ed63551afb0ad32cfe850c379ec9ce8a14810b1cea not found: ID does not exist" containerID="e5aaf58d350a58f3a1d8a7ed63551afb0ad32cfe850c379ec9ce8a14810b1cea" Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.010052 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5aaf58d350a58f3a1d8a7ed63551afb0ad32cfe850c379ec9ce8a14810b1cea"} err="failed to get container status \"e5aaf58d350a58f3a1d8a7ed63551afb0ad32cfe850c379ec9ce8a14810b1cea\": rpc error: code = NotFound desc = could not find container \"e5aaf58d350a58f3a1d8a7ed63551afb0ad32cfe850c379ec9ce8a14810b1cea\": container with ID starting with e5aaf58d350a58f3a1d8a7ed63551afb0ad32cfe850c379ec9ce8a14810b1cea not found: ID does not exist" Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.010074 4751 scope.go:117] "RemoveContainer" containerID="ffe2d23574e0cafd247c435b33bc2a3a942d895a052dce1be407cd41facd979e" Dec 03 14:36:35 crc kubenswrapper[4751]: E1203 14:36:35.010610 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ffe2d23574e0cafd247c435b33bc2a3a942d895a052dce1be407cd41facd979e\": container with ID starting with ffe2d23574e0cafd247c435b33bc2a3a942d895a052dce1be407cd41facd979e not found: ID does not exist" containerID="ffe2d23574e0cafd247c435b33bc2a3a942d895a052dce1be407cd41facd979e" Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.010633 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffe2d23574e0cafd247c435b33bc2a3a942d895a052dce1be407cd41facd979e"} err="failed to get container status \"ffe2d23574e0cafd247c435b33bc2a3a942d895a052dce1be407cd41facd979e\": rpc error: code = NotFound desc = could not find container \"ffe2d23574e0cafd247c435b33bc2a3a942d895a052dce1be407cd41facd979e\": container with ID starting with ffe2d23574e0cafd247c435b33bc2a3a942d895a052dce1be407cd41facd979e not found: ID does not exist" Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.010647 4751 scope.go:117] "RemoveContainer" containerID="fe8ff0e26cd525fa86824f7598fcba8f68c5a9fe6e6cb265b596ba547f21cbdd" Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.023772 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b931e90-9037-4b52-92c1-9c1d4d3fbba4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b931e90-9037-4b52-92c1-9c1d4d3fbba4" (UID: "4b931e90-9037-4b52-92c1-9c1d4d3fbba4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.056118 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b931e90-9037-4b52-92c1-9c1d4d3fbba4-config-data" (OuterVolumeSpecName: "config-data") pod "4b931e90-9037-4b52-92c1-9c1d4d3fbba4" (UID: "4b931e90-9037-4b52-92c1-9c1d4d3fbba4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.058817 4751 scope.go:117] "RemoveContainer" containerID="14f04502b0f362737565c4c435a8553a96ea535887b091113225455159452a02" Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.059066 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b931e90-9037-4b52-92c1-9c1d4d3fbba4-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.059103 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnskl\" (UniqueName: \"kubernetes.io/projected/4b931e90-9037-4b52-92c1-9c1d4d3fbba4-kube-api-access-jnskl\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.059116 4751 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b931e90-9037-4b52-92c1-9c1d4d3fbba4-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.059127 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b931e90-9037-4b52-92c1-9c1d4d3fbba4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.059135 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b931e90-9037-4b52-92c1-9c1d4d3fbba4-logs\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.119087 4751 scope.go:117] "RemoveContainer" containerID="02f12f7ea14c7367523ffa5e2fa5161c5c6dd54d4b09459fdbdce4fb40524372" Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.126860 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tmdtc" podStartSLOduration=3.853575558 
podStartE2EDuration="15.126836572s" podCreationTimestamp="2025-12-03 14:36:20 +0000 UTC" firstStartedPulling="2025-12-03 14:36:22.301999347 +0000 UTC m=+1389.290354564" lastFinishedPulling="2025-12-03 14:36:33.575260361 +0000 UTC m=+1400.563615578" observedRunningTime="2025-12-03 14:36:34.982353654 +0000 UTC m=+1401.970708901" watchObservedRunningTime="2025-12-03 14:36:35.126836572 +0000 UTC m=+1402.115191789" Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.134703 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.153061 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.171107 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:36:35 crc kubenswrapper[4751]: E1203 14:36:35.171594 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e2515a0-a5d9-4c4a-b686-1d011708c96d" containerName="ceilometer-notification-agent" Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.171607 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e2515a0-a5d9-4c4a-b686-1d011708c96d" containerName="ceilometer-notification-agent" Dec 03 14:36:35 crc kubenswrapper[4751]: E1203 14:36:35.171616 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b931e90-9037-4b52-92c1-9c1d4d3fbba4" containerName="barbican-api" Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.171622 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b931e90-9037-4b52-92c1-9c1d4d3fbba4" containerName="barbican-api" Dec 03 14:36:35 crc kubenswrapper[4751]: E1203 14:36:35.171634 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e2515a0-a5d9-4c4a-b686-1d011708c96d" containerName="proxy-httpd" Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.171640 4751 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7e2515a0-a5d9-4c4a-b686-1d011708c96d" containerName="proxy-httpd"
Dec 03 14:36:35 crc kubenswrapper[4751]: E1203 14:36:35.171663 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e2515a0-a5d9-4c4a-b686-1d011708c96d" containerName="sg-core"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.171670 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e2515a0-a5d9-4c4a-b686-1d011708c96d" containerName="sg-core"
Dec 03 14:36:35 crc kubenswrapper[4751]: E1203 14:36:35.171681 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b931e90-9037-4b52-92c1-9c1d4d3fbba4" containerName="barbican-api-log"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.171687 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b931e90-9037-4b52-92c1-9c1d4d3fbba4" containerName="barbican-api-log"
Dec 03 14:36:35 crc kubenswrapper[4751]: E1203 14:36:35.171699 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e2515a0-a5d9-4c4a-b686-1d011708c96d" containerName="ceilometer-central-agent"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.171705 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e2515a0-a5d9-4c4a-b686-1d011708c96d" containerName="ceilometer-central-agent"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.171879 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b931e90-9037-4b52-92c1-9c1d4d3fbba4" containerName="barbican-api-log"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.171895 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e2515a0-a5d9-4c4a-b686-1d011708c96d" containerName="ceilometer-notification-agent"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.171907 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e2515a0-a5d9-4c4a-b686-1d011708c96d" containerName="sg-core"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.171919 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b931e90-9037-4b52-92c1-9c1d4d3fbba4" containerName="barbican-api"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.171932 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e2515a0-a5d9-4c4a-b686-1d011708c96d" containerName="ceilometer-central-agent"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.171947 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e2515a0-a5d9-4c4a-b686-1d011708c96d" containerName="proxy-httpd"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.173955 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.186727 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.187593 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.187687 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.199560 4751 scope.go:117] "RemoveContainer" containerID="76e8fcf8f19c4ff5ea6a00e922893fc2693c320e5ba8e08a5908d288add03d4f"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.271479 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0cff45f-8db2-44d7-9723-677508c4c442-log-httpd\") pod \"ceilometer-0\" (UID: \"b0cff45f-8db2-44d7-9723-677508c4c442\") " pod="openstack/ceilometer-0"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.271804 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0cff45f-8db2-44d7-9723-677508c4c442-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b0cff45f-8db2-44d7-9723-677508c4c442\") " pod="openstack/ceilometer-0"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.271840 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0cff45f-8db2-44d7-9723-677508c4c442-config-data\") pod \"ceilometer-0\" (UID: \"b0cff45f-8db2-44d7-9723-677508c4c442\") " pod="openstack/ceilometer-0"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.271926 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0cff45f-8db2-44d7-9723-677508c4c442-run-httpd\") pod \"ceilometer-0\" (UID: \"b0cff45f-8db2-44d7-9723-677508c4c442\") " pod="openstack/ceilometer-0"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.271948 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0cff45f-8db2-44d7-9723-677508c4c442-scripts\") pod \"ceilometer-0\" (UID: \"b0cff45f-8db2-44d7-9723-677508c4c442\") " pod="openstack/ceilometer-0"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.271969 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0cff45f-8db2-44d7-9723-677508c4c442-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b0cff45f-8db2-44d7-9723-677508c4c442\") " pod="openstack/ceilometer-0"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.271996 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96vcf\" (UniqueName: \"kubernetes.io/projected/b0cff45f-8db2-44d7-9723-677508c4c442-kube-api-access-96vcf\") pod \"ceilometer-0\" (UID: \"b0cff45f-8db2-44d7-9723-677508c4c442\") " pod="openstack/ceilometer-0"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.328031 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e2515a0-a5d9-4c4a-b686-1d011708c96d" path="/var/lib/kubelet/pods/7e2515a0-a5d9-4c4a-b686-1d011708c96d/volumes"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.375315 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0cff45f-8db2-44d7-9723-677508c4c442-log-httpd\") pod \"ceilometer-0\" (UID: \"b0cff45f-8db2-44d7-9723-677508c4c442\") " pod="openstack/ceilometer-0"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.375417 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0cff45f-8db2-44d7-9723-677508c4c442-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b0cff45f-8db2-44d7-9723-677508c4c442\") " pod="openstack/ceilometer-0"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.375457 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0cff45f-8db2-44d7-9723-677508c4c442-config-data\") pod \"ceilometer-0\" (UID: \"b0cff45f-8db2-44d7-9723-677508c4c442\") " pod="openstack/ceilometer-0"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.375479 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0cff45f-8db2-44d7-9723-677508c4c442-run-httpd\") pod \"ceilometer-0\" (UID: \"b0cff45f-8db2-44d7-9723-677508c4c442\") " pod="openstack/ceilometer-0"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.375499 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0cff45f-8db2-44d7-9723-677508c4c442-scripts\") pod \"ceilometer-0\" (UID: \"b0cff45f-8db2-44d7-9723-677508c4c442\") " pod="openstack/ceilometer-0"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.375535 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0cff45f-8db2-44d7-9723-677508c4c442-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b0cff45f-8db2-44d7-9723-677508c4c442\") " pod="openstack/ceilometer-0"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.375577 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96vcf\" (UniqueName: \"kubernetes.io/projected/b0cff45f-8db2-44d7-9723-677508c4c442-kube-api-access-96vcf\") pod \"ceilometer-0\" (UID: \"b0cff45f-8db2-44d7-9723-677508c4c442\") " pod="openstack/ceilometer-0"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.379173 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0cff45f-8db2-44d7-9723-677508c4c442-log-httpd\") pod \"ceilometer-0\" (UID: \"b0cff45f-8db2-44d7-9723-677508c4c442\") " pod="openstack/ceilometer-0"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.379236 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0cff45f-8db2-44d7-9723-677508c4c442-run-httpd\") pod \"ceilometer-0\" (UID: \"b0cff45f-8db2-44d7-9723-677508c4c442\") " pod="openstack/ceilometer-0"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.384462 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0cff45f-8db2-44d7-9723-677508c4c442-scripts\") pod \"ceilometer-0\" (UID: \"b0cff45f-8db2-44d7-9723-677508c4c442\") " pod="openstack/ceilometer-0"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.386018 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0cff45f-8db2-44d7-9723-677508c4c442-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b0cff45f-8db2-44d7-9723-677508c4c442\") " pod="openstack/ceilometer-0"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.387649 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0cff45f-8db2-44d7-9723-677508c4c442-config-data\") pod \"ceilometer-0\" (UID: \"b0cff45f-8db2-44d7-9723-677508c4c442\") " pod="openstack/ceilometer-0"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.387828 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0cff45f-8db2-44d7-9723-677508c4c442-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b0cff45f-8db2-44d7-9723-677508c4c442\") " pod="openstack/ceilometer-0"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.398147 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96vcf\" (UniqueName: \"kubernetes.io/projected/b0cff45f-8db2-44d7-9723-677508c4c442-kube-api-access-96vcf\") pod \"ceilometer-0\" (UID: \"b0cff45f-8db2-44d7-9723-677508c4c442\") " pod="openstack/ceilometer-0"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.508158 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.723399 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-j8j2p"
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.784920 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ft5lv\" (UniqueName: \"kubernetes.io/projected/b1846d92-9079-4edd-9c41-fdfba9354a2a-kube-api-access-ft5lv\") pod \"b1846d92-9079-4edd-9c41-fdfba9354a2a\" (UID: \"b1846d92-9079-4edd-9c41-fdfba9354a2a\") "
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.785152 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1846d92-9079-4edd-9c41-fdfba9354a2a-dns-svc\") pod \"b1846d92-9079-4edd-9c41-fdfba9354a2a\" (UID: \"b1846d92-9079-4edd-9c41-fdfba9354a2a\") "
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.785179 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1846d92-9079-4edd-9c41-fdfba9354a2a-ovsdbserver-nb\") pod \"b1846d92-9079-4edd-9c41-fdfba9354a2a\" (UID: \"b1846d92-9079-4edd-9c41-fdfba9354a2a\") "
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.785239 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1846d92-9079-4edd-9c41-fdfba9354a2a-ovsdbserver-sb\") pod \"b1846d92-9079-4edd-9c41-fdfba9354a2a\" (UID: \"b1846d92-9079-4edd-9c41-fdfba9354a2a\") "
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.785368 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1846d92-9079-4edd-9c41-fdfba9354a2a-config\") pod \"b1846d92-9079-4edd-9c41-fdfba9354a2a\" (UID: \"b1846d92-9079-4edd-9c41-fdfba9354a2a\") "
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.785455 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b1846d92-9079-4edd-9c41-fdfba9354a2a-dns-swift-storage-0\") pod \"b1846d92-9079-4edd-9c41-fdfba9354a2a\" (UID: \"b1846d92-9079-4edd-9c41-fdfba9354a2a\") "
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.802129 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1846d92-9079-4edd-9c41-fdfba9354a2a-kube-api-access-ft5lv" (OuterVolumeSpecName: "kube-api-access-ft5lv") pod "b1846d92-9079-4edd-9c41-fdfba9354a2a" (UID: "b1846d92-9079-4edd-9c41-fdfba9354a2a"). InnerVolumeSpecName "kube-api-access-ft5lv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.870067 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1846d92-9079-4edd-9c41-fdfba9354a2a-config" (OuterVolumeSpecName: "config") pod "b1846d92-9079-4edd-9c41-fdfba9354a2a" (UID: "b1846d92-9079-4edd-9c41-fdfba9354a2a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.871249 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1846d92-9079-4edd-9c41-fdfba9354a2a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b1846d92-9079-4edd-9c41-fdfba9354a2a" (UID: "b1846d92-9079-4edd-9c41-fdfba9354a2a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.873868 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1846d92-9079-4edd-9c41-fdfba9354a2a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b1846d92-9079-4edd-9c41-fdfba9354a2a" (UID: "b1846d92-9079-4edd-9c41-fdfba9354a2a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.887853 4751 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b1846d92-9079-4edd-9c41-fdfba9354a2a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.887922 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ft5lv\" (UniqueName: \"kubernetes.io/projected/b1846d92-9079-4edd-9c41-fdfba9354a2a-kube-api-access-ft5lv\") on node \"crc\" DevicePath \"\""
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.887933 4751 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1846d92-9079-4edd-9c41-fdfba9354a2a-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.887942 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1846d92-9079-4edd-9c41-fdfba9354a2a-config\") on node \"crc\" DevicePath \"\""
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.893070 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1846d92-9079-4edd-9c41-fdfba9354a2a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b1846d92-9079-4edd-9c41-fdfba9354a2a" (UID: "b1846d92-9079-4edd-9c41-fdfba9354a2a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.904856 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1846d92-9079-4edd-9c41-fdfba9354a2a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b1846d92-9079-4edd-9c41-fdfba9354a2a" (UID: "b1846d92-9079-4edd-9c41-fdfba9354a2a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.990588 4751 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1846d92-9079-4edd-9c41-fdfba9354a2a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 03 14:36:35 crc kubenswrapper[4751]: I1203 14:36:35.990620 4751 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1846d92-9079-4edd-9c41-fdfba9354a2a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 03 14:36:36 crc kubenswrapper[4751]: I1203 14:36:36.009923 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"e5525ea2-aea2-4432-9ba6-db64b2b2f3da","Type":"ContainerStarted","Data":"903b182090200583941ca11a71a154606fe827131614a059f77636acbb841198"}
Dec 03 14:36:36 crc kubenswrapper[4751]: I1203 14:36:36.010279 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0"
Dec 03 14:36:36 crc kubenswrapper[4751]: I1203 14:36:36.012138 4751 generic.go:334] "Generic (PLEG): container finished" podID="f7f79d9f-538b-49fc-858d-169a04c6819e" containerID="27841f4fb675967760579ec19a6f06485f884e280b1acff64e5af196383c0698" exitCode=0
Dec 03 14:36:36 crc kubenswrapper[4751]: I1203 14:36:36.012185 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58bd69657f-28zsl" event={"ID":"f7f79d9f-538b-49fc-858d-169a04c6819e","Type":"ContainerDied","Data":"27841f4fb675967760579ec19a6f06485f884e280b1acff64e5af196383c0698"}
Dec 03 14:36:36 crc kubenswrapper[4751]: I1203 14:36:36.018409 4751 generic.go:334] "Generic (PLEG): container finished" podID="b1846d92-9079-4edd-9c41-fdfba9354a2a" containerID="9fa0d9eeba071d185f96913a76b446e1994cdd1842793eeab9e0cf97d3a3c92e" exitCode=0
Dec 03 14:36:36 crc kubenswrapper[4751]: I1203 14:36:36.018656 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-j8j2p"
Dec 03 14:36:36 crc kubenswrapper[4751]: I1203 14:36:36.018628 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-j8j2p" event={"ID":"b1846d92-9079-4edd-9c41-fdfba9354a2a","Type":"ContainerDied","Data":"9fa0d9eeba071d185f96913a76b446e1994cdd1842793eeab9e0cf97d3a3c92e"}
Dec 03 14:36:36 crc kubenswrapper[4751]: I1203 14:36:36.018832 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-j8j2p" event={"ID":"b1846d92-9079-4edd-9c41-fdfba9354a2a","Type":"ContainerDied","Data":"8405fb6e7868a07f7fdb87d891a13004adeea73c99d6cd47872e0eb329a8eadb"}
Dec 03 14:36:36 crc kubenswrapper[4751]: I1203 14:36:36.018850 4751 scope.go:117] "RemoveContainer" containerID="9fa0d9eeba071d185f96913a76b446e1994cdd1842793eeab9e0cf97d3a3c92e"
Dec 03 14:36:36 crc kubenswrapper[4751]: I1203 14:36:36.033136 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=3.033117614 podStartE2EDuration="3.033117614s" podCreationTimestamp="2025-12-03 14:36:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:36:36.031662455 +0000 UTC m=+1403.020017682" watchObservedRunningTime="2025-12-03 14:36:36.033117614 +0000 UTC m=+1403.021472831"
Dec 03 14:36:36 crc kubenswrapper[4751]: I1203 14:36:36.088881 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-j8j2p"]
Dec 03 14:36:36 crc kubenswrapper[4751]: I1203 14:36:36.113239 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-j8j2p"]
Dec 03 14:36:36 crc kubenswrapper[4751]: I1203 14:36:36.163703 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 14:36:36 crc kubenswrapper[4751]: I1203 14:36:36.296877 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"]
Dec 03 14:36:36 crc kubenswrapper[4751]: I1203 14:36:36.503429 4751 scope.go:117] "RemoveContainer" containerID="3425e72cc4c0ecc76b0e13930d437b6b47454c0065b39adaef7423c4f7d766d1"
Dec 03 14:36:36 crc kubenswrapper[4751]: I1203 14:36:36.697659 4751 scope.go:117] "RemoveContainer" containerID="9fa0d9eeba071d185f96913a76b446e1994cdd1842793eeab9e0cf97d3a3c92e"
Dec 03 14:36:36 crc kubenswrapper[4751]: E1203 14:36:36.698133 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fa0d9eeba071d185f96913a76b446e1994cdd1842793eeab9e0cf97d3a3c92e\": container with ID starting with 9fa0d9eeba071d185f96913a76b446e1994cdd1842793eeab9e0cf97d3a3c92e not found: ID does not exist" containerID="9fa0d9eeba071d185f96913a76b446e1994cdd1842793eeab9e0cf97d3a3c92e"
Dec 03 14:36:36 crc kubenswrapper[4751]: I1203 14:36:36.698173 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fa0d9eeba071d185f96913a76b446e1994cdd1842793eeab9e0cf97d3a3c92e"} err="failed to get container status \"9fa0d9eeba071d185f96913a76b446e1994cdd1842793eeab9e0cf97d3a3c92e\": rpc error: code = NotFound desc = could not find container \"9fa0d9eeba071d185f96913a76b446e1994cdd1842793eeab9e0cf97d3a3c92e\": container with ID starting with 9fa0d9eeba071d185f96913a76b446e1994cdd1842793eeab9e0cf97d3a3c92e not found: ID does not exist"
Dec 03 14:36:36 crc kubenswrapper[4751]: I1203 14:36:36.698216 4751 scope.go:117] "RemoveContainer" containerID="3425e72cc4c0ecc76b0e13930d437b6b47454c0065b39adaef7423c4f7d766d1"
Dec 03 14:36:36 crc kubenswrapper[4751]: E1203 14:36:36.698462 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3425e72cc4c0ecc76b0e13930d437b6b47454c0065b39adaef7423c4f7d766d1\": container with ID starting with 3425e72cc4c0ecc76b0e13930d437b6b47454c0065b39adaef7423c4f7d766d1 not found: ID does not exist" containerID="3425e72cc4c0ecc76b0e13930d437b6b47454c0065b39adaef7423c4f7d766d1"
Dec 03 14:36:36 crc kubenswrapper[4751]: I1203 14:36:36.698503 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3425e72cc4c0ecc76b0e13930d437b6b47454c0065b39adaef7423c4f7d766d1"} err="failed to get container status \"3425e72cc4c0ecc76b0e13930d437b6b47454c0065b39adaef7423c4f7d766d1\": rpc error: code = NotFound desc = could not find container \"3425e72cc4c0ecc76b0e13930d437b6b47454c0065b39adaef7423c4f7d766d1\": container with ID starting with 3425e72cc4c0ecc76b0e13930d437b6b47454c0065b39adaef7423c4f7d766d1 not found: ID does not exist"
Dec 03 14:36:37 crc kubenswrapper[4751]: I1203 14:36:37.029084 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"c3d433ac-6236-4b80-96f5-9047aed57c8f","Type":"ContainerStarted","Data":"e099f63df6df1705cdfe575bd5adc863e392aac4ccd1fa4f0469f71e130facad"}
Dec 03 14:36:37 crc kubenswrapper[4751]: I1203 14:36:37.031551 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58bd69657f-28zsl" event={"ID":"f7f79d9f-538b-49fc-858d-169a04c6819e","Type":"ContainerStarted","Data":"2e4d51d2b8775307fa843835316554a7f0416e9814368080a84097f53368e022"}
Dec 03 14:36:37 crc kubenswrapper[4751]: I1203 14:36:37.031851 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58bd69657f-28zsl"
Dec 03 14:36:37 crc kubenswrapper[4751]: I1203 14:36:37.034791 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0cff45f-8db2-44d7-9723-677508c4c442","Type":"ContainerStarted","Data":"e45df8f81575c2844bd3abf9a0808a8b86a85ed91293d87fed92f5e964e091ee"}
Dec 03 14:36:37 crc kubenswrapper[4751]: I1203 14:36:37.049839 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.792912621 podStartE2EDuration="5.049819177s" podCreationTimestamp="2025-12-03 14:36:32 +0000 UTC" firstStartedPulling="2025-12-03 14:36:34.310904173 +0000 UTC m=+1401.299259390" lastFinishedPulling="2025-12-03 14:36:36.567810729 +0000 UTC m=+1403.556165946" observedRunningTime="2025-12-03 14:36:37.043218087 +0000 UTC m=+1404.031573304" watchObservedRunningTime="2025-12-03 14:36:37.049819177 +0000 UTC m=+1404.038174394"
Dec 03 14:36:37 crc kubenswrapper[4751]: I1203 14:36:37.080198 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"]
Dec 03 14:36:37 crc kubenswrapper[4751]: I1203 14:36:37.090114 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58bd69657f-28zsl" podStartSLOduration=5.090095985 podStartE2EDuration="5.090095985s" podCreationTimestamp="2025-12-03 14:36:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:36:37.066489851 +0000 UTC m=+1404.054845078" watchObservedRunningTime="2025-12-03 14:36:37.090095985 +0000 UTC m=+1404.078451202"
Dec 03 14:36:37 crc kubenswrapper[4751]: I1203 14:36:37.326771 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1846d92-9079-4edd-9c41-fdfba9354a2a" path="/var/lib/kubelet/pods/b1846d92-9079-4edd-9c41-fdfba9354a2a/volumes"
Dec 03 14:36:38 crc kubenswrapper[4751]: I1203 14:36:38.061960 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="e5525ea2-aea2-4432-9ba6-db64b2b2f3da" containerName="cloudkitty-api-log" containerID="cri-o://005633cfa5919c5f9ac3650de6e2ce8a06eac7b5cd0cecfb58ae492a0c366e01" gracePeriod=30
Dec 03 14:36:38 crc kubenswrapper[4751]: I1203 14:36:38.062355 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0cff45f-8db2-44d7-9723-677508c4c442","Type":"ContainerStarted","Data":"24003421e57923c6199a8e1dfb412c6d702c4649da37c29574a520e9fa6f3964"}
Dec 03 14:36:38 crc kubenswrapper[4751]: I1203 14:36:38.062380 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0cff45f-8db2-44d7-9723-677508c4c442","Type":"ContainerStarted","Data":"140d4f247f184cab56abb6c9a63b668e98b54b136f9c22a120bc01761693764f"}
Dec 03 14:36:38 crc kubenswrapper[4751]: I1203 14:36:38.062766 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="e5525ea2-aea2-4432-9ba6-db64b2b2f3da" containerName="cloudkitty-api" containerID="cri-o://903b182090200583941ca11a71a154606fe827131614a059f77636acbb841198" gracePeriod=30
Dec 03 14:36:38 crc kubenswrapper[4751]: I1203 14:36:38.099655 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Dec 03 14:36:38 crc kubenswrapper[4751]: I1203 14:36:38.441837 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Dec 03 14:36:39 crc kubenswrapper[4751]: I1203 14:36:39.072913 4751 generic.go:334] "Generic (PLEG): container finished" podID="e5525ea2-aea2-4432-9ba6-db64b2b2f3da" containerID="903b182090200583941ca11a71a154606fe827131614a059f77636acbb841198" exitCode=0
Dec 03 14:36:39 crc kubenswrapper[4751]: I1203 14:36:39.073224 4751 generic.go:334] "Generic (PLEG): container finished" podID="e5525ea2-aea2-4432-9ba6-db64b2b2f3da" containerID="005633cfa5919c5f9ac3650de6e2ce8a06eac7b5cd0cecfb58ae492a0c366e01" exitCode=143
Dec 03 14:36:39 crc kubenswrapper[4751]: I1203 14:36:39.072980 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"e5525ea2-aea2-4432-9ba6-db64b2b2f3da","Type":"ContainerDied","Data":"903b182090200583941ca11a71a154606fe827131614a059f77636acbb841198"}
Dec 03 14:36:39 crc kubenswrapper[4751]: I1203 14:36:39.073289 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"e5525ea2-aea2-4432-9ba6-db64b2b2f3da","Type":"ContainerDied","Data":"005633cfa5919c5f9ac3650de6e2ce8a06eac7b5cd0cecfb58ae492a0c366e01"}
Dec 03 14:36:39 crc kubenswrapper[4751]: I1203 14:36:39.073504 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="c3d433ac-6236-4b80-96f5-9047aed57c8f" containerName="cloudkitty-proc" containerID="cri-o://e099f63df6df1705cdfe575bd5adc863e392aac4ccd1fa4f0469f71e130facad" gracePeriod=30
Dec 03 14:36:39 crc kubenswrapper[4751]: I1203 14:36:39.124055 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.026785 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0"
Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.083007 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-scripts\") pod \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\" (UID: \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\") "
Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.083087 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-config-data-custom\") pod \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\" (UID: \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\") "
Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.083139 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-certs\") pod \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\" (UID: \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\") "
Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.083210 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-logs\") pod \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\" (UID: \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\") "
Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.083245 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-combined-ca-bundle\") pod \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\" (UID: \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\") "
Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.083284 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6cr9\" (UniqueName: \"kubernetes.io/projected/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-kube-api-access-x6cr9\") pod \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\" (UID: \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\") "
Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.083361 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-config-data\") pod \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\" (UID: \"e5525ea2-aea2-4432-9ba6-db64b2b2f3da\") "
Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.086018 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-logs" (OuterVolumeSpecName: "logs") pod "e5525ea2-aea2-4432-9ba6-db64b2b2f3da" (UID: "e5525ea2-aea2-4432-9ba6-db64b2b2f3da"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.090776 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-kube-api-access-x6cr9" (OuterVolumeSpecName: "kube-api-access-x6cr9") pod "e5525ea2-aea2-4432-9ba6-db64b2b2f3da" (UID: "e5525ea2-aea2-4432-9ba6-db64b2b2f3da"). InnerVolumeSpecName "kube-api-access-x6cr9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.090883 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-certs" (OuterVolumeSpecName: "certs") pod "e5525ea2-aea2-4432-9ba6-db64b2b2f3da" (UID: "e5525ea2-aea2-4432-9ba6-db64b2b2f3da"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.091241 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"e5525ea2-aea2-4432-9ba6-db64b2b2f3da","Type":"ContainerDied","Data":"647721f8ec01ae8dc8457ce7ef851e0fc21d0e23405d0a7acfefab35c76085ef"}
Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.091293 4751 scope.go:117] "RemoveContainer" containerID="903b182090200583941ca11a71a154606fe827131614a059f77636acbb841198"
Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.091293 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0"
Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.091700 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="f903006b-ff42-40bb-9572-6e907afa1f0e" containerName="cinder-scheduler" containerID="cri-o://d0dff38102c7a63859cc8440486910495f137186eb637a0d2a2a8a607f5bc292" gracePeriod=30
Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.091823 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="f903006b-ff42-40bb-9572-6e907afa1f0e" containerName="probe" containerID="cri-o://e70bdc7d7024497428956ee659ef11f0380f37bdc30818352b8694fdc59494fb" gracePeriod=30
Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.098616 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e5525ea2-aea2-4432-9ba6-db64b2b2f3da" (UID: "e5525ea2-aea2-4432-9ba6-db64b2b2f3da"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.101452 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-scripts" (OuterVolumeSpecName: "scripts") pod "e5525ea2-aea2-4432-9ba6-db64b2b2f3da" (UID: "e5525ea2-aea2-4432-9ba6-db64b2b2f3da"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.119226 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5525ea2-aea2-4432-9ba6-db64b2b2f3da" (UID: "e5525ea2-aea2-4432-9ba6-db64b2b2f3da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.127143 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-config-data" (OuterVolumeSpecName: "config-data") pod "e5525ea2-aea2-4432-9ba6-db64b2b2f3da" (UID: "e5525ea2-aea2-4432-9ba6-db64b2b2f3da"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.185311 4751 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.185365 4751 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-certs\") on node \"crc\" DevicePath \"\""
Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.185380 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-logs\") on node \"crc\" DevicePath \"\""
Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.185391 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.185402 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6cr9\" (UniqueName: \"kubernetes.io/projected/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-kube-api-access-x6cr9\") on node \"crc\" DevicePath \"\""
Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.185415 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName:
\"kubernetes.io/secret/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.185425 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5525ea2-aea2-4432-9ba6-db64b2b2f3da-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.217891 4751 scope.go:117] "RemoveContainer" containerID="005633cfa5919c5f9ac3650de6e2ce8a06eac7b5cd0cecfb58ae492a0c366e01" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.423469 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.431903 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"] Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.450168 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Dec 03 14:36:40 crc kubenswrapper[4751]: E1203 14:36:40.450559 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1846d92-9079-4edd-9c41-fdfba9354a2a" containerName="dnsmasq-dns" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.450575 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1846d92-9079-4edd-9c41-fdfba9354a2a" containerName="dnsmasq-dns" Dec 03 14:36:40 crc kubenswrapper[4751]: E1203 14:36:40.450592 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5525ea2-aea2-4432-9ba6-db64b2b2f3da" containerName="cloudkitty-api-log" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.450598 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5525ea2-aea2-4432-9ba6-db64b2b2f3da" containerName="cloudkitty-api-log" Dec 03 14:36:40 crc kubenswrapper[4751]: E1203 14:36:40.450606 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1846d92-9079-4edd-9c41-fdfba9354a2a" containerName="init" Dec 03 14:36:40 crc 
kubenswrapper[4751]: I1203 14:36:40.450613 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1846d92-9079-4edd-9c41-fdfba9354a2a" containerName="init" Dec 03 14:36:40 crc kubenswrapper[4751]: E1203 14:36:40.450641 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5525ea2-aea2-4432-9ba6-db64b2b2f3da" containerName="cloudkitty-api" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.450646 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5525ea2-aea2-4432-9ba6-db64b2b2f3da" containerName="cloudkitty-api" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.450819 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1846d92-9079-4edd-9c41-fdfba9354a2a" containerName="dnsmasq-dns" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.450835 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5525ea2-aea2-4432-9ba6-db64b2b2f3da" containerName="cloudkitty-api-log" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.450843 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5525ea2-aea2-4432-9ba6-db64b2b2f3da" containerName="cloudkitty-api" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.451839 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.454610 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.455490 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-internal-svc" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.463601 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.464457 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.470831 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.595786 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.595841 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntzlv\" (UniqueName: \"kubernetes.io/projected/52f84a98-0911-42a1-a4f3-7858eb75ea86-kube-api-access-ntzlv\") pod \"cloudkitty-api-0\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.595920 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-scripts\") pod \"cloudkitty-api-0\" (UID: 
\"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.595955 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-config-data\") pod \"cloudkitty-api-0\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.595987 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52f84a98-0911-42a1-a4f3-7858eb75ea86-logs\") pod \"cloudkitty-api-0\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.596171 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.596265 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.596337 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:40 crc 
kubenswrapper[4751]: I1203 14:36:40.596418 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/52f84a98-0911-42a1-a4f3-7858eb75ea86-certs\") pod \"cloudkitty-api-0\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.699838 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.699914 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.699974 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/52f84a98-0911-42a1-a4f3-7858eb75ea86-certs\") pod \"cloudkitty-api-0\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.700034 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.700069 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntzlv\" (UniqueName: 
\"kubernetes.io/projected/52f84a98-0911-42a1-a4f3-7858eb75ea86-kube-api-access-ntzlv\") pod \"cloudkitty-api-0\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.700125 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-scripts\") pod \"cloudkitty-api-0\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.700168 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-config-data\") pod \"cloudkitty-api-0\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.700201 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52f84a98-0911-42a1-a4f3-7858eb75ea86-logs\") pod \"cloudkitty-api-0\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.700262 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.701802 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52f84a98-0911-42a1-a4f3-7858eb75ea86-logs\") pod \"cloudkitty-api-0\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 
14:36:40.706964 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/52f84a98-0911-42a1-a4f3-7858eb75ea86-certs\") pod \"cloudkitty-api-0\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.707115 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.707480 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-config-data\") pod \"cloudkitty-api-0\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.707584 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.707905 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.708387 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-config-data-custom\") pod 
\"cloudkitty-api-0\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.710579 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-scripts\") pod \"cloudkitty-api-0\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.721857 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntzlv\" (UniqueName: \"kubernetes.io/projected/52f84a98-0911-42a1-a4f3-7858eb75ea86-kube-api-access-ntzlv\") pod \"cloudkitty-api-0\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " pod="openstack/cloudkitty-api-0" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.770962 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.839947 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tmdtc" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.840145 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tmdtc" Dec 03 14:36:40 crc kubenswrapper[4751]: I1203 14:36:40.910200 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tmdtc" Dec 03 14:36:41 crc kubenswrapper[4751]: I1203 14:36:41.110079 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0cff45f-8db2-44d7-9723-677508c4c442","Type":"ContainerStarted","Data":"d68df586dcf7a42d897b5e6a35592b0dc3b7f1bcdbf2920ccf45275ac032d959"} Dec 03 14:36:41 crc kubenswrapper[4751]: I1203 14:36:41.113097 4751 generic.go:334] "Generic (PLEG): container finished" 
podID="f903006b-ff42-40bb-9572-6e907afa1f0e" containerID="e70bdc7d7024497428956ee659ef11f0380f37bdc30818352b8694fdc59494fb" exitCode=0 Dec 03 14:36:41 crc kubenswrapper[4751]: I1203 14:36:41.113132 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f903006b-ff42-40bb-9572-6e907afa1f0e","Type":"ContainerDied","Data":"e70bdc7d7024497428956ee659ef11f0380f37bdc30818352b8694fdc59494fb"} Dec 03 14:36:41 crc kubenswrapper[4751]: I1203 14:36:41.168833 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tmdtc" Dec 03 14:36:41 crc kubenswrapper[4751]: I1203 14:36:41.353458 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5525ea2-aea2-4432-9ba6-db64b2b2f3da" path="/var/lib/kubelet/pods/e5525ea2-aea2-4432-9ba6-db64b2b2f3da/volumes" Dec 03 14:36:41 crc kubenswrapper[4751]: I1203 14:36:41.386513 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Dec 03 14:36:41 crc kubenswrapper[4751]: I1203 14:36:41.620975 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 14:36:41 crc kubenswrapper[4751]: I1203 14:36:41.742311 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f903006b-ff42-40bb-9572-6e907afa1f0e-combined-ca-bundle\") pod \"f903006b-ff42-40bb-9572-6e907afa1f0e\" (UID: \"f903006b-ff42-40bb-9572-6e907afa1f0e\") " Dec 03 14:36:41 crc kubenswrapper[4751]: I1203 14:36:41.742654 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f903006b-ff42-40bb-9572-6e907afa1f0e-config-data-custom\") pod \"f903006b-ff42-40bb-9572-6e907afa1f0e\" (UID: \"f903006b-ff42-40bb-9572-6e907afa1f0e\") " Dec 03 14:36:41 crc kubenswrapper[4751]: I1203 14:36:41.742689 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wx7fr\" (UniqueName: \"kubernetes.io/projected/f903006b-ff42-40bb-9572-6e907afa1f0e-kube-api-access-wx7fr\") pod \"f903006b-ff42-40bb-9572-6e907afa1f0e\" (UID: \"f903006b-ff42-40bb-9572-6e907afa1f0e\") " Dec 03 14:36:41 crc kubenswrapper[4751]: I1203 14:36:41.742810 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f903006b-ff42-40bb-9572-6e907afa1f0e-scripts\") pod \"f903006b-ff42-40bb-9572-6e907afa1f0e\" (UID: \"f903006b-ff42-40bb-9572-6e907afa1f0e\") " Dec 03 14:36:41 crc kubenswrapper[4751]: I1203 14:36:41.743016 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f903006b-ff42-40bb-9572-6e907afa1f0e-config-data\") pod \"f903006b-ff42-40bb-9572-6e907afa1f0e\" (UID: \"f903006b-ff42-40bb-9572-6e907afa1f0e\") " Dec 03 14:36:41 crc kubenswrapper[4751]: I1203 14:36:41.743074 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/f903006b-ff42-40bb-9572-6e907afa1f0e-etc-machine-id\") pod \"f903006b-ff42-40bb-9572-6e907afa1f0e\" (UID: \"f903006b-ff42-40bb-9572-6e907afa1f0e\") " Dec 03 14:36:41 crc kubenswrapper[4751]: I1203 14:36:41.743371 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f903006b-ff42-40bb-9572-6e907afa1f0e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f903006b-ff42-40bb-9572-6e907afa1f0e" (UID: "f903006b-ff42-40bb-9572-6e907afa1f0e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:36:41 crc kubenswrapper[4751]: I1203 14:36:41.743695 4751 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f903006b-ff42-40bb-9572-6e907afa1f0e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:41 crc kubenswrapper[4751]: I1203 14:36:41.749498 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f903006b-ff42-40bb-9572-6e907afa1f0e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f903006b-ff42-40bb-9572-6e907afa1f0e" (UID: "f903006b-ff42-40bb-9572-6e907afa1f0e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:41 crc kubenswrapper[4751]: I1203 14:36:41.749562 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f903006b-ff42-40bb-9572-6e907afa1f0e-kube-api-access-wx7fr" (OuterVolumeSpecName: "kube-api-access-wx7fr") pod "f903006b-ff42-40bb-9572-6e907afa1f0e" (UID: "f903006b-ff42-40bb-9572-6e907afa1f0e"). InnerVolumeSpecName "kube-api-access-wx7fr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:36:41 crc kubenswrapper[4751]: I1203 14:36:41.751289 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f903006b-ff42-40bb-9572-6e907afa1f0e-scripts" (OuterVolumeSpecName: "scripts") pod "f903006b-ff42-40bb-9572-6e907afa1f0e" (UID: "f903006b-ff42-40bb-9572-6e907afa1f0e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:41 crc kubenswrapper[4751]: I1203 14:36:41.822745 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f903006b-ff42-40bb-9572-6e907afa1f0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f903006b-ff42-40bb-9572-6e907afa1f0e" (UID: "f903006b-ff42-40bb-9572-6e907afa1f0e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:41 crc kubenswrapper[4751]: I1203 14:36:41.855790 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f903006b-ff42-40bb-9572-6e907afa1f0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:41 crc kubenswrapper[4751]: I1203 14:36:41.855882 4751 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f903006b-ff42-40bb-9572-6e907afa1f0e-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:41 crc kubenswrapper[4751]: I1203 14:36:41.855904 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wx7fr\" (UniqueName: \"kubernetes.io/projected/f903006b-ff42-40bb-9572-6e907afa1f0e-kube-api-access-wx7fr\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:41 crc kubenswrapper[4751]: I1203 14:36:41.855942 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f903006b-ff42-40bb-9572-6e907afa1f0e-scripts\") on node \"crc\" DevicePath \"\"" 
Dec 03 14:36:41 crc kubenswrapper[4751]: I1203 14:36:41.889575 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f903006b-ff42-40bb-9572-6e907afa1f0e-config-data" (OuterVolumeSpecName: "config-data") pod "f903006b-ff42-40bb-9572-6e907afa1f0e" (UID: "f903006b-ff42-40bb-9572-6e907afa1f0e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:41 crc kubenswrapper[4751]: I1203 14:36:41.957387 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f903006b-ff42-40bb-9572-6e907afa1f0e-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.072347 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tmdtc"] Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.129792 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0cff45f-8db2-44d7-9723-677508c4c442","Type":"ContainerStarted","Data":"474b632ca34764eba6a79153a2f8678f814f73937e091cbc22f1cda425c0974e"} Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.129997 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.131575 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"52f84a98-0911-42a1-a4f3-7858eb75ea86","Type":"ContainerStarted","Data":"8b5361f1714a7980ef279045418b5f4a79da87eb1b65a2d506a37c93cb5d7959"} Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.131619 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"52f84a98-0911-42a1-a4f3-7858eb75ea86","Type":"ContainerStarted","Data":"b74b9c4518fb2187117167de7950c84efc7879518ae211eaa67f28da8641e938"} Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.131630 4751 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"52f84a98-0911-42a1-a4f3-7858eb75ea86","Type":"ContainerStarted","Data":"43883af2bb3344d8f1940d8bc373e57dbf99d3f44fcfbebc547d522cc19bfb95"} Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.131676 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.133281 4751 generic.go:334] "Generic (PLEG): container finished" podID="f903006b-ff42-40bb-9572-6e907afa1f0e" containerID="d0dff38102c7a63859cc8440486910495f137186eb637a0d2a2a8a607f5bc292" exitCode=0 Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.133517 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f903006b-ff42-40bb-9572-6e907afa1f0e","Type":"ContainerDied","Data":"d0dff38102c7a63859cc8440486910495f137186eb637a0d2a2a8a607f5bc292"} Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.133560 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f903006b-ff42-40bb-9572-6e907afa1f0e","Type":"ContainerDied","Data":"32a8cc6022da47b15ab8e064c6bbdb22f839808a21fc154a18c0bbfa202e8e10"} Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.133586 4751 scope.go:117] "RemoveContainer" containerID="e70bdc7d7024497428956ee659ef11f0380f37bdc30818352b8694fdc59494fb" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.133636 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.159043 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.786466149 podStartE2EDuration="7.159015257s" podCreationTimestamp="2025-12-03 14:36:35 +0000 UTC" firstStartedPulling="2025-12-03 14:36:36.505737927 +0000 UTC m=+1403.494093144" lastFinishedPulling="2025-12-03 14:36:41.878287035 +0000 UTC m=+1408.866642252" observedRunningTime="2025-12-03 14:36:42.152467919 +0000 UTC m=+1409.140823136" watchObservedRunningTime="2025-12-03 14:36:42.159015257 +0000 UTC m=+1409.147370484" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.172507 4751 scope.go:117] "RemoveContainer" containerID="d0dff38102c7a63859cc8440486910495f137186eb637a0d2a2a8a607f5bc292" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.192962 4751 scope.go:117] "RemoveContainer" containerID="e70bdc7d7024497428956ee659ef11f0380f37bdc30818352b8694fdc59494fb" Dec 03 14:36:42 crc kubenswrapper[4751]: E1203 14:36:42.193566 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e70bdc7d7024497428956ee659ef11f0380f37bdc30818352b8694fdc59494fb\": container with ID starting with e70bdc7d7024497428956ee659ef11f0380f37bdc30818352b8694fdc59494fb not found: ID does not exist" containerID="e70bdc7d7024497428956ee659ef11f0380f37bdc30818352b8694fdc59494fb" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.193626 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e70bdc7d7024497428956ee659ef11f0380f37bdc30818352b8694fdc59494fb"} err="failed to get container status \"e70bdc7d7024497428956ee659ef11f0380f37bdc30818352b8694fdc59494fb\": rpc error: code = NotFound desc = could not find container \"e70bdc7d7024497428956ee659ef11f0380f37bdc30818352b8694fdc59494fb\": container with ID starting with 
e70bdc7d7024497428956ee659ef11f0380f37bdc30818352b8694fdc59494fb not found: ID does not exist" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.193658 4751 scope.go:117] "RemoveContainer" containerID="d0dff38102c7a63859cc8440486910495f137186eb637a0d2a2a8a607f5bc292" Dec 03 14:36:42 crc kubenswrapper[4751]: E1203 14:36:42.193968 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0dff38102c7a63859cc8440486910495f137186eb637a0d2a2a8a607f5bc292\": container with ID starting with d0dff38102c7a63859cc8440486910495f137186eb637a0d2a2a8a607f5bc292 not found: ID does not exist" containerID="d0dff38102c7a63859cc8440486910495f137186eb637a0d2a2a8a607f5bc292" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.194004 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0dff38102c7a63859cc8440486910495f137186eb637a0d2a2a8a607f5bc292"} err="failed to get container status \"d0dff38102c7a63859cc8440486910495f137186eb637a0d2a2a8a607f5bc292\": rpc error: code = NotFound desc = could not find container \"d0dff38102c7a63859cc8440486910495f137186eb637a0d2a2a8a607f5bc292\": container with ID starting with d0dff38102c7a63859cc8440486910495f137186eb637a0d2a2a8a607f5bc292 not found: ID does not exist" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.198488 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.220417 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.234813 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=2.234789742 podStartE2EDuration="2.234789742s" podCreationTimestamp="2025-12-03 14:36:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:36:42.200000284 +0000 UTC m=+1409.188355501" watchObservedRunningTime="2025-12-03 14:36:42.234789742 +0000 UTC m=+1409.223144959" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.276386 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 14:36:42 crc kubenswrapper[4751]: E1203 14:36:42.277236 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f903006b-ff42-40bb-9572-6e907afa1f0e" containerName="probe" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.277257 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f903006b-ff42-40bb-9572-6e907afa1f0e" containerName="probe" Dec 03 14:36:42 crc kubenswrapper[4751]: E1203 14:36:42.277309 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f903006b-ff42-40bb-9572-6e907afa1f0e" containerName="cinder-scheduler" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.277319 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f903006b-ff42-40bb-9572-6e907afa1f0e" containerName="cinder-scheduler" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.277574 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="f903006b-ff42-40bb-9572-6e907afa1f0e" containerName="probe" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.277600 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="f903006b-ff42-40bb-9572-6e907afa1f0e" containerName="cinder-scheduler" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.279014 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.281680 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.301633 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.365393 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28a23243-c107-4c51-96f0-82db8946b245-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"28a23243-c107-4c51-96f0-82db8946b245\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.365551 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28a23243-c107-4c51-96f0-82db8946b245-scripts\") pod \"cinder-scheduler-0\" (UID: \"28a23243-c107-4c51-96f0-82db8946b245\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.365601 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28a23243-c107-4c51-96f0-82db8946b245-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"28a23243-c107-4c51-96f0-82db8946b245\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.365615 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28a23243-c107-4c51-96f0-82db8946b245-config-data\") pod \"cinder-scheduler-0\" (UID: \"28a23243-c107-4c51-96f0-82db8946b245\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.365636 4751 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn6cp\" (UniqueName: \"kubernetes.io/projected/28a23243-c107-4c51-96f0-82db8946b245-kube-api-access-cn6cp\") pod \"cinder-scheduler-0\" (UID: \"28a23243-c107-4c51-96f0-82db8946b245\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.365652 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28a23243-c107-4c51-96f0-82db8946b245-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"28a23243-c107-4c51-96f0-82db8946b245\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.467091 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28a23243-c107-4c51-96f0-82db8946b245-scripts\") pod \"cinder-scheduler-0\" (UID: \"28a23243-c107-4c51-96f0-82db8946b245\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.467197 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28a23243-c107-4c51-96f0-82db8946b245-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"28a23243-c107-4c51-96f0-82db8946b245\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.467223 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28a23243-c107-4c51-96f0-82db8946b245-config-data\") pod \"cinder-scheduler-0\" (UID: \"28a23243-c107-4c51-96f0-82db8946b245\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.467252 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cn6cp\" (UniqueName: 
\"kubernetes.io/projected/28a23243-c107-4c51-96f0-82db8946b245-kube-api-access-cn6cp\") pod \"cinder-scheduler-0\" (UID: \"28a23243-c107-4c51-96f0-82db8946b245\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.467275 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28a23243-c107-4c51-96f0-82db8946b245-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"28a23243-c107-4c51-96f0-82db8946b245\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.467350 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28a23243-c107-4c51-96f0-82db8946b245-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"28a23243-c107-4c51-96f0-82db8946b245\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.467639 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28a23243-c107-4c51-96f0-82db8946b245-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"28a23243-c107-4c51-96f0-82db8946b245\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.473255 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28a23243-c107-4c51-96f0-82db8946b245-config-data\") pod \"cinder-scheduler-0\" (UID: \"28a23243-c107-4c51-96f0-82db8946b245\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.473880 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28a23243-c107-4c51-96f0-82db8946b245-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"28a23243-c107-4c51-96f0-82db8946b245\") " 
pod="openstack/cinder-scheduler-0" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.474443 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28a23243-c107-4c51-96f0-82db8946b245-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"28a23243-c107-4c51-96f0-82db8946b245\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.489734 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28a23243-c107-4c51-96f0-82db8946b245-scripts\") pod \"cinder-scheduler-0\" (UID: \"28a23243-c107-4c51-96f0-82db8946b245\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.493395 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn6cp\" (UniqueName: \"kubernetes.io/projected/28a23243-c107-4c51-96f0-82db8946b245-kube-api-access-cn6cp\") pod \"cinder-scheduler-0\" (UID: \"28a23243-c107-4c51-96f0-82db8946b245\") " pod="openstack/cinder-scheduler-0" Dec 03 14:36:42 crc kubenswrapper[4751]: I1203 14:36:42.613954 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 14:36:43 crc kubenswrapper[4751]: I1203 14:36:43.093152 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 14:36:43 crc kubenswrapper[4751]: I1203 14:36:43.143567 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"28a23243-c107-4c51-96f0-82db8946b245","Type":"ContainerStarted","Data":"14b467b1f0b96eb36501c05b395be8e7f9f155ee174595fdcb31d2ea5846a69c"} Dec 03 14:36:43 crc kubenswrapper[4751]: I1203 14:36:43.146165 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tmdtc" podUID="3506627c-9636-4af8-ad7f-78db94f0ff11" containerName="registry-server" containerID="cri-o://a0c455f1e5001615f3ae562adf2c50dfedc85450fca2d21952f52a269a4af59c" gracePeriod=2 Dec 03 14:36:43 crc kubenswrapper[4751]: I1203 14:36:43.347958 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f903006b-ff42-40bb-9572-6e907afa1f0e" path="/var/lib/kubelet/pods/f903006b-ff42-40bb-9572-6e907afa1f0e/volumes" Dec 03 14:36:43 crc kubenswrapper[4751]: I1203 14:36:43.360720 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58bd69657f-28zsl" Dec 03 14:36:43 crc kubenswrapper[4751]: I1203 14:36:43.470057 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-46f56"] Dec 03 14:36:43 crc kubenswrapper[4751]: I1203 14:36:43.470479 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-46f56" podUID="0bc13bae-e4f7-49e3-9755-46e807f23efc" containerName="dnsmasq-dns" containerID="cri-o://a5d2cf4ddcb1ea926592f27cd654fa69e80a380a65308fd335d8d3df7ee85403" gracePeriod=10 Dec 03 14:36:43 crc kubenswrapper[4751]: I1203 14:36:43.768632 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tmdtc" Dec 03 14:36:43 crc kubenswrapper[4751]: I1203 14:36:43.912548 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3506627c-9636-4af8-ad7f-78db94f0ff11-catalog-content\") pod \"3506627c-9636-4af8-ad7f-78db94f0ff11\" (UID: \"3506627c-9636-4af8-ad7f-78db94f0ff11\") " Dec 03 14:36:43 crc kubenswrapper[4751]: I1203 14:36:43.912655 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq72k\" (UniqueName: \"kubernetes.io/projected/3506627c-9636-4af8-ad7f-78db94f0ff11-kube-api-access-mq72k\") pod \"3506627c-9636-4af8-ad7f-78db94f0ff11\" (UID: \"3506627c-9636-4af8-ad7f-78db94f0ff11\") " Dec 03 14:36:43 crc kubenswrapper[4751]: I1203 14:36:43.912698 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3506627c-9636-4af8-ad7f-78db94f0ff11-utilities\") pod \"3506627c-9636-4af8-ad7f-78db94f0ff11\" (UID: \"3506627c-9636-4af8-ad7f-78db94f0ff11\") " Dec 03 14:36:43 crc kubenswrapper[4751]: I1203 14:36:43.913996 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3506627c-9636-4af8-ad7f-78db94f0ff11-utilities" (OuterVolumeSpecName: "utilities") pod "3506627c-9636-4af8-ad7f-78db94f0ff11" (UID: "3506627c-9636-4af8-ad7f-78db94f0ff11"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:36:43 crc kubenswrapper[4751]: I1203 14:36:43.937306 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3506627c-9636-4af8-ad7f-78db94f0ff11-kube-api-access-mq72k" (OuterVolumeSpecName: "kube-api-access-mq72k") pod "3506627c-9636-4af8-ad7f-78db94f0ff11" (UID: "3506627c-9636-4af8-ad7f-78db94f0ff11"). InnerVolumeSpecName "kube-api-access-mq72k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:36:43 crc kubenswrapper[4751]: I1203 14:36:43.974405 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3506627c-9636-4af8-ad7f-78db94f0ff11-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3506627c-9636-4af8-ad7f-78db94f0ff11" (UID: "3506627c-9636-4af8-ad7f-78db94f0ff11"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.014596 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3506627c-9636-4af8-ad7f-78db94f0ff11-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.014859 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mq72k\" (UniqueName: \"kubernetes.io/projected/3506627c-9636-4af8-ad7f-78db94f0ff11-kube-api-access-mq72k\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.014921 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3506627c-9636-4af8-ad7f-78db94f0ff11-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.205288 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"28a23243-c107-4c51-96f0-82db8946b245","Type":"ContainerStarted","Data":"8a962ee9ad3a64e595affae5c485338e721ac6072c198360c94ac7a090b8d5f3"} Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.218783 4751 generic.go:334] "Generic (PLEG): container finished" podID="3506627c-9636-4af8-ad7f-78db94f0ff11" containerID="a0c455f1e5001615f3ae562adf2c50dfedc85450fca2d21952f52a269a4af59c" exitCode=0 Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.218842 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-tmdtc" event={"ID":"3506627c-9636-4af8-ad7f-78db94f0ff11","Type":"ContainerDied","Data":"a0c455f1e5001615f3ae562adf2c50dfedc85450fca2d21952f52a269a4af59c"} Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.218900 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmdtc" event={"ID":"3506627c-9636-4af8-ad7f-78db94f0ff11","Type":"ContainerDied","Data":"cbca48a20bd8feb1a6a8ace706988711b04270d2f425fb47789347257cd7cc36"} Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.218919 4751 scope.go:117] "RemoveContainer" containerID="a0c455f1e5001615f3ae562adf2c50dfedc85450fca2d21952f52a269a4af59c" Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.219044 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tmdtc" Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.240058 4751 generic.go:334] "Generic (PLEG): container finished" podID="0bc13bae-e4f7-49e3-9755-46e807f23efc" containerID="a5d2cf4ddcb1ea926592f27cd654fa69e80a380a65308fd335d8d3df7ee85403" exitCode=0 Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.240096 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-46f56" event={"ID":"0bc13bae-e4f7-49e3-9755-46e807f23efc","Type":"ContainerDied","Data":"a5d2cf4ddcb1ea926592f27cd654fa69e80a380a65308fd335d8d3df7ee85403"} Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.304632 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-46f56" Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.310443 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tmdtc"] Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.320884 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tmdtc"] Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.439906 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0bc13bae-e4f7-49e3-9755-46e807f23efc-ovsdbserver-nb\") pod \"0bc13bae-e4f7-49e3-9755-46e807f23efc\" (UID: \"0bc13bae-e4f7-49e3-9755-46e807f23efc\") " Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.439983 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0bc13bae-e4f7-49e3-9755-46e807f23efc-dns-svc\") pod \"0bc13bae-e4f7-49e3-9755-46e807f23efc\" (UID: \"0bc13bae-e4f7-49e3-9755-46e807f23efc\") " Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.440076 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bc13bae-e4f7-49e3-9755-46e807f23efc-config\") pod \"0bc13bae-e4f7-49e3-9755-46e807f23efc\" (UID: \"0bc13bae-e4f7-49e3-9755-46e807f23efc\") " Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.440102 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qtpz\" (UniqueName: \"kubernetes.io/projected/0bc13bae-e4f7-49e3-9755-46e807f23efc-kube-api-access-6qtpz\") pod \"0bc13bae-e4f7-49e3-9755-46e807f23efc\" (UID: \"0bc13bae-e4f7-49e3-9755-46e807f23efc\") " Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.440136 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/0bc13bae-e4f7-49e3-9755-46e807f23efc-ovsdbserver-sb\") pod \"0bc13bae-e4f7-49e3-9755-46e807f23efc\" (UID: \"0bc13bae-e4f7-49e3-9755-46e807f23efc\") " Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.440218 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0bc13bae-e4f7-49e3-9755-46e807f23efc-dns-swift-storage-0\") pod \"0bc13bae-e4f7-49e3-9755-46e807f23efc\" (UID: \"0bc13bae-e4f7-49e3-9755-46e807f23efc\") " Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.444910 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bc13bae-e4f7-49e3-9755-46e807f23efc-kube-api-access-6qtpz" (OuterVolumeSpecName: "kube-api-access-6qtpz") pod "0bc13bae-e4f7-49e3-9755-46e807f23efc" (UID: "0bc13bae-e4f7-49e3-9755-46e807f23efc"). InnerVolumeSpecName "kube-api-access-6qtpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.453263 4751 scope.go:117] "RemoveContainer" containerID="a62bce5905ebee14c320597fbf629e2c68e08d3757758c4a8a726b17934b1809" Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.492161 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bc13bae-e4f7-49e3-9755-46e807f23efc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0bc13bae-e4f7-49e3-9755-46e807f23efc" (UID: "0bc13bae-e4f7-49e3-9755-46e807f23efc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.510286 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bc13bae-e4f7-49e3-9755-46e807f23efc-config" (OuterVolumeSpecName: "config") pod "0bc13bae-e4f7-49e3-9755-46e807f23efc" (UID: "0bc13bae-e4f7-49e3-9755-46e807f23efc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.519298 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bc13bae-e4f7-49e3-9755-46e807f23efc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0bc13bae-e4f7-49e3-9755-46e807f23efc" (UID: "0bc13bae-e4f7-49e3-9755-46e807f23efc"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.526588 4751 scope.go:117] "RemoveContainer" containerID="425686ede2060d692eea11dfccf88d2d53fac4f7e72f40faed2fd89f9f25824f" Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.537928 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bc13bae-e4f7-49e3-9755-46e807f23efc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0bc13bae-e4f7-49e3-9755-46e807f23efc" (UID: "0bc13bae-e4f7-49e3-9755-46e807f23efc"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.542818 4751 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0bc13bae-e4f7-49e3-9755-46e807f23efc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.542846 4751 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0bc13bae-e4f7-49e3-9755-46e807f23efc-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.542856 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bc13bae-e4f7-49e3-9755-46e807f23efc-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.542866 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qtpz\" (UniqueName: \"kubernetes.io/projected/0bc13bae-e4f7-49e3-9755-46e807f23efc-kube-api-access-6qtpz\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.542878 4751 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0bc13bae-e4f7-49e3-9755-46e807f23efc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.551120 4751 scope.go:117] "RemoveContainer" containerID="a0c455f1e5001615f3ae562adf2c50dfedc85450fca2d21952f52a269a4af59c" Dec 03 14:36:44 crc kubenswrapper[4751]: E1203 14:36:44.554454 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0c455f1e5001615f3ae562adf2c50dfedc85450fca2d21952f52a269a4af59c\": container with ID starting with a0c455f1e5001615f3ae562adf2c50dfedc85450fca2d21952f52a269a4af59c not found: ID does not exist" 
containerID="a0c455f1e5001615f3ae562adf2c50dfedc85450fca2d21952f52a269a4af59c" Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.554495 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0c455f1e5001615f3ae562adf2c50dfedc85450fca2d21952f52a269a4af59c"} err="failed to get container status \"a0c455f1e5001615f3ae562adf2c50dfedc85450fca2d21952f52a269a4af59c\": rpc error: code = NotFound desc = could not find container \"a0c455f1e5001615f3ae562adf2c50dfedc85450fca2d21952f52a269a4af59c\": container with ID starting with a0c455f1e5001615f3ae562adf2c50dfedc85450fca2d21952f52a269a4af59c not found: ID does not exist" Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.554523 4751 scope.go:117] "RemoveContainer" containerID="a62bce5905ebee14c320597fbf629e2c68e08d3757758c4a8a726b17934b1809" Dec 03 14:36:44 crc kubenswrapper[4751]: E1203 14:36:44.554915 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a62bce5905ebee14c320597fbf629e2c68e08d3757758c4a8a726b17934b1809\": container with ID starting with a62bce5905ebee14c320597fbf629e2c68e08d3757758c4a8a726b17934b1809 not found: ID does not exist" containerID="a62bce5905ebee14c320597fbf629e2c68e08d3757758c4a8a726b17934b1809" Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.554948 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a62bce5905ebee14c320597fbf629e2c68e08d3757758c4a8a726b17934b1809"} err="failed to get container status \"a62bce5905ebee14c320597fbf629e2c68e08d3757758c4a8a726b17934b1809\": rpc error: code = NotFound desc = could not find container \"a62bce5905ebee14c320597fbf629e2c68e08d3757758c4a8a726b17934b1809\": container with ID starting with a62bce5905ebee14c320597fbf629e2c68e08d3757758c4a8a726b17934b1809 not found: ID does not exist" Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.554967 4751 scope.go:117] 
"RemoveContainer" containerID="425686ede2060d692eea11dfccf88d2d53fac4f7e72f40faed2fd89f9f25824f" Dec 03 14:36:44 crc kubenswrapper[4751]: E1203 14:36:44.555142 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"425686ede2060d692eea11dfccf88d2d53fac4f7e72f40faed2fd89f9f25824f\": container with ID starting with 425686ede2060d692eea11dfccf88d2d53fac4f7e72f40faed2fd89f9f25824f not found: ID does not exist" containerID="425686ede2060d692eea11dfccf88d2d53fac4f7e72f40faed2fd89f9f25824f" Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.555164 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"425686ede2060d692eea11dfccf88d2d53fac4f7e72f40faed2fd89f9f25824f"} err="failed to get container status \"425686ede2060d692eea11dfccf88d2d53fac4f7e72f40faed2fd89f9f25824f\": rpc error: code = NotFound desc = could not find container \"425686ede2060d692eea11dfccf88d2d53fac4f7e72f40faed2fd89f9f25824f\": container with ID starting with 425686ede2060d692eea11dfccf88d2d53fac4f7e72f40faed2fd89f9f25824f not found: ID does not exist" Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.557622 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bc13bae-e4f7-49e3-9755-46e807f23efc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0bc13bae-e4f7-49e3-9755-46e807f23efc" (UID: "0bc13bae-e4f7-49e3-9755-46e807f23efc"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:36:44 crc kubenswrapper[4751]: I1203 14:36:44.644740 4751 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0bc13bae-e4f7-49e3-9755-46e807f23efc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:45 crc kubenswrapper[4751]: E1203 14:36:45.172001 4751 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3d433ac_6236_4b80_96f5_9047aed57c8f.slice/crio-conmon-e099f63df6df1705cdfe575bd5adc863e392aac4ccd1fa4f0469f71e130facad.scope\": RecentStats: unable to find data in memory cache]" Dec 03 14:36:45 crc kubenswrapper[4751]: I1203 14:36:45.258226 4751 generic.go:334] "Generic (PLEG): container finished" podID="c3d433ac-6236-4b80-96f5-9047aed57c8f" containerID="e099f63df6df1705cdfe575bd5adc863e392aac4ccd1fa4f0469f71e130facad" exitCode=0 Dec 03 14:36:45 crc kubenswrapper[4751]: I1203 14:36:45.258316 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"c3d433ac-6236-4b80-96f5-9047aed57c8f","Type":"ContainerDied","Data":"e099f63df6df1705cdfe575bd5adc863e392aac4ccd1fa4f0469f71e130facad"} Dec 03 14:36:45 crc kubenswrapper[4751]: I1203 14:36:45.260633 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-46f56" event={"ID":"0bc13bae-e4f7-49e3-9755-46e807f23efc","Type":"ContainerDied","Data":"22d787fff3a5fccf5753b2d9579a6e6e112041d6dce988ef6107d9d65a995222"} Dec 03 14:36:45 crc kubenswrapper[4751]: I1203 14:36:45.260679 4751 scope.go:117] "RemoveContainer" containerID="a5d2cf4ddcb1ea926592f27cd654fa69e80a380a65308fd335d8d3df7ee85403" Dec 03 14:36:45 crc kubenswrapper[4751]: I1203 14:36:45.260773 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-46f56" Dec 03 14:36:45 crc kubenswrapper[4751]: I1203 14:36:45.358277 4751 scope.go:117] "RemoveContainer" containerID="5cf1b16da5e5690acf3447f4d02fe2592b436cc14a3a4560715cfb032c662bed" Dec 03 14:36:45 crc kubenswrapper[4751]: I1203 14:36:45.387178 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3506627c-9636-4af8-ad7f-78db94f0ff11" path="/var/lib/kubelet/pods/3506627c-9636-4af8-ad7f-78db94f0ff11/volumes" Dec 03 14:36:45 crc kubenswrapper[4751]: I1203 14:36:45.387893 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-46f56"] Dec 03 14:36:45 crc kubenswrapper[4751]: I1203 14:36:45.387922 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-46f56"] Dec 03 14:36:45 crc kubenswrapper[4751]: I1203 14:36:45.570441 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Dec 03 14:36:45 crc kubenswrapper[4751]: I1203 14:36:45.681457 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3d433ac-6236-4b80-96f5-9047aed57c8f-combined-ca-bundle\") pod \"c3d433ac-6236-4b80-96f5-9047aed57c8f\" (UID: \"c3d433ac-6236-4b80-96f5-9047aed57c8f\") " Dec 03 14:36:45 crc kubenswrapper[4751]: I1203 14:36:45.681848 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3d433ac-6236-4b80-96f5-9047aed57c8f-scripts\") pod \"c3d433ac-6236-4b80-96f5-9047aed57c8f\" (UID: \"c3d433ac-6236-4b80-96f5-9047aed57c8f\") " Dec 03 14:36:45 crc kubenswrapper[4751]: I1203 14:36:45.681893 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3d433ac-6236-4b80-96f5-9047aed57c8f-config-data\") pod \"c3d433ac-6236-4b80-96f5-9047aed57c8f\" 
(UID: \"c3d433ac-6236-4b80-96f5-9047aed57c8f\") " Dec 03 14:36:45 crc kubenswrapper[4751]: I1203 14:36:45.681931 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/c3d433ac-6236-4b80-96f5-9047aed57c8f-certs\") pod \"c3d433ac-6236-4b80-96f5-9047aed57c8f\" (UID: \"c3d433ac-6236-4b80-96f5-9047aed57c8f\") " Dec 03 14:36:45 crc kubenswrapper[4751]: I1203 14:36:45.681953 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3d433ac-6236-4b80-96f5-9047aed57c8f-config-data-custom\") pod \"c3d433ac-6236-4b80-96f5-9047aed57c8f\" (UID: \"c3d433ac-6236-4b80-96f5-9047aed57c8f\") " Dec 03 14:36:45 crc kubenswrapper[4751]: I1203 14:36:45.682103 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5x22k\" (UniqueName: \"kubernetes.io/projected/c3d433ac-6236-4b80-96f5-9047aed57c8f-kube-api-access-5x22k\") pod \"c3d433ac-6236-4b80-96f5-9047aed57c8f\" (UID: \"c3d433ac-6236-4b80-96f5-9047aed57c8f\") " Dec 03 14:36:45 crc kubenswrapper[4751]: I1203 14:36:45.709961 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3d433ac-6236-4b80-96f5-9047aed57c8f-certs" (OuterVolumeSpecName: "certs") pod "c3d433ac-6236-4b80-96f5-9047aed57c8f" (UID: "c3d433ac-6236-4b80-96f5-9047aed57c8f"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:36:45 crc kubenswrapper[4751]: I1203 14:36:45.715673 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3d433ac-6236-4b80-96f5-9047aed57c8f-kube-api-access-5x22k" (OuterVolumeSpecName: "kube-api-access-5x22k") pod "c3d433ac-6236-4b80-96f5-9047aed57c8f" (UID: "c3d433ac-6236-4b80-96f5-9047aed57c8f"). InnerVolumeSpecName "kube-api-access-5x22k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:36:45 crc kubenswrapper[4751]: I1203 14:36:45.715808 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3d433ac-6236-4b80-96f5-9047aed57c8f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c3d433ac-6236-4b80-96f5-9047aed57c8f" (UID: "c3d433ac-6236-4b80-96f5-9047aed57c8f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:45 crc kubenswrapper[4751]: I1203 14:36:45.716711 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3d433ac-6236-4b80-96f5-9047aed57c8f-scripts" (OuterVolumeSpecName: "scripts") pod "c3d433ac-6236-4b80-96f5-9047aed57c8f" (UID: "c3d433ac-6236-4b80-96f5-9047aed57c8f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:45 crc kubenswrapper[4751]: I1203 14:36:45.755817 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3d433ac-6236-4b80-96f5-9047aed57c8f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3d433ac-6236-4b80-96f5-9047aed57c8f" (UID: "c3d433ac-6236-4b80-96f5-9047aed57c8f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:45 crc kubenswrapper[4751]: I1203 14:36:45.782429 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3d433ac-6236-4b80-96f5-9047aed57c8f-config-data" (OuterVolumeSpecName: "config-data") pod "c3d433ac-6236-4b80-96f5-9047aed57c8f" (UID: "c3d433ac-6236-4b80-96f5-9047aed57c8f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:45 crc kubenswrapper[4751]: I1203 14:36:45.783623 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5x22k\" (UniqueName: \"kubernetes.io/projected/c3d433ac-6236-4b80-96f5-9047aed57c8f-kube-api-access-5x22k\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:45 crc kubenswrapper[4751]: I1203 14:36:45.783641 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3d433ac-6236-4b80-96f5-9047aed57c8f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:45 crc kubenswrapper[4751]: I1203 14:36:45.783651 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3d433ac-6236-4b80-96f5-9047aed57c8f-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:45 crc kubenswrapper[4751]: I1203 14:36:45.783660 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3d433ac-6236-4b80-96f5-9047aed57c8f-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:45 crc kubenswrapper[4751]: I1203 14:36:45.783668 4751 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3d433ac-6236-4b80-96f5-9047aed57c8f-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:45 crc kubenswrapper[4751]: I1203 14:36:45.783676 4751 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/c3d433ac-6236-4b80-96f5-9047aed57c8f-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.292429 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"c3d433ac-6236-4b80-96f5-9047aed57c8f","Type":"ContainerDied","Data":"30f9ff8dceb8dab824c95ceb9995dde17ef13a972500cbeb61e87695f62007e2"} Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 
14:36:46.292948 4751 scope.go:117] "RemoveContainer" containerID="e099f63df6df1705cdfe575bd5adc863e392aac4ccd1fa4f0469f71e130facad" Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.293133 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.353478 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"28a23243-c107-4c51-96f0-82db8946b245","Type":"ContainerStarted","Data":"4cd74c566e67e219ae3681f30db82709f7a4eccbfc32ff2906dff646f7d96350"} Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.386316 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.401229 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"] Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.409637 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Dec 03 14:36:46 crc kubenswrapper[4751]: E1203 14:36:46.410292 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bc13bae-e4f7-49e3-9755-46e807f23efc" containerName="init" Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.410340 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bc13bae-e4f7-49e3-9755-46e807f23efc" containerName="init" Dec 03 14:36:46 crc kubenswrapper[4751]: E1203 14:36:46.410360 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3506627c-9636-4af8-ad7f-78db94f0ff11" containerName="extract-content" Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.410368 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="3506627c-9636-4af8-ad7f-78db94f0ff11" containerName="extract-content" Dec 03 14:36:46 crc kubenswrapper[4751]: E1203 14:36:46.410410 4751 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3506627c-9636-4af8-ad7f-78db94f0ff11" containerName="extract-utilities" Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.410417 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="3506627c-9636-4af8-ad7f-78db94f0ff11" containerName="extract-utilities" Dec 03 14:36:46 crc kubenswrapper[4751]: E1203 14:36:46.410426 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3506627c-9636-4af8-ad7f-78db94f0ff11" containerName="registry-server" Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.410432 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="3506627c-9636-4af8-ad7f-78db94f0ff11" containerName="registry-server" Dec 03 14:36:46 crc kubenswrapper[4751]: E1203 14:36:46.410454 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bc13bae-e4f7-49e3-9755-46e807f23efc" containerName="dnsmasq-dns" Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.410475 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bc13bae-e4f7-49e3-9755-46e807f23efc" containerName="dnsmasq-dns" Dec 03 14:36:46 crc kubenswrapper[4751]: E1203 14:36:46.410483 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3d433ac-6236-4b80-96f5-9047aed57c8f" containerName="cloudkitty-proc" Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.410489 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3d433ac-6236-4b80-96f5-9047aed57c8f" containerName="cloudkitty-proc" Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.410750 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="3506627c-9636-4af8-ad7f-78db94f0ff11" containerName="registry-server" Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.410796 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bc13bae-e4f7-49e3-9755-46e807f23efc" containerName="dnsmasq-dns" Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.410803 4751 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c3d433ac-6236-4b80-96f5-9047aed57c8f" containerName="cloudkitty-proc" Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.411777 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.414491 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.420756 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.434143 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.434123323 podStartE2EDuration="4.434123323s" podCreationTimestamp="2025-12-03 14:36:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:36:46.387735439 +0000 UTC m=+1413.376090656" watchObservedRunningTime="2025-12-03 14:36:46.434123323 +0000 UTC m=+1413.422478540" Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.602377 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.602585 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-certs\") pod \"cloudkitty-proc-0\" (UID: \"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.602695 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.602730 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg7kj\" (UniqueName: \"kubernetes.io/projected/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-kube-api-access-cg7kj\") pod \"cloudkitty-proc-0\" (UID: \"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.602815 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-scripts\") pod \"cloudkitty-proc-0\" (UID: \"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.602874 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-config-data\") pod \"cloudkitty-proc-0\" (UID: \"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.704764 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-scripts\") pod \"cloudkitty-proc-0\" (UID: \"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.705171 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-config-data\") pod \"cloudkitty-proc-0\" (UID: \"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.705214 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.705371 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-certs\") pod \"cloudkitty-proc-0\" (UID: \"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.705461 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.705505 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg7kj\" (UniqueName: \"kubernetes.io/projected/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-kube-api-access-cg7kj\") pod \"cloudkitty-proc-0\" (UID: \"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.732531 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-certs\") pod \"cloudkitty-proc-0\" (UID: \"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d\") " pod="openstack/cloudkitty-proc-0" 
Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.732589 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.732772 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-scripts\") pod \"cloudkitty-proc-0\" (UID: \"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.733027 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg7kj\" (UniqueName: \"kubernetes.io/projected/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-kube-api-access-cg7kj\") pod \"cloudkitty-proc-0\" (UID: \"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.733159 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-config-data\") pod \"cloudkitty-proc-0\" (UID: \"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.748961 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:36:46 crc kubenswrapper[4751]: I1203 14:36:46.772164 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-cfc86c59b-x4m2l" Dec 03 14:36:46 crc 
kubenswrapper[4751]: I1203 14:36:46.799031 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-cfc86c59b-x4m2l" Dec 03 14:36:47 crc kubenswrapper[4751]: I1203 14:36:47.033798 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Dec 03 14:36:47 crc kubenswrapper[4751]: I1203 14:36:47.332857 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bc13bae-e4f7-49e3-9755-46e807f23efc" path="/var/lib/kubelet/pods/0bc13bae-e4f7-49e3-9755-46e807f23efc/volumes" Dec 03 14:36:47 crc kubenswrapper[4751]: I1203 14:36:47.333851 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3d433ac-6236-4b80-96f5-9047aed57c8f" path="/var/lib/kubelet/pods/c3d433ac-6236-4b80-96f5-9047aed57c8f/volumes" Dec 03 14:36:47 crc kubenswrapper[4751]: I1203 14:36:47.508263 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Dec 03 14:36:47 crc kubenswrapper[4751]: I1203 14:36:47.614578 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 03 14:36:47 crc kubenswrapper[4751]: I1203 14:36:47.742939 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7f9cb9cd-blmhg" Dec 03 14:36:48 crc kubenswrapper[4751]: I1203 14:36:48.388071 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d","Type":"ContainerStarted","Data":"9902f8a9bdb54179cac523b0ab0c5b1136aa809f84825250647d3c33bdeeb0a7"} Dec 03 14:36:48 crc kubenswrapper[4751]: I1203 14:36:48.388468 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d","Type":"ContainerStarted","Data":"95c6e6d638153f5d5ddedd3c60eee77f1daa3c1f711393ca7777ae2a2a238be6"} Dec 03 14:36:48 crc kubenswrapper[4751]: I1203 
14:36:48.410679 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.410654288 podStartE2EDuration="2.410654288s" podCreationTimestamp="2025-12-03 14:36:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:36:48.404618674 +0000 UTC m=+1415.392973891" watchObservedRunningTime="2025-12-03 14:36:48.410654288 +0000 UTC m=+1415.399009505" Dec 03 14:36:50 crc kubenswrapper[4751]: I1203 14:36:50.424663 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 03 14:36:50 crc kubenswrapper[4751]: I1203 14:36:50.426691 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 14:36:50 crc kubenswrapper[4751]: I1203 14:36:50.429892 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 03 14:36:50 crc kubenswrapper[4751]: I1203 14:36:50.429916 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 03 14:36:50 crc kubenswrapper[4751]: I1203 14:36:50.433083 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-lgcj8" Dec 03 14:36:50 crc kubenswrapper[4751]: I1203 14:36:50.448081 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 14:36:50 crc kubenswrapper[4751]: I1203 14:36:50.522610 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b19ff83-ee1a-4c2d-bf5c-832ef5800519-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2b19ff83-ee1a-4c2d-bf5c-832ef5800519\") " pod="openstack/openstackclient" Dec 03 14:36:50 crc kubenswrapper[4751]: I1203 14:36:50.523034 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2b19ff83-ee1a-4c2d-bf5c-832ef5800519-openstack-config\") pod \"openstackclient\" (UID: \"2b19ff83-ee1a-4c2d-bf5c-832ef5800519\") " pod="openstack/openstackclient" Dec 03 14:36:50 crc kubenswrapper[4751]: I1203 14:36:50.523131 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7szb9\" (UniqueName: \"kubernetes.io/projected/2b19ff83-ee1a-4c2d-bf5c-832ef5800519-kube-api-access-7szb9\") pod \"openstackclient\" (UID: \"2b19ff83-ee1a-4c2d-bf5c-832ef5800519\") " pod="openstack/openstackclient" Dec 03 14:36:50 crc kubenswrapper[4751]: I1203 14:36:50.523294 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2b19ff83-ee1a-4c2d-bf5c-832ef5800519-openstack-config-secret\") pod \"openstackclient\" (UID: \"2b19ff83-ee1a-4c2d-bf5c-832ef5800519\") " pod="openstack/openstackclient" Dec 03 14:36:50 crc kubenswrapper[4751]: I1203 14:36:50.624753 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2b19ff83-ee1a-4c2d-bf5c-832ef5800519-openstack-config\") pod \"openstackclient\" (UID: \"2b19ff83-ee1a-4c2d-bf5c-832ef5800519\") " pod="openstack/openstackclient" Dec 03 14:36:50 crc kubenswrapper[4751]: I1203 14:36:50.625946 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7szb9\" (UniqueName: \"kubernetes.io/projected/2b19ff83-ee1a-4c2d-bf5c-832ef5800519-kube-api-access-7szb9\") pod \"openstackclient\" (UID: \"2b19ff83-ee1a-4c2d-bf5c-832ef5800519\") " pod="openstack/openstackclient" Dec 03 14:36:50 crc kubenswrapper[4751]: I1203 14:36:50.626232 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2b19ff83-ee1a-4c2d-bf5c-832ef5800519-openstack-config-secret\") pod \"openstackclient\" (UID: \"2b19ff83-ee1a-4c2d-bf5c-832ef5800519\") " pod="openstack/openstackclient" Dec 03 14:36:50 crc kubenswrapper[4751]: I1203 14:36:50.627513 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b19ff83-ee1a-4c2d-bf5c-832ef5800519-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2b19ff83-ee1a-4c2d-bf5c-832ef5800519\") " pod="openstack/openstackclient" Dec 03 14:36:50 crc kubenswrapper[4751]: I1203 14:36:50.625886 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2b19ff83-ee1a-4c2d-bf5c-832ef5800519-openstack-config\") pod \"openstackclient\" (UID: \"2b19ff83-ee1a-4c2d-bf5c-832ef5800519\") " pod="openstack/openstackclient" Dec 03 14:36:50 crc kubenswrapper[4751]: I1203 14:36:50.634081 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2b19ff83-ee1a-4c2d-bf5c-832ef5800519-openstack-config-secret\") pod \"openstackclient\" (UID: \"2b19ff83-ee1a-4c2d-bf5c-832ef5800519\") " pod="openstack/openstackclient" Dec 03 14:36:50 crc kubenswrapper[4751]: I1203 14:36:50.644197 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b19ff83-ee1a-4c2d-bf5c-832ef5800519-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2b19ff83-ee1a-4c2d-bf5c-832ef5800519\") " pod="openstack/openstackclient" Dec 03 14:36:50 crc kubenswrapper[4751]: I1203 14:36:50.654993 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7szb9\" (UniqueName: \"kubernetes.io/projected/2b19ff83-ee1a-4c2d-bf5c-832ef5800519-kube-api-access-7szb9\") pod \"openstackclient\" (UID: 
\"2b19ff83-ee1a-4c2d-bf5c-832ef5800519\") " pod="openstack/openstackclient" Dec 03 14:36:50 crc kubenswrapper[4751]: I1203 14:36:50.755598 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 14:36:50 crc kubenswrapper[4751]: I1203 14:36:50.789796 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 03 14:36:50 crc kubenswrapper[4751]: I1203 14:36:50.798144 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 03 14:36:50 crc kubenswrapper[4751]: I1203 14:36:50.849728 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 03 14:36:50 crc kubenswrapper[4751]: I1203 14:36:50.851121 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 14:36:50 crc kubenswrapper[4751]: I1203 14:36:50.880385 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 14:36:50 crc kubenswrapper[4751]: E1203 14:36:50.910400 4751 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 03 14:36:50 crc kubenswrapper[4751]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_2b19ff83-ee1a-4c2d-bf5c-832ef5800519_0(7602ea67f61ee9dde1ffd5e05c0a4b21a916fa88bd0db239d303fc73e94dce49): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"7602ea67f61ee9dde1ffd5e05c0a4b21a916fa88bd0db239d303fc73e94dce49" Netns:"/var/run/netns/a6f69560-448e-4d34-8eb9-a6f7cb644e7d" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=7602ea67f61ee9dde1ffd5e05c0a4b21a916fa88bd0db239d303fc73e94dce49;K8S_POD_UID=2b19ff83-ee1a-4c2d-bf5c-832ef5800519" Path:"" ERRORED: error 
configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/2b19ff83-ee1a-4c2d-bf5c-832ef5800519]: expected pod UID "2b19ff83-ee1a-4c2d-bf5c-832ef5800519" but got "033a5c7c-11ef-4610-ac41-aa8471a9f0b4" from Kube API Dec 03 14:36:50 crc kubenswrapper[4751]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 14:36:50 crc kubenswrapper[4751]: > Dec 03 14:36:50 crc kubenswrapper[4751]: E1203 14:36:50.910456 4751 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 03 14:36:50 crc kubenswrapper[4751]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_2b19ff83-ee1a-4c2d-bf5c-832ef5800519_0(7602ea67f61ee9dde1ffd5e05c0a4b21a916fa88bd0db239d303fc73e94dce49): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"7602ea67f61ee9dde1ffd5e05c0a4b21a916fa88bd0db239d303fc73e94dce49" Netns:"/var/run/netns/a6f69560-448e-4d34-8eb9-a6f7cb644e7d" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=7602ea67f61ee9dde1ffd5e05c0a4b21a916fa88bd0db239d303fc73e94dce49;K8S_POD_UID=2b19ff83-ee1a-4c2d-bf5c-832ef5800519" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/2b19ff83-ee1a-4c2d-bf5c-832ef5800519]: expected pod UID "2b19ff83-ee1a-4c2d-bf5c-832ef5800519" but got "033a5c7c-11ef-4610-ac41-aa8471a9f0b4" from Kube API Dec 03 14:36:50 crc kubenswrapper[4751]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 14:36:50 crc kubenswrapper[4751]: > pod="openstack/openstackclient" Dec 03 14:36:50 crc kubenswrapper[4751]: I1203 14:36:50.933678 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfjw7\" (UniqueName: \"kubernetes.io/projected/033a5c7c-11ef-4610-ac41-aa8471a9f0b4-kube-api-access-qfjw7\") pod \"openstackclient\" (UID: \"033a5c7c-11ef-4610-ac41-aa8471a9f0b4\") " pod="openstack/openstackclient" Dec 03 14:36:50 crc kubenswrapper[4751]: I1203 14:36:50.933724 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/033a5c7c-11ef-4610-ac41-aa8471a9f0b4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"033a5c7c-11ef-4610-ac41-aa8471a9f0b4\") " pod="openstack/openstackclient" Dec 03 14:36:50 crc kubenswrapper[4751]: I1203 14:36:50.933930 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/033a5c7c-11ef-4610-ac41-aa8471a9f0b4-openstack-config\") pod \"openstackclient\" (UID: \"033a5c7c-11ef-4610-ac41-aa8471a9f0b4\") " pod="openstack/openstackclient" Dec 03 14:36:50 crc kubenswrapper[4751]: I1203 14:36:50.933968 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/033a5c7c-11ef-4610-ac41-aa8471a9f0b4-openstack-config-secret\") pod \"openstackclient\" (UID: \"033a5c7c-11ef-4610-ac41-aa8471a9f0b4\") " 
pod="openstack/openstackclient" Dec 03 14:36:51 crc kubenswrapper[4751]: I1203 14:36:51.035550 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/033a5c7c-11ef-4610-ac41-aa8471a9f0b4-openstack-config\") pod \"openstackclient\" (UID: \"033a5c7c-11ef-4610-ac41-aa8471a9f0b4\") " pod="openstack/openstackclient" Dec 03 14:36:51 crc kubenswrapper[4751]: I1203 14:36:51.035620 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/033a5c7c-11ef-4610-ac41-aa8471a9f0b4-openstack-config-secret\") pod \"openstackclient\" (UID: \"033a5c7c-11ef-4610-ac41-aa8471a9f0b4\") " pod="openstack/openstackclient" Dec 03 14:36:51 crc kubenswrapper[4751]: I1203 14:36:51.035670 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfjw7\" (UniqueName: \"kubernetes.io/projected/033a5c7c-11ef-4610-ac41-aa8471a9f0b4-kube-api-access-qfjw7\") pod \"openstackclient\" (UID: \"033a5c7c-11ef-4610-ac41-aa8471a9f0b4\") " pod="openstack/openstackclient" Dec 03 14:36:51 crc kubenswrapper[4751]: I1203 14:36:51.035695 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/033a5c7c-11ef-4610-ac41-aa8471a9f0b4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"033a5c7c-11ef-4610-ac41-aa8471a9f0b4\") " pod="openstack/openstackclient" Dec 03 14:36:51 crc kubenswrapper[4751]: I1203 14:36:51.037021 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/033a5c7c-11ef-4610-ac41-aa8471a9f0b4-openstack-config\") pod \"openstackclient\" (UID: \"033a5c7c-11ef-4610-ac41-aa8471a9f0b4\") " pod="openstack/openstackclient" Dec 03 14:36:51 crc kubenswrapper[4751]: I1203 14:36:51.040635 4751 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/033a5c7c-11ef-4610-ac41-aa8471a9f0b4-openstack-config-secret\") pod \"openstackclient\" (UID: \"033a5c7c-11ef-4610-ac41-aa8471a9f0b4\") " pod="openstack/openstackclient" Dec 03 14:36:51 crc kubenswrapper[4751]: I1203 14:36:51.041134 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/033a5c7c-11ef-4610-ac41-aa8471a9f0b4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"033a5c7c-11ef-4610-ac41-aa8471a9f0b4\") " pod="openstack/openstackclient" Dec 03 14:36:51 crc kubenswrapper[4751]: I1203 14:36:51.057021 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfjw7\" (UniqueName: \"kubernetes.io/projected/033a5c7c-11ef-4610-ac41-aa8471a9f0b4-kube-api-access-qfjw7\") pod \"openstackclient\" (UID: \"033a5c7c-11ef-4610-ac41-aa8471a9f0b4\") " pod="openstack/openstackclient" Dec 03 14:36:51 crc kubenswrapper[4751]: I1203 14:36:51.264848 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 14:36:51 crc kubenswrapper[4751]: I1203 14:36:51.432811 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 14:36:51 crc kubenswrapper[4751]: I1203 14:36:51.447120 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 03 14:36:51 crc kubenswrapper[4751]: I1203 14:36:51.451412 4751 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="2b19ff83-ee1a-4c2d-bf5c-832ef5800519" podUID="033a5c7c-11ef-4610-ac41-aa8471a9f0b4" Dec 03 14:36:51 crc kubenswrapper[4751]: I1203 14:36:51.545456 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2b19ff83-ee1a-4c2d-bf5c-832ef5800519-openstack-config\") pod \"2b19ff83-ee1a-4c2d-bf5c-832ef5800519\" (UID: \"2b19ff83-ee1a-4c2d-bf5c-832ef5800519\") " Dec 03 14:36:51 crc kubenswrapper[4751]: I1203 14:36:51.545637 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7szb9\" (UniqueName: \"kubernetes.io/projected/2b19ff83-ee1a-4c2d-bf5c-832ef5800519-kube-api-access-7szb9\") pod \"2b19ff83-ee1a-4c2d-bf5c-832ef5800519\" (UID: \"2b19ff83-ee1a-4c2d-bf5c-832ef5800519\") " Dec 03 14:36:51 crc kubenswrapper[4751]: I1203 14:36:51.545793 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2b19ff83-ee1a-4c2d-bf5c-832ef5800519-openstack-config-secret\") pod \"2b19ff83-ee1a-4c2d-bf5c-832ef5800519\" (UID: \"2b19ff83-ee1a-4c2d-bf5c-832ef5800519\") " Dec 03 14:36:51 crc kubenswrapper[4751]: I1203 14:36:51.545818 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b19ff83-ee1a-4c2d-bf5c-832ef5800519-combined-ca-bundle\") pod \"2b19ff83-ee1a-4c2d-bf5c-832ef5800519\" (UID: \"2b19ff83-ee1a-4c2d-bf5c-832ef5800519\") " Dec 03 14:36:51 crc kubenswrapper[4751]: I1203 14:36:51.547749 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/2b19ff83-ee1a-4c2d-bf5c-832ef5800519-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "2b19ff83-ee1a-4c2d-bf5c-832ef5800519" (UID: "2b19ff83-ee1a-4c2d-bf5c-832ef5800519"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:36:51 crc kubenswrapper[4751]: I1203 14:36:51.550613 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b19ff83-ee1a-4c2d-bf5c-832ef5800519-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b19ff83-ee1a-4c2d-bf5c-832ef5800519" (UID: "2b19ff83-ee1a-4c2d-bf5c-832ef5800519"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:51 crc kubenswrapper[4751]: I1203 14:36:51.551528 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b19ff83-ee1a-4c2d-bf5c-832ef5800519-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "2b19ff83-ee1a-4c2d-bf5c-832ef5800519" (UID: "2b19ff83-ee1a-4c2d-bf5c-832ef5800519"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:36:51 crc kubenswrapper[4751]: I1203 14:36:51.554625 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b19ff83-ee1a-4c2d-bf5c-832ef5800519-kube-api-access-7szb9" (OuterVolumeSpecName: "kube-api-access-7szb9") pod "2b19ff83-ee1a-4c2d-bf5c-832ef5800519" (UID: "2b19ff83-ee1a-4c2d-bf5c-832ef5800519"). InnerVolumeSpecName "kube-api-access-7szb9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:36:51 crc kubenswrapper[4751]: I1203 14:36:51.647901 4751 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2b19ff83-ee1a-4c2d-bf5c-832ef5800519-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:51 crc kubenswrapper[4751]: I1203 14:36:51.647933 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b19ff83-ee1a-4c2d-bf5c-832ef5800519-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:51 crc kubenswrapper[4751]: I1203 14:36:51.647942 4751 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2b19ff83-ee1a-4c2d-bf5c-832ef5800519-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:51 crc kubenswrapper[4751]: I1203 14:36:51.647949 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7szb9\" (UniqueName: \"kubernetes.io/projected/2b19ff83-ee1a-4c2d-bf5c-832ef5800519-kube-api-access-7szb9\") on node \"crc\" DevicePath \"\"" Dec 03 14:36:51 crc kubenswrapper[4751]: I1203 14:36:51.719277 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 14:36:52 crc kubenswrapper[4751]: I1203 14:36:52.447867 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 03 14:36:52 crc kubenswrapper[4751]: I1203 14:36:52.450422 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"033a5c7c-11ef-4610-ac41-aa8471a9f0b4","Type":"ContainerStarted","Data":"2f704472135f9186018162e8d1e671159bca26b98b96263551bc15f6cdb26bfb"} Dec 03 14:36:52 crc kubenswrapper[4751]: I1203 14:36:52.463450 4751 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="2b19ff83-ee1a-4c2d-bf5c-832ef5800519" podUID="033a5c7c-11ef-4610-ac41-aa8471a9f0b4" Dec 03 14:36:52 crc kubenswrapper[4751]: I1203 14:36:52.857987 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 03 14:36:53 crc kubenswrapper[4751]: I1203 14:36:53.328052 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b19ff83-ee1a-4c2d-bf5c-832ef5800519" path="/var/lib/kubelet/pods/2b19ff83-ee1a-4c2d-bf5c-832ef5800519/volumes" Dec 03 14:36:55 crc kubenswrapper[4751]: I1203 14:36:55.333922 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-84ff798d87-5c96l"] Dec 03 14:36:55 crc kubenswrapper[4751]: I1203 14:36:55.335782 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-84ff798d87-5c96l" Dec 03 14:36:55 crc kubenswrapper[4751]: I1203 14:36:55.337737 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 03 14:36:55 crc kubenswrapper[4751]: I1203 14:36:55.337884 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 03 14:36:55 crc kubenswrapper[4751]: I1203 14:36:55.342965 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 03 14:36:55 crc kubenswrapper[4751]: I1203 14:36:55.355991 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-84ff798d87-5c96l"] Dec 03 14:36:55 crc kubenswrapper[4751]: I1203 14:36:55.427383 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c7e0fc7-03ed-4002-b460-df87d151f563-log-httpd\") pod \"swift-proxy-84ff798d87-5c96l\" (UID: \"9c7e0fc7-03ed-4002-b460-df87d151f563\") " pod="openstack/swift-proxy-84ff798d87-5c96l" Dec 03 14:36:55 crc kubenswrapper[4751]: I1203 14:36:55.427790 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c7e0fc7-03ed-4002-b460-df87d151f563-config-data\") pod \"swift-proxy-84ff798d87-5c96l\" (UID: \"9c7e0fc7-03ed-4002-b460-df87d151f563\") " pod="openstack/swift-proxy-84ff798d87-5c96l" Dec 03 14:36:55 crc kubenswrapper[4751]: I1203 14:36:55.428021 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzrgd\" (UniqueName: \"kubernetes.io/projected/9c7e0fc7-03ed-4002-b460-df87d151f563-kube-api-access-kzrgd\") pod \"swift-proxy-84ff798d87-5c96l\" (UID: \"9c7e0fc7-03ed-4002-b460-df87d151f563\") " pod="openstack/swift-proxy-84ff798d87-5c96l" Dec 03 14:36:55 crc kubenswrapper[4751]: 
I1203 14:36:55.428242 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9c7e0fc7-03ed-4002-b460-df87d151f563-etc-swift\") pod \"swift-proxy-84ff798d87-5c96l\" (UID: \"9c7e0fc7-03ed-4002-b460-df87d151f563\") " pod="openstack/swift-proxy-84ff798d87-5c96l" Dec 03 14:36:55 crc kubenswrapper[4751]: I1203 14:36:55.428371 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c7e0fc7-03ed-4002-b460-df87d151f563-internal-tls-certs\") pod \"swift-proxy-84ff798d87-5c96l\" (UID: \"9c7e0fc7-03ed-4002-b460-df87d151f563\") " pod="openstack/swift-proxy-84ff798d87-5c96l" Dec 03 14:36:55 crc kubenswrapper[4751]: I1203 14:36:55.428413 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c7e0fc7-03ed-4002-b460-df87d151f563-combined-ca-bundle\") pod \"swift-proxy-84ff798d87-5c96l\" (UID: \"9c7e0fc7-03ed-4002-b460-df87d151f563\") " pod="openstack/swift-proxy-84ff798d87-5c96l" Dec 03 14:36:55 crc kubenswrapper[4751]: I1203 14:36:55.428441 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c7e0fc7-03ed-4002-b460-df87d151f563-run-httpd\") pod \"swift-proxy-84ff798d87-5c96l\" (UID: \"9c7e0fc7-03ed-4002-b460-df87d151f563\") " pod="openstack/swift-proxy-84ff798d87-5c96l" Dec 03 14:36:55 crc kubenswrapper[4751]: I1203 14:36:55.428460 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c7e0fc7-03ed-4002-b460-df87d151f563-public-tls-certs\") pod \"swift-proxy-84ff798d87-5c96l\" (UID: \"9c7e0fc7-03ed-4002-b460-df87d151f563\") " pod="openstack/swift-proxy-84ff798d87-5c96l" Dec 03 
14:36:55 crc kubenswrapper[4751]: I1203 14:36:55.530378 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c7e0fc7-03ed-4002-b460-df87d151f563-config-data\") pod \"swift-proxy-84ff798d87-5c96l\" (UID: \"9c7e0fc7-03ed-4002-b460-df87d151f563\") " pod="openstack/swift-proxy-84ff798d87-5c96l" Dec 03 14:36:55 crc kubenswrapper[4751]: I1203 14:36:55.530924 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzrgd\" (UniqueName: \"kubernetes.io/projected/9c7e0fc7-03ed-4002-b460-df87d151f563-kube-api-access-kzrgd\") pod \"swift-proxy-84ff798d87-5c96l\" (UID: \"9c7e0fc7-03ed-4002-b460-df87d151f563\") " pod="openstack/swift-proxy-84ff798d87-5c96l" Dec 03 14:36:55 crc kubenswrapper[4751]: I1203 14:36:55.531069 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9c7e0fc7-03ed-4002-b460-df87d151f563-etc-swift\") pod \"swift-proxy-84ff798d87-5c96l\" (UID: \"9c7e0fc7-03ed-4002-b460-df87d151f563\") " pod="openstack/swift-proxy-84ff798d87-5c96l" Dec 03 14:36:55 crc kubenswrapper[4751]: I1203 14:36:55.531174 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c7e0fc7-03ed-4002-b460-df87d151f563-internal-tls-certs\") pod \"swift-proxy-84ff798d87-5c96l\" (UID: \"9c7e0fc7-03ed-4002-b460-df87d151f563\") " pod="openstack/swift-proxy-84ff798d87-5c96l" Dec 03 14:36:55 crc kubenswrapper[4751]: I1203 14:36:55.531246 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c7e0fc7-03ed-4002-b460-df87d151f563-combined-ca-bundle\") pod \"swift-proxy-84ff798d87-5c96l\" (UID: \"9c7e0fc7-03ed-4002-b460-df87d151f563\") " pod="openstack/swift-proxy-84ff798d87-5c96l" Dec 03 14:36:55 crc kubenswrapper[4751]: I1203 
14:36:55.531315 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c7e0fc7-03ed-4002-b460-df87d151f563-run-httpd\") pod \"swift-proxy-84ff798d87-5c96l\" (UID: \"9c7e0fc7-03ed-4002-b460-df87d151f563\") " pod="openstack/swift-proxy-84ff798d87-5c96l" Dec 03 14:36:55 crc kubenswrapper[4751]: I1203 14:36:55.531400 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c7e0fc7-03ed-4002-b460-df87d151f563-public-tls-certs\") pod \"swift-proxy-84ff798d87-5c96l\" (UID: \"9c7e0fc7-03ed-4002-b460-df87d151f563\") " pod="openstack/swift-proxy-84ff798d87-5c96l" Dec 03 14:36:55 crc kubenswrapper[4751]: I1203 14:36:55.531514 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c7e0fc7-03ed-4002-b460-df87d151f563-log-httpd\") pod \"swift-proxy-84ff798d87-5c96l\" (UID: \"9c7e0fc7-03ed-4002-b460-df87d151f563\") " pod="openstack/swift-proxy-84ff798d87-5c96l" Dec 03 14:36:55 crc kubenswrapper[4751]: I1203 14:36:55.532004 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c7e0fc7-03ed-4002-b460-df87d151f563-log-httpd\") pod \"swift-proxy-84ff798d87-5c96l\" (UID: \"9c7e0fc7-03ed-4002-b460-df87d151f563\") " pod="openstack/swift-proxy-84ff798d87-5c96l" Dec 03 14:36:55 crc kubenswrapper[4751]: I1203 14:36:55.532625 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c7e0fc7-03ed-4002-b460-df87d151f563-run-httpd\") pod \"swift-proxy-84ff798d87-5c96l\" (UID: \"9c7e0fc7-03ed-4002-b460-df87d151f563\") " pod="openstack/swift-proxy-84ff798d87-5c96l" Dec 03 14:36:55 crc kubenswrapper[4751]: I1203 14:36:55.537586 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/9c7e0fc7-03ed-4002-b460-df87d151f563-public-tls-certs\") pod \"swift-proxy-84ff798d87-5c96l\" (UID: \"9c7e0fc7-03ed-4002-b460-df87d151f563\") " pod="openstack/swift-proxy-84ff798d87-5c96l" Dec 03 14:36:55 crc kubenswrapper[4751]: I1203 14:36:55.538037 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c7e0fc7-03ed-4002-b460-df87d151f563-combined-ca-bundle\") pod \"swift-proxy-84ff798d87-5c96l\" (UID: \"9c7e0fc7-03ed-4002-b460-df87d151f563\") " pod="openstack/swift-proxy-84ff798d87-5c96l" Dec 03 14:36:55 crc kubenswrapper[4751]: I1203 14:36:55.538038 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9c7e0fc7-03ed-4002-b460-df87d151f563-etc-swift\") pod \"swift-proxy-84ff798d87-5c96l\" (UID: \"9c7e0fc7-03ed-4002-b460-df87d151f563\") " pod="openstack/swift-proxy-84ff798d87-5c96l" Dec 03 14:36:55 crc kubenswrapper[4751]: I1203 14:36:55.538552 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c7e0fc7-03ed-4002-b460-df87d151f563-internal-tls-certs\") pod \"swift-proxy-84ff798d87-5c96l\" (UID: \"9c7e0fc7-03ed-4002-b460-df87d151f563\") " pod="openstack/swift-proxy-84ff798d87-5c96l" Dec 03 14:36:55 crc kubenswrapper[4751]: I1203 14:36:55.543260 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c7e0fc7-03ed-4002-b460-df87d151f563-config-data\") pod \"swift-proxy-84ff798d87-5c96l\" (UID: \"9c7e0fc7-03ed-4002-b460-df87d151f563\") " pod="openstack/swift-proxy-84ff798d87-5c96l" Dec 03 14:36:55 crc kubenswrapper[4751]: I1203 14:36:55.548887 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzrgd\" (UniqueName: 
\"kubernetes.io/projected/9c7e0fc7-03ed-4002-b460-df87d151f563-kube-api-access-kzrgd\") pod \"swift-proxy-84ff798d87-5c96l\" (UID: \"9c7e0fc7-03ed-4002-b460-df87d151f563\") " pod="openstack/swift-proxy-84ff798d87-5c96l" Dec 03 14:36:55 crc kubenswrapper[4751]: I1203 14:36:55.655599 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-84ff798d87-5c96l" Dec 03 14:36:56 crc kubenswrapper[4751]: I1203 14:36:56.243887 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-84ff798d87-5c96l"] Dec 03 14:36:58 crc kubenswrapper[4751]: I1203 14:36:58.084809 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:36:58 crc kubenswrapper[4751]: I1203 14:36:58.087482 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0cff45f-8db2-44d7-9723-677508c4c442" containerName="ceilometer-central-agent" containerID="cri-o://140d4f247f184cab56abb6c9a63b668e98b54b136f9c22a120bc01761693764f" gracePeriod=30 Dec 03 14:36:58 crc kubenswrapper[4751]: I1203 14:36:58.087871 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0cff45f-8db2-44d7-9723-677508c4c442" containerName="proxy-httpd" containerID="cri-o://474b632ca34764eba6a79153a2f8678f814f73937e091cbc22f1cda425c0974e" gracePeriod=30 Dec 03 14:36:58 crc kubenswrapper[4751]: I1203 14:36:58.087958 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0cff45f-8db2-44d7-9723-677508c4c442" containerName="sg-core" containerID="cri-o://d68df586dcf7a42d897b5e6a35592b0dc3b7f1bcdbf2920ccf45275ac032d959" gracePeriod=30 Dec 03 14:36:58 crc kubenswrapper[4751]: I1203 14:36:58.088016 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0cff45f-8db2-44d7-9723-677508c4c442" 
containerName="ceilometer-notification-agent" containerID="cri-o://24003421e57923c6199a8e1dfb412c6d702c4649da37c29574a520e9fa6f3964" gracePeriod=30 Dec 03 14:36:58 crc kubenswrapper[4751]: I1203 14:36:58.189160 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="b0cff45f-8db2-44d7-9723-677508c4c442" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.189:3000/\": read tcp 10.217.0.2:52892->10.217.0.189:3000: read: connection reset by peer" Dec 03 14:36:58 crc kubenswrapper[4751]: I1203 14:36:58.511580 4751 generic.go:334] "Generic (PLEG): container finished" podID="b0cff45f-8db2-44d7-9723-677508c4c442" containerID="474b632ca34764eba6a79153a2f8678f814f73937e091cbc22f1cda425c0974e" exitCode=0 Dec 03 14:36:58 crc kubenswrapper[4751]: I1203 14:36:58.511615 4751 generic.go:334] "Generic (PLEG): container finished" podID="b0cff45f-8db2-44d7-9723-677508c4c442" containerID="d68df586dcf7a42d897b5e6a35592b0dc3b7f1bcdbf2920ccf45275ac032d959" exitCode=2 Dec 03 14:36:58 crc kubenswrapper[4751]: I1203 14:36:58.511635 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0cff45f-8db2-44d7-9723-677508c4c442","Type":"ContainerDied","Data":"474b632ca34764eba6a79153a2f8678f814f73937e091cbc22f1cda425c0974e"} Dec 03 14:36:58 crc kubenswrapper[4751]: I1203 14:36:58.511660 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0cff45f-8db2-44d7-9723-677508c4c442","Type":"ContainerDied","Data":"d68df586dcf7a42d897b5e6a35592b0dc3b7f1bcdbf2920ccf45275ac032d959"} Dec 03 14:36:59 crc kubenswrapper[4751]: I1203 14:36:59.525533 4751 generic.go:334] "Generic (PLEG): container finished" podID="b0cff45f-8db2-44d7-9723-677508c4c442" containerID="140d4f247f184cab56abb6c9a63b668e98b54b136f9c22a120bc01761693764f" exitCode=0 Dec 03 14:36:59 crc kubenswrapper[4751]: I1203 14:36:59.525614 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"b0cff45f-8db2-44d7-9723-677508c4c442","Type":"ContainerDied","Data":"140d4f247f184cab56abb6c9a63b668e98b54b136f9c22a120bc01761693764f"} Dec 03 14:37:01 crc kubenswrapper[4751]: W1203 14:37:01.052913 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c7e0fc7_03ed_4002_b460_df87d151f563.slice/crio-83a8a5caf38c43d566653e58e6312d7c475cd2d797a5a1d4c839548b940c57de WatchSource:0}: Error finding container 83a8a5caf38c43d566653e58e6312d7c475cd2d797a5a1d4c839548b940c57de: Status 404 returned error can't find the container with id 83a8a5caf38c43d566653e58e6312d7c475cd2d797a5a1d4c839548b940c57de Dec 03 14:37:01 crc kubenswrapper[4751]: I1203 14:37:01.551417 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"033a5c7c-11ef-4610-ac41-aa8471a9f0b4","Type":"ContainerStarted","Data":"826f3c5fe83fb842313b18006a37244161626303046a84b44c16692f65156414"} Dec 03 14:37:01 crc kubenswrapper[4751]: I1203 14:37:01.554169 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-84ff798d87-5c96l" event={"ID":"9c7e0fc7-03ed-4002-b460-df87d151f563","Type":"ContainerStarted","Data":"c873c3b2fa23ac4656d01775de2756fd4ca6a4f059f571448cdd6a86445fcb5b"} Dec 03 14:37:01 crc kubenswrapper[4751]: I1203 14:37:01.554218 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-84ff798d87-5c96l" event={"ID":"9c7e0fc7-03ed-4002-b460-df87d151f563","Type":"ContainerStarted","Data":"83a8a5caf38c43d566653e58e6312d7c475cd2d797a5a1d4c839548b940c57de"} Dec 03 14:37:01 crc kubenswrapper[4751]: I1203 14:37:01.575696 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.150736894 podStartE2EDuration="11.575673004s" podCreationTimestamp="2025-12-03 14:36:50 +0000 UTC" firstStartedPulling="2025-12-03 14:36:51.73208856 +0000 UTC 
m=+1418.720443777" lastFinishedPulling="2025-12-03 14:37:01.15702467 +0000 UTC m=+1428.145379887" observedRunningTime="2025-12-03 14:37:01.562805922 +0000 UTC m=+1428.551161139" watchObservedRunningTime="2025-12-03 14:37:01.575673004 +0000 UTC m=+1428.564028231" Dec 03 14:37:02 crc kubenswrapper[4751]: I1203 14:37:02.566281 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-84ff798d87-5c96l" event={"ID":"9c7e0fc7-03ed-4002-b460-df87d151f563","Type":"ContainerStarted","Data":"59a052b76ac4ef684c64025c2b1cf35c4cd982a1956e8a64a49c9760e5154fe0"} Dec 03 14:37:02 crc kubenswrapper[4751]: I1203 14:37:02.566829 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-84ff798d87-5c96l" Dec 03 14:37:02 crc kubenswrapper[4751]: I1203 14:37:02.566858 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-84ff798d87-5c96l" Dec 03 14:37:02 crc kubenswrapper[4751]: I1203 14:37:02.594394 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-84ff798d87-5c96l" podStartSLOduration=7.594371993 podStartE2EDuration="7.594371993s" podCreationTimestamp="2025-12-03 14:36:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:37:02.586406492 +0000 UTC m=+1429.574761709" watchObservedRunningTime="2025-12-03 14:37:02.594371993 +0000 UTC m=+1429.582727210" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.395723 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.424836 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2249aa73-cbf1-4a6b-a893-d2242c236c6d-config-data-custom\") pod \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\" (UID: \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\") " Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.424904 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2249aa73-cbf1-4a6b-a893-d2242c236c6d-combined-ca-bundle\") pod \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\" (UID: \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\") " Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.424944 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2249aa73-cbf1-4a6b-a893-d2242c236c6d-config-data\") pod \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\" (UID: \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\") " Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.424974 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2249aa73-cbf1-4a6b-a893-d2242c236c6d-scripts\") pod \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\" (UID: \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\") " Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.425291 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2249aa73-cbf1-4a6b-a893-d2242c236c6d-logs\") pod \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\" (UID: \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\") " Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.425376 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/2249aa73-cbf1-4a6b-a893-d2242c236c6d-etc-machine-id\") pod \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\" (UID: \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\") " Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.425399 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fs2c8\" (UniqueName: \"kubernetes.io/projected/2249aa73-cbf1-4a6b-a893-d2242c236c6d-kube-api-access-fs2c8\") pod \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\" (UID: \"2249aa73-cbf1-4a6b-a893-d2242c236c6d\") " Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.427699 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2249aa73-cbf1-4a6b-a893-d2242c236c6d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2249aa73-cbf1-4a6b-a893-d2242c236c6d" (UID: "2249aa73-cbf1-4a6b-a893-d2242c236c6d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.428014 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2249aa73-cbf1-4a6b-a893-d2242c236c6d-logs" (OuterVolumeSpecName: "logs") pod "2249aa73-cbf1-4a6b-a893-d2242c236c6d" (UID: "2249aa73-cbf1-4a6b-a893-d2242c236c6d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.439781 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2249aa73-cbf1-4a6b-a893-d2242c236c6d-scripts" (OuterVolumeSpecName: "scripts") pod "2249aa73-cbf1-4a6b-a893-d2242c236c6d" (UID: "2249aa73-cbf1-4a6b-a893-d2242c236c6d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.440663 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2249aa73-cbf1-4a6b-a893-d2242c236c6d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2249aa73-cbf1-4a6b-a893-d2242c236c6d" (UID: "2249aa73-cbf1-4a6b-a893-d2242c236c6d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.443598 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2249aa73-cbf1-4a6b-a893-d2242c236c6d-kube-api-access-fs2c8" (OuterVolumeSpecName: "kube-api-access-fs2c8") pod "2249aa73-cbf1-4a6b-a893-d2242c236c6d" (UID: "2249aa73-cbf1-4a6b-a893-d2242c236c6d"). InnerVolumeSpecName "kube-api-access-fs2c8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.508108 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2249aa73-cbf1-4a6b-a893-d2242c236c6d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2249aa73-cbf1-4a6b-a893-d2242c236c6d" (UID: "2249aa73-cbf1-4a6b-a893-d2242c236c6d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.529217 4751 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2249aa73-cbf1-4a6b-a893-d2242c236c6d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.529251 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fs2c8\" (UniqueName: \"kubernetes.io/projected/2249aa73-cbf1-4a6b-a893-d2242c236c6d-kube-api-access-fs2c8\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.529263 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2249aa73-cbf1-4a6b-a893-d2242c236c6d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.529271 4751 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2249aa73-cbf1-4a6b-a893-d2242c236c6d-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.529283 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2249aa73-cbf1-4a6b-a893-d2242c236c6d-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.529293 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2249aa73-cbf1-4a6b-a893-d2242c236c6d-logs\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.542634 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2249aa73-cbf1-4a6b-a893-d2242c236c6d-config-data" (OuterVolumeSpecName: "config-data") pod "2249aa73-cbf1-4a6b-a893-d2242c236c6d" (UID: "2249aa73-cbf1-4a6b-a893-d2242c236c6d"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.579703 4751 generic.go:334] "Generic (PLEG): container finished" podID="b0cff45f-8db2-44d7-9723-677508c4c442" containerID="24003421e57923c6199a8e1dfb412c6d702c4649da37c29574a520e9fa6f3964" exitCode=0 Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.579750 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0cff45f-8db2-44d7-9723-677508c4c442","Type":"ContainerDied","Data":"24003421e57923c6199a8e1dfb412c6d702c4649da37c29574a520e9fa6f3964"} Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.579805 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0cff45f-8db2-44d7-9723-677508c4c442","Type":"ContainerDied","Data":"e45df8f81575c2844bd3abf9a0808a8b86a85ed91293d87fed92f5e964e091ee"} Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.579818 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e45df8f81575c2844bd3abf9a0808a8b86a85ed91293d87fed92f5e964e091ee" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.582693 4751 generic.go:334] "Generic (PLEG): container finished" podID="2249aa73-cbf1-4a6b-a893-d2242c236c6d" containerID="d36384954ae7427b0fcbac8f30e7472232a5581f03dfc3949434110d85dc24df" exitCode=137 Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.583837 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.595473 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2249aa73-cbf1-4a6b-a893-d2242c236c6d","Type":"ContainerDied","Data":"d36384954ae7427b0fcbac8f30e7472232a5581f03dfc3949434110d85dc24df"} Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.595539 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2249aa73-cbf1-4a6b-a893-d2242c236c6d","Type":"ContainerDied","Data":"bc9a85e2ca81acd206b0152f0bfd261d4e136529a541e8f35475d63be0a0d8de"} Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.595567 4751 scope.go:117] "RemoveContainer" containerID="d36384954ae7427b0fcbac8f30e7472232a5581f03dfc3949434110d85dc24df" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.628603 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.631039 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0cff45f-8db2-44d7-9723-677508c4c442-run-httpd\") pod \"b0cff45f-8db2-44d7-9723-677508c4c442\" (UID: \"b0cff45f-8db2-44d7-9723-677508c4c442\") " Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.631113 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96vcf\" (UniqueName: \"kubernetes.io/projected/b0cff45f-8db2-44d7-9723-677508c4c442-kube-api-access-96vcf\") pod \"b0cff45f-8db2-44d7-9723-677508c4c442\" (UID: \"b0cff45f-8db2-44d7-9723-677508c4c442\") " Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.631137 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0cff45f-8db2-44d7-9723-677508c4c442-scripts\") pod \"b0cff45f-8db2-44d7-9723-677508c4c442\" 
(UID: \"b0cff45f-8db2-44d7-9723-677508c4c442\") " Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.631165 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0cff45f-8db2-44d7-9723-677508c4c442-config-data\") pod \"b0cff45f-8db2-44d7-9723-677508c4c442\" (UID: \"b0cff45f-8db2-44d7-9723-677508c4c442\") " Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.631738 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2249aa73-cbf1-4a6b-a893-d2242c236c6d-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.635903 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0cff45f-8db2-44d7-9723-677508c4c442-scripts" (OuterVolumeSpecName: "scripts") pod "b0cff45f-8db2-44d7-9723-677508c4c442" (UID: "b0cff45f-8db2-44d7-9723-677508c4c442"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.636538 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0cff45f-8db2-44d7-9723-677508c4c442-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b0cff45f-8db2-44d7-9723-677508c4c442" (UID: "b0cff45f-8db2-44d7-9723-677508c4c442"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.641129 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.650999 4751 scope.go:117] "RemoveContainer" containerID="795b78469f6354cd4a26cbbe12c95fdda79be13ad1553f124e3eb1441894a03b" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.654159 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.692905 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 03 14:37:03 crc kubenswrapper[4751]: E1203 14:37:03.693650 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cff45f-8db2-44d7-9723-677508c4c442" containerName="proxy-httpd" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.693676 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cff45f-8db2-44d7-9723-677508c4c442" containerName="proxy-httpd" Dec 03 14:37:03 crc kubenswrapper[4751]: E1203 14:37:03.693704 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2249aa73-cbf1-4a6b-a893-d2242c236c6d" containerName="cinder-api-log" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.693713 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="2249aa73-cbf1-4a6b-a893-d2242c236c6d" containerName="cinder-api-log" Dec 03 14:37:03 crc kubenswrapper[4751]: E1203 14:37:03.693728 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2249aa73-cbf1-4a6b-a893-d2242c236c6d" containerName="cinder-api" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.693738 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="2249aa73-cbf1-4a6b-a893-d2242c236c6d" containerName="cinder-api" Dec 03 14:37:03 crc kubenswrapper[4751]: E1203 14:37:03.693763 4751 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b0cff45f-8db2-44d7-9723-677508c4c442" containerName="ceilometer-central-agent" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.693772 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cff45f-8db2-44d7-9723-677508c4c442" containerName="ceilometer-central-agent" Dec 03 14:37:03 crc kubenswrapper[4751]: E1203 14:37:03.693792 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cff45f-8db2-44d7-9723-677508c4c442" containerName="ceilometer-notification-agent" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.693800 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cff45f-8db2-44d7-9723-677508c4c442" containerName="ceilometer-notification-agent" Dec 03 14:37:03 crc kubenswrapper[4751]: E1203 14:37:03.693814 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cff45f-8db2-44d7-9723-677508c4c442" containerName="sg-core" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.693820 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cff45f-8db2-44d7-9723-677508c4c442" containerName="sg-core" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.694143 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cff45f-8db2-44d7-9723-677508c4c442" containerName="sg-core" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.694171 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="2249aa73-cbf1-4a6b-a893-d2242c236c6d" containerName="cinder-api" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.694192 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cff45f-8db2-44d7-9723-677508c4c442" containerName="ceilometer-central-agent" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.694205 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="2249aa73-cbf1-4a6b-a893-d2242c236c6d" containerName="cinder-api-log" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.694221 4751 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="b0cff45f-8db2-44d7-9723-677508c4c442" containerName="proxy-httpd" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.694234 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cff45f-8db2-44d7-9723-677508c4c442" containerName="ceilometer-notification-agent" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.694303 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0cff45f-8db2-44d7-9723-677508c4c442-kube-api-access-96vcf" (OuterVolumeSpecName: "kube-api-access-96vcf") pod "b0cff45f-8db2-44d7-9723-677508c4c442" (UID: "b0cff45f-8db2-44d7-9723-677508c4c442"). InnerVolumeSpecName "kube-api-access-96vcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.695659 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.700046 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.700374 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.701352 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.740492 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0cff45f-8db2-44d7-9723-677508c4c442-combined-ca-bundle\") pod \"b0cff45f-8db2-44d7-9723-677508c4c442\" (UID: \"b0cff45f-8db2-44d7-9723-677508c4c442\") " Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.740544 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/b0cff45f-8db2-44d7-9723-677508c4c442-sg-core-conf-yaml\") pod \"b0cff45f-8db2-44d7-9723-677508c4c442\" (UID: \"b0cff45f-8db2-44d7-9723-677508c4c442\") " Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.740742 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0cff45f-8db2-44d7-9723-677508c4c442-log-httpd\") pod \"b0cff45f-8db2-44d7-9723-677508c4c442\" (UID: \"b0cff45f-8db2-44d7-9723-677508c4c442\") " Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.741424 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/afa6e4a5-811b-43db-868b-66a71bff4830-etc-machine-id\") pod \"cinder-api-0\" (UID: \"afa6e4a5-811b-43db-868b-66a71bff4830\") " pod="openstack/cinder-api-0" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.741534 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afa6e4a5-811b-43db-868b-66a71bff4830-config-data\") pod \"cinder-api-0\" (UID: \"afa6e4a5-811b-43db-868b-66a71bff4830\") " pod="openstack/cinder-api-0" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.741575 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afa6e4a5-811b-43db-868b-66a71bff4830-scripts\") pod \"cinder-api-0\" (UID: \"afa6e4a5-811b-43db-868b-66a71bff4830\") " pod="openstack/cinder-api-0" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.741612 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afa6e4a5-811b-43db-868b-66a71bff4830-logs\") pod \"cinder-api-0\" (UID: \"afa6e4a5-811b-43db-868b-66a71bff4830\") " pod="openstack/cinder-api-0" Dec 03 14:37:03 crc 
kubenswrapper[4751]: I1203 14:37:03.741662 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/afa6e4a5-811b-43db-868b-66a71bff4830-config-data-custom\") pod \"cinder-api-0\" (UID: \"afa6e4a5-811b-43db-868b-66a71bff4830\") " pod="openstack/cinder-api-0" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.741727 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/afa6e4a5-811b-43db-868b-66a71bff4830-public-tls-certs\") pod \"cinder-api-0\" (UID: \"afa6e4a5-811b-43db-868b-66a71bff4830\") " pod="openstack/cinder-api-0" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.741816 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afa6e4a5-811b-43db-868b-66a71bff4830-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"afa6e4a5-811b-43db-868b-66a71bff4830\") " pod="openstack/cinder-api-0" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.741907 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/afa6e4a5-811b-43db-868b-66a71bff4830-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"afa6e4a5-811b-43db-868b-66a71bff4830\") " pod="openstack/cinder-api-0" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.741936 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnb9f\" (UniqueName: \"kubernetes.io/projected/afa6e4a5-811b-43db-868b-66a71bff4830-kube-api-access-tnb9f\") pod \"cinder-api-0\" (UID: \"afa6e4a5-811b-43db-868b-66a71bff4830\") " pod="openstack/cinder-api-0" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.742104 4751 reconciler_common.go:293] "Volume 
detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0cff45f-8db2-44d7-9723-677508c4c442-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.742125 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96vcf\" (UniqueName: \"kubernetes.io/projected/b0cff45f-8db2-44d7-9723-677508c4c442-kube-api-access-96vcf\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.742137 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0cff45f-8db2-44d7-9723-677508c4c442-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.752940 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0cff45f-8db2-44d7-9723-677508c4c442-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b0cff45f-8db2-44d7-9723-677508c4c442" (UID: "b0cff45f-8db2-44d7-9723-677508c4c442"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.754125 4751 scope.go:117] "RemoveContainer" containerID="d36384954ae7427b0fcbac8f30e7472232a5581f03dfc3949434110d85dc24df" Dec 03 14:37:03 crc kubenswrapper[4751]: E1203 14:37:03.754544 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d36384954ae7427b0fcbac8f30e7472232a5581f03dfc3949434110d85dc24df\": container with ID starting with d36384954ae7427b0fcbac8f30e7472232a5581f03dfc3949434110d85dc24df not found: ID does not exist" containerID="d36384954ae7427b0fcbac8f30e7472232a5581f03dfc3949434110d85dc24df" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.754591 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d36384954ae7427b0fcbac8f30e7472232a5581f03dfc3949434110d85dc24df"} err="failed to get container status \"d36384954ae7427b0fcbac8f30e7472232a5581f03dfc3949434110d85dc24df\": rpc error: code = NotFound desc = could not find container \"d36384954ae7427b0fcbac8f30e7472232a5581f03dfc3949434110d85dc24df\": container with ID starting with d36384954ae7427b0fcbac8f30e7472232a5581f03dfc3949434110d85dc24df not found: ID does not exist" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.754615 4751 scope.go:117] "RemoveContainer" containerID="795b78469f6354cd4a26cbbe12c95fdda79be13ad1553f124e3eb1441894a03b" Dec 03 14:37:03 crc kubenswrapper[4751]: E1203 14:37:03.755711 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"795b78469f6354cd4a26cbbe12c95fdda79be13ad1553f124e3eb1441894a03b\": container with ID starting with 795b78469f6354cd4a26cbbe12c95fdda79be13ad1553f124e3eb1441894a03b not found: ID does not exist" containerID="795b78469f6354cd4a26cbbe12c95fdda79be13ad1553f124e3eb1441894a03b" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.755763 
4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"795b78469f6354cd4a26cbbe12c95fdda79be13ad1553f124e3eb1441894a03b"} err="failed to get container status \"795b78469f6354cd4a26cbbe12c95fdda79be13ad1553f124e3eb1441894a03b\": rpc error: code = NotFound desc = could not find container \"795b78469f6354cd4a26cbbe12c95fdda79be13ad1553f124e3eb1441894a03b\": container with ID starting with 795b78469f6354cd4a26cbbe12c95fdda79be13ad1553f124e3eb1441894a03b not found: ID does not exist" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.767498 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.777838 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0cff45f-8db2-44d7-9723-677508c4c442-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b0cff45f-8db2-44d7-9723-677508c4c442" (UID: "b0cff45f-8db2-44d7-9723-677508c4c442"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.820318 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0cff45f-8db2-44d7-9723-677508c4c442-config-data" (OuterVolumeSpecName: "config-data") pod "b0cff45f-8db2-44d7-9723-677508c4c442" (UID: "b0cff45f-8db2-44d7-9723-677508c4c442"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.845367 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afa6e4a5-811b-43db-868b-66a71bff4830-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"afa6e4a5-811b-43db-868b-66a71bff4830\") " pod="openstack/cinder-api-0" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.845465 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/afa6e4a5-811b-43db-868b-66a71bff4830-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"afa6e4a5-811b-43db-868b-66a71bff4830\") " pod="openstack/cinder-api-0" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.845489 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnb9f\" (UniqueName: \"kubernetes.io/projected/afa6e4a5-811b-43db-868b-66a71bff4830-kube-api-access-tnb9f\") pod \"cinder-api-0\" (UID: \"afa6e4a5-811b-43db-868b-66a71bff4830\") " pod="openstack/cinder-api-0" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.845558 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/afa6e4a5-811b-43db-868b-66a71bff4830-etc-machine-id\") pod \"cinder-api-0\" (UID: \"afa6e4a5-811b-43db-868b-66a71bff4830\") " pod="openstack/cinder-api-0" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.845586 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afa6e4a5-811b-43db-868b-66a71bff4830-config-data\") pod \"cinder-api-0\" (UID: \"afa6e4a5-811b-43db-868b-66a71bff4830\") " pod="openstack/cinder-api-0" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.845617 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/afa6e4a5-811b-43db-868b-66a71bff4830-scripts\") pod \"cinder-api-0\" (UID: \"afa6e4a5-811b-43db-868b-66a71bff4830\") " pod="openstack/cinder-api-0" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.845639 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afa6e4a5-811b-43db-868b-66a71bff4830-logs\") pod \"cinder-api-0\" (UID: \"afa6e4a5-811b-43db-868b-66a71bff4830\") " pod="openstack/cinder-api-0" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.845667 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/afa6e4a5-811b-43db-868b-66a71bff4830-config-data-custom\") pod \"cinder-api-0\" (UID: \"afa6e4a5-811b-43db-868b-66a71bff4830\") " pod="openstack/cinder-api-0" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.845699 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/afa6e4a5-811b-43db-868b-66a71bff4830-public-tls-certs\") pod \"cinder-api-0\" (UID: \"afa6e4a5-811b-43db-868b-66a71bff4830\") " pod="openstack/cinder-api-0" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.845771 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0cff45f-8db2-44d7-9723-677508c4c442-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.845781 4751 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0cff45f-8db2-44d7-9723-677508c4c442-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.845790 4751 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/b0cff45f-8db2-44d7-9723-677508c4c442-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.847611 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/afa6e4a5-811b-43db-868b-66a71bff4830-etc-machine-id\") pod \"cinder-api-0\" (UID: \"afa6e4a5-811b-43db-868b-66a71bff4830\") " pod="openstack/cinder-api-0" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.848226 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afa6e4a5-811b-43db-868b-66a71bff4830-logs\") pod \"cinder-api-0\" (UID: \"afa6e4a5-811b-43db-868b-66a71bff4830\") " pod="openstack/cinder-api-0" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.850657 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afa6e4a5-811b-43db-868b-66a71bff4830-scripts\") pod \"cinder-api-0\" (UID: \"afa6e4a5-811b-43db-868b-66a71bff4830\") " pod="openstack/cinder-api-0" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.851355 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/afa6e4a5-811b-43db-868b-66a71bff4830-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"afa6e4a5-811b-43db-868b-66a71bff4830\") " pod="openstack/cinder-api-0" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.851379 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/afa6e4a5-811b-43db-868b-66a71bff4830-public-tls-certs\") pod \"cinder-api-0\" (UID: \"afa6e4a5-811b-43db-868b-66a71bff4830\") " pod="openstack/cinder-api-0" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.851593 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/afa6e4a5-811b-43db-868b-66a71bff4830-config-data-custom\") pod \"cinder-api-0\" (UID: \"afa6e4a5-811b-43db-868b-66a71bff4830\") " pod="openstack/cinder-api-0" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.853727 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afa6e4a5-811b-43db-868b-66a71bff4830-config-data\") pod \"cinder-api-0\" (UID: \"afa6e4a5-811b-43db-868b-66a71bff4830\") " pod="openstack/cinder-api-0" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.855529 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0cff45f-8db2-44d7-9723-677508c4c442-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0cff45f-8db2-44d7-9723-677508c4c442" (UID: "b0cff45f-8db2-44d7-9723-677508c4c442"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.859625 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afa6e4a5-811b-43db-868b-66a71bff4830-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"afa6e4a5-811b-43db-868b-66a71bff4830\") " pod="openstack/cinder-api-0" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.869712 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnb9f\" (UniqueName: \"kubernetes.io/projected/afa6e4a5-811b-43db-868b-66a71bff4830-kube-api-access-tnb9f\") pod \"cinder-api-0\" (UID: \"afa6e4a5-811b-43db-868b-66a71bff4830\") " pod="openstack/cinder-api-0" Dec 03 14:37:03 crc kubenswrapper[4751]: I1203 14:37:03.947603 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0cff45f-8db2-44d7-9723-677508c4c442-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:04 crc 
kubenswrapper[4751]: I1203 14:37:04.034724 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.134093 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-kg74g"] Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.135927 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-kg74g" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.149142 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-kg74g"] Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.149769 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14804484-7da3-4876-b9f5-a3f996fdca5c-operator-scripts\") pod \"nova-api-db-create-kg74g\" (UID: \"14804484-7da3-4876-b9f5-a3f996fdca5c\") " pod="openstack/nova-api-db-create-kg74g" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.149823 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhwb6\" (UniqueName: \"kubernetes.io/projected/14804484-7da3-4876-b9f5-a3f996fdca5c-kube-api-access-nhwb6\") pod \"nova-api-db-create-kg74g\" (UID: \"14804484-7da3-4876-b9f5-a3f996fdca5c\") " pod="openstack/nova-api-db-create-kg74g" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.251279 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14804484-7da3-4876-b9f5-a3f996fdca5c-operator-scripts\") pod \"nova-api-db-create-kg74g\" (UID: \"14804484-7da3-4876-b9f5-a3f996fdca5c\") " pod="openstack/nova-api-db-create-kg74g" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.251963 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-nhwb6\" (UniqueName: \"kubernetes.io/projected/14804484-7da3-4876-b9f5-a3f996fdca5c-kube-api-access-nhwb6\") pod \"nova-api-db-create-kg74g\" (UID: \"14804484-7da3-4876-b9f5-a3f996fdca5c\") " pod="openstack/nova-api-db-create-kg74g" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.252554 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14804484-7da3-4876-b9f5-a3f996fdca5c-operator-scripts\") pod \"nova-api-db-create-kg74g\" (UID: \"14804484-7da3-4876-b9f5-a3f996fdca5c\") " pod="openstack/nova-api-db-create-kg74g" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.254736 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-rjchq"] Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.256233 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-rjchq" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.279228 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhwb6\" (UniqueName: \"kubernetes.io/projected/14804484-7da3-4876-b9f5-a3f996fdca5c-kube-api-access-nhwb6\") pod \"nova-api-db-create-kg74g\" (UID: \"14804484-7da3-4876-b9f5-a3f996fdca5c\") " pod="openstack/nova-api-db-create-kg74g" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.283045 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-d629-account-create-update-jjx45"] Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.285683 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-d629-account-create-update-jjx45" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.291544 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.302376 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-rjchq"] Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.345555 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-d629-account-create-update-jjx45"] Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.353600 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca-operator-scripts\") pod \"nova-cell0-db-create-rjchq\" (UID: \"8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca\") " pod="openstack/nova-cell0-db-create-rjchq" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.353732 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz6sm\" (UniqueName: \"kubernetes.io/projected/8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca-kube-api-access-mz6sm\") pod \"nova-cell0-db-create-rjchq\" (UID: \"8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca\") " pod="openstack/nova-cell0-db-create-rjchq" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.353755 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smc9q\" (UniqueName: \"kubernetes.io/projected/5c1be9b7-88c6-4c30-8b86-6b3194a6a225-kube-api-access-smc9q\") pod \"nova-api-d629-account-create-update-jjx45\" (UID: \"5c1be9b7-88c6-4c30-8b86-6b3194a6a225\") " pod="openstack/nova-api-d629-account-create-update-jjx45" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.353813 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c1be9b7-88c6-4c30-8b86-6b3194a6a225-operator-scripts\") pod \"nova-api-d629-account-create-update-jjx45\" (UID: \"5c1be9b7-88c6-4c30-8b86-6b3194a6a225\") " pod="openstack/nova-api-d629-account-create-update-jjx45" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.386842 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-wb89v"] Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.388670 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wb89v" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.413778 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wb89v"] Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.455177 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz6sm\" (UniqueName: \"kubernetes.io/projected/8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca-kube-api-access-mz6sm\") pod \"nova-cell0-db-create-rjchq\" (UID: \"8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca\") " pod="openstack/nova-cell0-db-create-rjchq" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.455223 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smc9q\" (UniqueName: \"kubernetes.io/projected/5c1be9b7-88c6-4c30-8b86-6b3194a6a225-kube-api-access-smc9q\") pod \"nova-api-d629-account-create-update-jjx45\" (UID: \"5c1be9b7-88c6-4c30-8b86-6b3194a6a225\") " pod="openstack/nova-api-d629-account-create-update-jjx45" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.455282 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c1be9b7-88c6-4c30-8b86-6b3194a6a225-operator-scripts\") pod \"nova-api-d629-account-create-update-jjx45\" (UID: 
\"5c1be9b7-88c6-4c30-8b86-6b3194a6a225\") " pod="openstack/nova-api-d629-account-create-update-jjx45" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.455357 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca-operator-scripts\") pod \"nova-cell0-db-create-rjchq\" (UID: \"8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca\") " pod="openstack/nova-cell0-db-create-rjchq" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.456153 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca-operator-scripts\") pod \"nova-cell0-db-create-rjchq\" (UID: \"8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca\") " pod="openstack/nova-cell0-db-create-rjchq" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.458305 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c1be9b7-88c6-4c30-8b86-6b3194a6a225-operator-scripts\") pod \"nova-api-d629-account-create-update-jjx45\" (UID: \"5c1be9b7-88c6-4c30-8b86-6b3194a6a225\") " pod="openstack/nova-api-d629-account-create-update-jjx45" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.459523 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-b6f3-account-create-update-k2rkr"] Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.459719 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-kg74g" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.461276 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-b6f3-account-create-update-k2rkr" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.464363 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.477749 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz6sm\" (UniqueName: \"kubernetes.io/projected/8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca-kube-api-access-mz6sm\") pod \"nova-cell0-db-create-rjchq\" (UID: \"8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca\") " pod="openstack/nova-cell0-db-create-rjchq" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.478819 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-b6f3-account-create-update-k2rkr"] Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.482134 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smc9q\" (UniqueName: \"kubernetes.io/projected/5c1be9b7-88c6-4c30-8b86-6b3194a6a225-kube-api-access-smc9q\") pod \"nova-api-d629-account-create-update-jjx45\" (UID: \"5c1be9b7-88c6-4c30-8b86-6b3194a6a225\") " pod="openstack/nova-api-d629-account-create-update-jjx45" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.556961 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d206c76-a64f-42dc-8bc0-554d21e5ebd0-operator-scripts\") pod \"nova-cell1-db-create-wb89v\" (UID: \"0d206c76-a64f-42dc-8bc0-554d21e5ebd0\") " pod="openstack/nova-cell1-db-create-wb89v" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.557116 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4jvs\" (UniqueName: \"kubernetes.io/projected/0d206c76-a64f-42dc-8bc0-554d21e5ebd0-kube-api-access-d4jvs\") pod \"nova-cell1-db-create-wb89v\" (UID: 
\"0d206c76-a64f-42dc-8bc0-554d21e5ebd0\") " pod="openstack/nova-cell1-db-create-wb89v" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.612740 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.648372 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-1fbd-account-create-update-wgwqm"] Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.658633 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b90d17b-c6be-47a0-a72f-669ad0c75e18-operator-scripts\") pod \"nova-cell0-b6f3-account-create-update-k2rkr\" (UID: \"8b90d17b-c6be-47a0-a72f-669ad0c75e18\") " pod="openstack/nova-cell0-b6f3-account-create-update-k2rkr" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.658728 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d206c76-a64f-42dc-8bc0-554d21e5ebd0-operator-scripts\") pod \"nova-cell1-db-create-wb89v\" (UID: \"0d206c76-a64f-42dc-8bc0-554d21e5ebd0\") " pod="openstack/nova-cell1-db-create-wb89v" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.658815 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2g7r\" (UniqueName: \"kubernetes.io/projected/8b90d17b-c6be-47a0-a72f-669ad0c75e18-kube-api-access-g2g7r\") pod \"nova-cell0-b6f3-account-create-update-k2rkr\" (UID: \"8b90d17b-c6be-47a0-a72f-669ad0c75e18\") " pod="openstack/nova-cell0-b6f3-account-create-update-k2rkr" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.658842 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4jvs\" (UniqueName: \"kubernetes.io/projected/0d206c76-a64f-42dc-8bc0-554d21e5ebd0-kube-api-access-d4jvs\") pod 
\"nova-cell1-db-create-wb89v\" (UID: \"0d206c76-a64f-42dc-8bc0-554d21e5ebd0\") " pod="openstack/nova-cell1-db-create-wb89v" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.659240 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1fbd-account-create-update-wgwqm" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.659865 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d206c76-a64f-42dc-8bc0-554d21e5ebd0-operator-scripts\") pod \"nova-cell1-db-create-wb89v\" (UID: \"0d206c76-a64f-42dc-8bc0-554d21e5ebd0\") " pod="openstack/nova-cell1-db-create-wb89v" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.662593 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.670726 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-1fbd-account-create-update-wgwqm"] Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.691675 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.704767 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4jvs\" (UniqueName: \"kubernetes.io/projected/0d206c76-a64f-42dc-8bc0-554d21e5ebd0-kube-api-access-d4jvs\") pod \"nova-cell1-db-create-wb89v\" (UID: \"0d206c76-a64f-42dc-8bc0-554d21e5ebd0\") " pod="openstack/nova-cell1-db-create-wb89v" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.725650 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.737129 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-rjchq" Dec 03 14:37:04 crc kubenswrapper[4751]: W1203 14:37:04.753714 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafa6e4a5_811b_43db_868b_66a71bff4830.slice/crio-d4b3b2a72618554848d223ad7a28562f48cc21a4c0e34f75bb87ef3ae0737fa7 WatchSource:0}: Error finding container d4b3b2a72618554848d223ad7a28562f48cc21a4c0e34f75bb87ef3ae0737fa7: Status 404 returned error can't find the container with id d4b3b2a72618554848d223ad7a28562f48cc21a4c0e34f75bb87ef3ae0737fa7 Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.762222 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b90d17b-c6be-47a0-a72f-669ad0c75e18-operator-scripts\") pod \"nova-cell0-b6f3-account-create-update-k2rkr\" (UID: \"8b90d17b-c6be-47a0-a72f-669ad0c75e18\") " pod="openstack/nova-cell0-b6f3-account-create-update-k2rkr" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.762355 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ac653c7-0be2-4943-ba54-d1e1fed5bfdc-operator-scripts\") pod \"nova-cell1-1fbd-account-create-update-wgwqm\" (UID: \"8ac653c7-0be2-4943-ba54-d1e1fed5bfdc\") " pod="openstack/nova-cell1-1fbd-account-create-update-wgwqm" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.762404 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2g7r\" (UniqueName: \"kubernetes.io/projected/8b90d17b-c6be-47a0-a72f-669ad0c75e18-kube-api-access-g2g7r\") pod \"nova-cell0-b6f3-account-create-update-k2rkr\" (UID: \"8b90d17b-c6be-47a0-a72f-669ad0c75e18\") " pod="openstack/nova-cell0-b6f3-account-create-update-k2rkr" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.762467 4751 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlblf\" (UniqueName: \"kubernetes.io/projected/8ac653c7-0be2-4943-ba54-d1e1fed5bfdc-kube-api-access-qlblf\") pod \"nova-cell1-1fbd-account-create-update-wgwqm\" (UID: \"8ac653c7-0be2-4943-ba54-d1e1fed5bfdc\") " pod="openstack/nova-cell1-1fbd-account-create-update-wgwqm" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.763479 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b90d17b-c6be-47a0-a72f-669ad0c75e18-operator-scripts\") pod \"nova-cell0-b6f3-account-create-update-k2rkr\" (UID: \"8b90d17b-c6be-47a0-a72f-669ad0c75e18\") " pod="openstack/nova-cell0-b6f3-account-create-update-k2rkr" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.766896 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.774268 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d629-account-create-update-jjx45" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.777632 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.780614 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.781437 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.781715 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.787123 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2g7r\" (UniqueName: \"kubernetes.io/projected/8b90d17b-c6be-47a0-a72f-669ad0c75e18-kube-api-access-g2g7r\") pod \"nova-cell0-b6f3-account-create-update-k2rkr\" (UID: \"8b90d17b-c6be-47a0-a72f-669ad0c75e18\") " pod="openstack/nova-cell0-b6f3-account-create-update-k2rkr" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.800684 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.864812 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ac653c7-0be2-4943-ba54-d1e1fed5bfdc-operator-scripts\") pod \"nova-cell1-1fbd-account-create-update-wgwqm\" (UID: \"8ac653c7-0be2-4943-ba54-d1e1fed5bfdc\") " pod="openstack/nova-cell1-1fbd-account-create-update-wgwqm" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.864975 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlblf\" (UniqueName: \"kubernetes.io/projected/8ac653c7-0be2-4943-ba54-d1e1fed5bfdc-kube-api-access-qlblf\") pod \"nova-cell1-1fbd-account-create-update-wgwqm\" (UID: \"8ac653c7-0be2-4943-ba54-d1e1fed5bfdc\") " pod="openstack/nova-cell1-1fbd-account-create-update-wgwqm" Dec 03 14:37:04 crc 
kubenswrapper[4751]: I1203 14:37:04.866079 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ac653c7-0be2-4943-ba54-d1e1fed5bfdc-operator-scripts\") pod \"nova-cell1-1fbd-account-create-update-wgwqm\" (UID: \"8ac653c7-0be2-4943-ba54-d1e1fed5bfdc\") " pod="openstack/nova-cell1-1fbd-account-create-update-wgwqm" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.893977 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlblf\" (UniqueName: \"kubernetes.io/projected/8ac653c7-0be2-4943-ba54-d1e1fed5bfdc-kube-api-access-qlblf\") pod \"nova-cell1-1fbd-account-create-update-wgwqm\" (UID: \"8ac653c7-0be2-4943-ba54-d1e1fed5bfdc\") " pod="openstack/nova-cell1-1fbd-account-create-update-wgwqm" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.913689 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wb89v" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.932958 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-b6f3-account-create-update-k2rkr" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.967022 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5677749-e344-42cf-9953-2f8455e5bacc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a5677749-e344-42cf-9953-2f8455e5bacc\") " pod="openstack/ceilometer-0" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.967262 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5677749-e344-42cf-9953-2f8455e5bacc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a5677749-e344-42cf-9953-2f8455e5bacc\") " pod="openstack/ceilometer-0" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.967385 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5677749-e344-42cf-9953-2f8455e5bacc-config-data\") pod \"ceilometer-0\" (UID: \"a5677749-e344-42cf-9953-2f8455e5bacc\") " pod="openstack/ceilometer-0" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.967453 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5677749-e344-42cf-9953-2f8455e5bacc-log-httpd\") pod \"ceilometer-0\" (UID: \"a5677749-e344-42cf-9953-2f8455e5bacc\") " pod="openstack/ceilometer-0" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.967612 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66n8k\" (UniqueName: \"kubernetes.io/projected/a5677749-e344-42cf-9953-2f8455e5bacc-kube-api-access-66n8k\") pod \"ceilometer-0\" (UID: \"a5677749-e344-42cf-9953-2f8455e5bacc\") " pod="openstack/ceilometer-0" Dec 03 14:37:04 crc 
kubenswrapper[4751]: I1203 14:37:04.967655 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5677749-e344-42cf-9953-2f8455e5bacc-scripts\") pod \"ceilometer-0\" (UID: \"a5677749-e344-42cf-9953-2f8455e5bacc\") " pod="openstack/ceilometer-0" Dec 03 14:37:04 crc kubenswrapper[4751]: I1203 14:37:04.967688 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5677749-e344-42cf-9953-2f8455e5bacc-run-httpd\") pod \"ceilometer-0\" (UID: \"a5677749-e344-42cf-9953-2f8455e5bacc\") " pod="openstack/ceilometer-0" Dec 03 14:37:05 crc kubenswrapper[4751]: I1203 14:37:05.005347 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1fbd-account-create-update-wgwqm" Dec 03 14:37:05 crc kubenswrapper[4751]: I1203 14:37:05.071385 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66n8k\" (UniqueName: \"kubernetes.io/projected/a5677749-e344-42cf-9953-2f8455e5bacc-kube-api-access-66n8k\") pod \"ceilometer-0\" (UID: \"a5677749-e344-42cf-9953-2f8455e5bacc\") " pod="openstack/ceilometer-0" Dec 03 14:37:05 crc kubenswrapper[4751]: I1203 14:37:05.071444 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5677749-e344-42cf-9953-2f8455e5bacc-scripts\") pod \"ceilometer-0\" (UID: \"a5677749-e344-42cf-9953-2f8455e5bacc\") " pod="openstack/ceilometer-0" Dec 03 14:37:05 crc kubenswrapper[4751]: I1203 14:37:05.071483 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5677749-e344-42cf-9953-2f8455e5bacc-run-httpd\") pod \"ceilometer-0\" (UID: \"a5677749-e344-42cf-9953-2f8455e5bacc\") " pod="openstack/ceilometer-0" Dec 03 14:37:05 crc 
kubenswrapper[4751]: I1203 14:37:05.072512 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5677749-e344-42cf-9953-2f8455e5bacc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a5677749-e344-42cf-9953-2f8455e5bacc\") " pod="openstack/ceilometer-0" Dec 03 14:37:05 crc kubenswrapper[4751]: I1203 14:37:05.072679 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5677749-e344-42cf-9953-2f8455e5bacc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a5677749-e344-42cf-9953-2f8455e5bacc\") " pod="openstack/ceilometer-0" Dec 03 14:37:05 crc kubenswrapper[4751]: I1203 14:37:05.072759 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5677749-e344-42cf-9953-2f8455e5bacc-config-data\") pod \"ceilometer-0\" (UID: \"a5677749-e344-42cf-9953-2f8455e5bacc\") " pod="openstack/ceilometer-0" Dec 03 14:37:05 crc kubenswrapper[4751]: I1203 14:37:05.072810 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5677749-e344-42cf-9953-2f8455e5bacc-log-httpd\") pod \"ceilometer-0\" (UID: \"a5677749-e344-42cf-9953-2f8455e5bacc\") " pod="openstack/ceilometer-0" Dec 03 14:37:05 crc kubenswrapper[4751]: I1203 14:37:05.073275 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5677749-e344-42cf-9953-2f8455e5bacc-log-httpd\") pod \"ceilometer-0\" (UID: \"a5677749-e344-42cf-9953-2f8455e5bacc\") " pod="openstack/ceilometer-0" Dec 03 14:37:05 crc kubenswrapper[4751]: I1203 14:37:05.078243 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5677749-e344-42cf-9953-2f8455e5bacc-run-httpd\") pod \"ceilometer-0\" 
(UID: \"a5677749-e344-42cf-9953-2f8455e5bacc\") " pod="openstack/ceilometer-0" Dec 03 14:37:05 crc kubenswrapper[4751]: I1203 14:37:05.081370 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5677749-e344-42cf-9953-2f8455e5bacc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a5677749-e344-42cf-9953-2f8455e5bacc\") " pod="openstack/ceilometer-0" Dec 03 14:37:05 crc kubenswrapper[4751]: I1203 14:37:05.083224 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5677749-e344-42cf-9953-2f8455e5bacc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a5677749-e344-42cf-9953-2f8455e5bacc\") " pod="openstack/ceilometer-0" Dec 03 14:37:05 crc kubenswrapper[4751]: I1203 14:37:05.089233 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5677749-e344-42cf-9953-2f8455e5bacc-scripts\") pod \"ceilometer-0\" (UID: \"a5677749-e344-42cf-9953-2f8455e5bacc\") " pod="openstack/ceilometer-0" Dec 03 14:37:05 crc kubenswrapper[4751]: I1203 14:37:05.090001 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5677749-e344-42cf-9953-2f8455e5bacc-config-data\") pod \"ceilometer-0\" (UID: \"a5677749-e344-42cf-9953-2f8455e5bacc\") " pod="openstack/ceilometer-0" Dec 03 14:37:05 crc kubenswrapper[4751]: I1203 14:37:05.115507 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66n8k\" (UniqueName: \"kubernetes.io/projected/a5677749-e344-42cf-9953-2f8455e5bacc-kube-api-access-66n8k\") pod \"ceilometer-0\" (UID: \"a5677749-e344-42cf-9953-2f8455e5bacc\") " pod="openstack/ceilometer-0" Dec 03 14:37:05 crc kubenswrapper[4751]: I1203 14:37:05.123042 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-kg74g"] Dec 03 14:37:05 crc 
kubenswrapper[4751]: I1203 14:37:05.340108 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2249aa73-cbf1-4a6b-a893-d2242c236c6d" path="/var/lib/kubelet/pods/2249aa73-cbf1-4a6b-a893-d2242c236c6d/volumes" Dec 03 14:37:05 crc kubenswrapper[4751]: I1203 14:37:05.345392 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0cff45f-8db2-44d7-9723-677508c4c442" path="/var/lib/kubelet/pods/b0cff45f-8db2-44d7-9723-677508c4c442/volumes" Dec 03 14:37:05 crc kubenswrapper[4751]: I1203 14:37:05.364879 4751 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod4b931e90-9037-4b52-92c1-9c1d4d3fbba4"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod4b931e90-9037-4b52-92c1-9c1d4d3fbba4] : Timed out while waiting for systemd to remove kubepods-besteffort-pod4b931e90_9037_4b52_92c1_9c1d4d3fbba4.slice" Dec 03 14:37:05 crc kubenswrapper[4751]: E1203 14:37:05.364938 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod4b931e90-9037-4b52-92c1-9c1d4d3fbba4] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod4b931e90-9037-4b52-92c1-9c1d4d3fbba4] : Timed out while waiting for systemd to remove kubepods-besteffort-pod4b931e90_9037_4b52_92c1_9c1d4d3fbba4.slice" pod="openstack/barbican-api-55f7778fd-nbkr2" podUID="4b931e90-9037-4b52-92c1-9c1d4d3fbba4" Dec 03 14:37:05 crc kubenswrapper[4751]: I1203 14:37:05.406714 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:37:05 crc kubenswrapper[4751]: I1203 14:37:05.410524 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-rjchq"] Dec 03 14:37:05 crc kubenswrapper[4751]: I1203 14:37:05.638864 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"afa6e4a5-811b-43db-868b-66a71bff4830","Type":"ContainerStarted","Data":"d4b3b2a72618554848d223ad7a28562f48cc21a4c0e34f75bb87ef3ae0737fa7"} Dec 03 14:37:05 crc kubenswrapper[4751]: I1203 14:37:05.641597 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rjchq" event={"ID":"8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca","Type":"ContainerStarted","Data":"195a785443906f168aeeed72603e8914e037b02fa8bdd5993b05bb899b19f78a"} Dec 03 14:37:05 crc kubenswrapper[4751]: I1203 14:37:05.643500 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-55f7778fd-nbkr2" Dec 03 14:37:05 crc kubenswrapper[4751]: I1203 14:37:05.644364 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-kg74g" event={"ID":"14804484-7da3-4876-b9f5-a3f996fdca5c","Type":"ContainerStarted","Data":"269c03a6511c4c12c6fa069e8e8a09c8cbcd16954e5f6cadbe48b62f9e75f528"} Dec 03 14:37:05 crc kubenswrapper[4751]: I1203 14:37:05.802755 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wb89v"] Dec 03 14:37:05 crc kubenswrapper[4751]: I1203 14:37:05.819878 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:37:05 crc kubenswrapper[4751]: I1203 14:37:05.819942 4751 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:37:05 crc kubenswrapper[4751]: I1203 14:37:05.872399 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-d629-account-create-update-jjx45"] Dec 03 14:37:05 crc kubenswrapper[4751]: I1203 14:37:05.921308 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-55f7778fd-nbkr2"] Dec 03 14:37:05 crc kubenswrapper[4751]: I1203 14:37:05.977300 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-55f7778fd-nbkr2"] Dec 03 14:37:06 crc kubenswrapper[4751]: I1203 14:37:06.014661 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-1fbd-account-create-update-wgwqm"] Dec 03 14:37:06 crc kubenswrapper[4751]: I1203 14:37:06.078939 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-b6f3-account-create-update-k2rkr"] Dec 03 14:37:06 crc kubenswrapper[4751]: I1203 14:37:06.441940 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:37:06 crc kubenswrapper[4751]: W1203 14:37:06.454637 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5677749_e344_42cf_9953_2f8455e5bacc.slice/crio-19cc6ede70d1d05785f33e4e336e351e9c8b1ddfed5786b4d0afbd53b1d241a7 WatchSource:0}: Error finding container 19cc6ede70d1d05785f33e4e336e351e9c8b1ddfed5786b4d0afbd53b1d241a7: Status 404 returned error can't find the container with id 19cc6ede70d1d05785f33e4e336e351e9c8b1ddfed5786b4d0afbd53b1d241a7 Dec 03 14:37:06 crc kubenswrapper[4751]: I1203 14:37:06.687803 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wb89v" 
event={"ID":"0d206c76-a64f-42dc-8bc0-554d21e5ebd0","Type":"ContainerStarted","Data":"5438a700d64967e58768961ff7f29dad0258aa33d36bfba8d852666d1d3b1078"} Dec 03 14:37:06 crc kubenswrapper[4751]: I1203 14:37:06.688388 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wb89v" event={"ID":"0d206c76-a64f-42dc-8bc0-554d21e5ebd0","Type":"ContainerStarted","Data":"62cebd9e9ccd6be118e0d89f3e59e25a68f10b4aa5e7fe6f20ed03be43fceee1"} Dec 03 14:37:06 crc kubenswrapper[4751]: I1203 14:37:06.711565 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d629-account-create-update-jjx45" event={"ID":"5c1be9b7-88c6-4c30-8b86-6b3194a6a225","Type":"ContainerStarted","Data":"8a59515ad8337978d1a0f9f51673dd0b11ee05971d42cc6e483edb9ffd7b6709"} Dec 03 14:37:06 crc kubenswrapper[4751]: I1203 14:37:06.711609 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d629-account-create-update-jjx45" event={"ID":"5c1be9b7-88c6-4c30-8b86-6b3194a6a225","Type":"ContainerStarted","Data":"de3ef40fc2645bc8ba10d1409e8e52c6f886e932de5d58cc6354f3d9fb28ff19"} Dec 03 14:37:06 crc kubenswrapper[4751]: I1203 14:37:06.726304 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b6f3-account-create-update-k2rkr" event={"ID":"8b90d17b-c6be-47a0-a72f-669ad0c75e18","Type":"ContainerStarted","Data":"3408bb39b5a7e06dceaad85508269f0f1dbdaf069edcdb1bc17d8d496372505a"} Dec 03 14:37:06 crc kubenswrapper[4751]: I1203 14:37:06.727891 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5677749-e344-42cf-9953-2f8455e5bacc","Type":"ContainerStarted","Data":"19cc6ede70d1d05785f33e4e336e351e9c8b1ddfed5786b4d0afbd53b1d241a7"} Dec 03 14:37:06 crc kubenswrapper[4751]: I1203 14:37:06.728820 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1fbd-account-create-update-wgwqm" 
event={"ID":"8ac653c7-0be2-4943-ba54-d1e1fed5bfdc","Type":"ContainerStarted","Data":"f62212c5a2b677160af35a122b1cfcd62d134cf16afbd4864727290fe1e85cbf"} Dec 03 14:37:06 crc kubenswrapper[4751]: I1203 14:37:06.730417 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"afa6e4a5-811b-43db-868b-66a71bff4830","Type":"ContainerStarted","Data":"593523cc6444b3412a689723aeedde9cff06b5377d71776b5dcaacb322c15bbe"} Dec 03 14:37:06 crc kubenswrapper[4751]: I1203 14:37:06.732905 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rjchq" event={"ID":"8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca","Type":"ContainerStarted","Data":"d51d21a867de6344470db63eb2c384e4e73323da2b6cdf183a8d6f807808ea80"} Dec 03 14:37:06 crc kubenswrapper[4751]: I1203 14:37:06.737400 4751 generic.go:334] "Generic (PLEG): container finished" podID="14804484-7da3-4876-b9f5-a3f996fdca5c" containerID="dda574fe47d3ecdf3017ff88709fdadda5d6e3c421d008a8019d5ea8feac7bb9" exitCode=0 Dec 03 14:37:06 crc kubenswrapper[4751]: I1203 14:37:06.737445 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-kg74g" event={"ID":"14804484-7da3-4876-b9f5-a3f996fdca5c","Type":"ContainerDied","Data":"dda574fe47d3ecdf3017ff88709fdadda5d6e3c421d008a8019d5ea8feac7bb9"} Dec 03 14:37:06 crc kubenswrapper[4751]: I1203 14:37:06.740847 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-wb89v" podStartSLOduration=2.740830311 podStartE2EDuration="2.740830311s" podCreationTimestamp="2025-12-03 14:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:37:06.713038224 +0000 UTC m=+1433.701393441" watchObservedRunningTime="2025-12-03 14:37:06.740830311 +0000 UTC m=+1433.729185528" Dec 03 14:37:06 crc kubenswrapper[4751]: I1203 14:37:06.745545 4751 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-api-d629-account-create-update-jjx45" podStartSLOduration=2.745526905 podStartE2EDuration="2.745526905s" podCreationTimestamp="2025-12-03 14:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:37:06.736925127 +0000 UTC m=+1433.725280344" watchObservedRunningTime="2025-12-03 14:37:06.745526905 +0000 UTC m=+1433.733882122" Dec 03 14:37:07 crc kubenswrapper[4751]: I1203 14:37:07.326922 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b931e90-9037-4b52-92c1-9c1d4d3fbba4" path="/var/lib/kubelet/pods/4b931e90-9037-4b52-92c1-9c1d4d3fbba4/volumes" Dec 03 14:37:07 crc kubenswrapper[4751]: I1203 14:37:07.445350 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:37:07 crc kubenswrapper[4751]: I1203 14:37:07.749629 4751 generic.go:334] "Generic (PLEG): container finished" podID="8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca" containerID="d51d21a867de6344470db63eb2c384e4e73323da2b6cdf183a8d6f807808ea80" exitCode=0 Dec 03 14:37:07 crc kubenswrapper[4751]: I1203 14:37:07.749870 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rjchq" event={"ID":"8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca","Type":"ContainerDied","Data":"d51d21a867de6344470db63eb2c384e4e73323da2b6cdf183a8d6f807808ea80"} Dec 03 14:37:07 crc kubenswrapper[4751]: I1203 14:37:07.769662 4751 generic.go:334] "Generic (PLEG): container finished" podID="0d206c76-a64f-42dc-8bc0-554d21e5ebd0" containerID="5438a700d64967e58768961ff7f29dad0258aa33d36bfba8d852666d1d3b1078" exitCode=0 Dec 03 14:37:07 crc kubenswrapper[4751]: I1203 14:37:07.769755 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wb89v" 
event={"ID":"0d206c76-a64f-42dc-8bc0-554d21e5ebd0","Type":"ContainerDied","Data":"5438a700d64967e58768961ff7f29dad0258aa33d36bfba8d852666d1d3b1078"} Dec 03 14:37:07 crc kubenswrapper[4751]: I1203 14:37:07.800648 4751 generic.go:334] "Generic (PLEG): container finished" podID="5c1be9b7-88c6-4c30-8b86-6b3194a6a225" containerID="8a59515ad8337978d1a0f9f51673dd0b11ee05971d42cc6e483edb9ffd7b6709" exitCode=0 Dec 03 14:37:07 crc kubenswrapper[4751]: I1203 14:37:07.800772 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d629-account-create-update-jjx45" event={"ID":"5c1be9b7-88c6-4c30-8b86-6b3194a6a225","Type":"ContainerDied","Data":"8a59515ad8337978d1a0f9f51673dd0b11ee05971d42cc6e483edb9ffd7b6709"} Dec 03 14:37:07 crc kubenswrapper[4751]: I1203 14:37:07.831928 4751 generic.go:334] "Generic (PLEG): container finished" podID="8b90d17b-c6be-47a0-a72f-669ad0c75e18" containerID="c200e104b10c75481a28bbf12e9c5a1364b8594e64931b14d113348be698099c" exitCode=0 Dec 03 14:37:07 crc kubenswrapper[4751]: I1203 14:37:07.832017 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b6f3-account-create-update-k2rkr" event={"ID":"8b90d17b-c6be-47a0-a72f-669ad0c75e18","Type":"ContainerDied","Data":"c200e104b10c75481a28bbf12e9c5a1364b8594e64931b14d113348be698099c"} Dec 03 14:37:07 crc kubenswrapper[4751]: I1203 14:37:07.833706 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5677749-e344-42cf-9953-2f8455e5bacc","Type":"ContainerStarted","Data":"9975ca1e958cc4bc981e0f5df51a22e3e4cbb50b6083f294d73b7dcda5e275aa"} Dec 03 14:37:07 crc kubenswrapper[4751]: I1203 14:37:07.834698 4751 generic.go:334] "Generic (PLEG): container finished" podID="8ac653c7-0be2-4943-ba54-d1e1fed5bfdc" containerID="11f039596c97f40659e89fd38dc13bbc8e5349c3783d348023ee6a1fe9a639df" exitCode=0 Dec 03 14:37:07 crc kubenswrapper[4751]: I1203 14:37:07.834742 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-1fbd-account-create-update-wgwqm" event={"ID":"8ac653c7-0be2-4943-ba54-d1e1fed5bfdc","Type":"ContainerDied","Data":"11f039596c97f40659e89fd38dc13bbc8e5349c3783d348023ee6a1fe9a639df"} Dec 03 14:37:07 crc kubenswrapper[4751]: I1203 14:37:07.843368 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"afa6e4a5-811b-43db-868b-66a71bff4830","Type":"ContainerStarted","Data":"e2be7af37761dfd4776887830c1d998b1809085722132ec8ae5dc15315ce7f75"} Dec 03 14:37:07 crc kubenswrapper[4751]: I1203 14:37:07.843536 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 03 14:37:07 crc kubenswrapper[4751]: I1203 14:37:07.995256 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.995234271 podStartE2EDuration="4.995234271s" podCreationTimestamp="2025-12-03 14:37:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:37:07.956830502 +0000 UTC m=+1434.945185729" watchObservedRunningTime="2025-12-03 14:37:07.995234271 +0000 UTC m=+1434.983589478" Dec 03 14:37:08 crc kubenswrapper[4751]: I1203 14:37:08.255909 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="2249aa73-cbf1-4a6b-a893-d2242c236c6d" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.185:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 14:37:08 crc kubenswrapper[4751]: I1203 14:37:08.381797 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-rjchq" Dec 03 14:37:08 crc kubenswrapper[4751]: I1203 14:37:08.469542 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mz6sm\" (UniqueName: \"kubernetes.io/projected/8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca-kube-api-access-mz6sm\") pod \"8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca\" (UID: \"8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca\") " Dec 03 14:37:08 crc kubenswrapper[4751]: I1203 14:37:08.469631 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca-operator-scripts\") pod \"8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca\" (UID: \"8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca\") " Dec 03 14:37:08 crc kubenswrapper[4751]: I1203 14:37:08.471103 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca" (UID: "8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:37:08 crc kubenswrapper[4751]: I1203 14:37:08.479878 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca-kube-api-access-mz6sm" (OuterVolumeSpecName: "kube-api-access-mz6sm") pod "8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca" (UID: "8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca"). InnerVolumeSpecName "kube-api-access-mz6sm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:37:08 crc kubenswrapper[4751]: I1203 14:37:08.530460 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 14:37:08 crc kubenswrapper[4751]: I1203 14:37:08.530729 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0d746a7a-b7c6-47d1-b4cf-6e424fb565d7" containerName="glance-log" containerID="cri-o://8ecd12696dd1ff10d3b60df57fc95800b104034b7e5b7d8c19398d87bbce3060" gracePeriod=30 Dec 03 14:37:08 crc kubenswrapper[4751]: I1203 14:37:08.530870 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0d746a7a-b7c6-47d1-b4cf-6e424fb565d7" containerName="glance-httpd" containerID="cri-o://213f3abea670d09075a44aaf4b02a870150762baace3817d75c365bae632ca11" gracePeriod=30 Dec 03 14:37:08 crc kubenswrapper[4751]: I1203 14:37:08.554722 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-kg74g" Dec 03 14:37:08 crc kubenswrapper[4751]: I1203 14:37:08.572587 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mz6sm\" (UniqueName: \"kubernetes.io/projected/8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca-kube-api-access-mz6sm\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:08 crc kubenswrapper[4751]: I1203 14:37:08.572623 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:08 crc kubenswrapper[4751]: I1203 14:37:08.673845 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14804484-7da3-4876-b9f5-a3f996fdca5c-operator-scripts\") pod \"14804484-7da3-4876-b9f5-a3f996fdca5c\" (UID: \"14804484-7da3-4876-b9f5-a3f996fdca5c\") " Dec 03 14:37:08 crc kubenswrapper[4751]: I1203 14:37:08.674034 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhwb6\" (UniqueName: \"kubernetes.io/projected/14804484-7da3-4876-b9f5-a3f996fdca5c-kube-api-access-nhwb6\") pod \"14804484-7da3-4876-b9f5-a3f996fdca5c\" (UID: \"14804484-7da3-4876-b9f5-a3f996fdca5c\") " Dec 03 14:37:08 crc kubenswrapper[4751]: I1203 14:37:08.674394 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14804484-7da3-4876-b9f5-a3f996fdca5c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "14804484-7da3-4876-b9f5-a3f996fdca5c" (UID: "14804484-7da3-4876-b9f5-a3f996fdca5c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:37:08 crc kubenswrapper[4751]: I1203 14:37:08.674624 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14804484-7da3-4876-b9f5-a3f996fdca5c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:08 crc kubenswrapper[4751]: I1203 14:37:08.678514 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14804484-7da3-4876-b9f5-a3f996fdca5c-kube-api-access-nhwb6" (OuterVolumeSpecName: "kube-api-access-nhwb6") pod "14804484-7da3-4876-b9f5-a3f996fdca5c" (UID: "14804484-7da3-4876-b9f5-a3f996fdca5c"). InnerVolumeSpecName "kube-api-access-nhwb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:37:08 crc kubenswrapper[4751]: I1203 14:37:08.776677 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhwb6\" (UniqueName: \"kubernetes.io/projected/14804484-7da3-4876-b9f5-a3f996fdca5c-kube-api-access-nhwb6\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:08 crc kubenswrapper[4751]: I1203 14:37:08.855101 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rjchq" event={"ID":"8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca","Type":"ContainerDied","Data":"195a785443906f168aeeed72603e8914e037b02fa8bdd5993b05bb899b19f78a"} Dec 03 14:37:08 crc kubenswrapper[4751]: I1203 14:37:08.855116 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-rjchq" Dec 03 14:37:08 crc kubenswrapper[4751]: I1203 14:37:08.855137 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="195a785443906f168aeeed72603e8914e037b02fa8bdd5993b05bb899b19f78a" Dec 03 14:37:08 crc kubenswrapper[4751]: I1203 14:37:08.857424 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-kg74g" event={"ID":"14804484-7da3-4876-b9f5-a3f996fdca5c","Type":"ContainerDied","Data":"269c03a6511c4c12c6fa069e8e8a09c8cbcd16954e5f6cadbe48b62f9e75f528"} Dec 03 14:37:08 crc kubenswrapper[4751]: I1203 14:37:08.857454 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="269c03a6511c4c12c6fa069e8e8a09c8cbcd16954e5f6cadbe48b62f9e75f528" Dec 03 14:37:08 crc kubenswrapper[4751]: I1203 14:37:08.857460 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-kg74g" Dec 03 14:37:08 crc kubenswrapper[4751]: I1203 14:37:08.859863 4751 generic.go:334] "Generic (PLEG): container finished" podID="0d746a7a-b7c6-47d1-b4cf-6e424fb565d7" containerID="8ecd12696dd1ff10d3b60df57fc95800b104034b7e5b7d8c19398d87bbce3060" exitCode=143 Dec 03 14:37:08 crc kubenswrapper[4751]: I1203 14:37:08.859906 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7","Type":"ContainerDied","Data":"8ecd12696dd1ff10d3b60df57fc95800b104034b7e5b7d8c19398d87bbce3060"} Dec 03 14:37:08 crc kubenswrapper[4751]: I1203 14:37:08.862060 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5677749-e344-42cf-9953-2f8455e5bacc","Type":"ContainerStarted","Data":"36f089dd2c917db47de913ff3ada61cc4274bfa92ef031abbb92e8fe6ccd8daa"} Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.239782 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-1fbd-account-create-update-wgwqm" Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.391448 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ac653c7-0be2-4943-ba54-d1e1fed5bfdc-operator-scripts\") pod \"8ac653c7-0be2-4943-ba54-d1e1fed5bfdc\" (UID: \"8ac653c7-0be2-4943-ba54-d1e1fed5bfdc\") " Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.391658 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlblf\" (UniqueName: \"kubernetes.io/projected/8ac653c7-0be2-4943-ba54-d1e1fed5bfdc-kube-api-access-qlblf\") pod \"8ac653c7-0be2-4943-ba54-d1e1fed5bfdc\" (UID: \"8ac653c7-0be2-4943-ba54-d1e1fed5bfdc\") " Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.393367 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ac653c7-0be2-4943-ba54-d1e1fed5bfdc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8ac653c7-0be2-4943-ba54-d1e1fed5bfdc" (UID: "8ac653c7-0be2-4943-ba54-d1e1fed5bfdc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.401678 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ac653c7-0be2-4943-ba54-d1e1fed5bfdc-kube-api-access-qlblf" (OuterVolumeSpecName: "kube-api-access-qlblf") pod "8ac653c7-0be2-4943-ba54-d1e1fed5bfdc" (UID: "8ac653c7-0be2-4943-ba54-d1e1fed5bfdc"). InnerVolumeSpecName "kube-api-access-qlblf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.494005 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlblf\" (UniqueName: \"kubernetes.io/projected/8ac653c7-0be2-4943-ba54-d1e1fed5bfdc-kube-api-access-qlblf\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.494372 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ac653c7-0be2-4943-ba54-d1e1fed5bfdc-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.684184 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b6f3-account-create-update-k2rkr" Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.690045 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d629-account-create-update-jjx45" Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.696170 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-wb89v" Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.798757 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b90d17b-c6be-47a0-a72f-669ad0c75e18-operator-scripts\") pod \"8b90d17b-c6be-47a0-a72f-669ad0c75e18\" (UID: \"8b90d17b-c6be-47a0-a72f-669ad0c75e18\") " Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.798871 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smc9q\" (UniqueName: \"kubernetes.io/projected/5c1be9b7-88c6-4c30-8b86-6b3194a6a225-kube-api-access-smc9q\") pod \"5c1be9b7-88c6-4c30-8b86-6b3194a6a225\" (UID: \"5c1be9b7-88c6-4c30-8b86-6b3194a6a225\") " Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.798913 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2g7r\" (UniqueName: \"kubernetes.io/projected/8b90d17b-c6be-47a0-a72f-669ad0c75e18-kube-api-access-g2g7r\") pod \"8b90d17b-c6be-47a0-a72f-669ad0c75e18\" (UID: \"8b90d17b-c6be-47a0-a72f-669ad0c75e18\") " Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.798944 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d206c76-a64f-42dc-8bc0-554d21e5ebd0-operator-scripts\") pod \"0d206c76-a64f-42dc-8bc0-554d21e5ebd0\" (UID: \"0d206c76-a64f-42dc-8bc0-554d21e5ebd0\") " Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.798981 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c1be9b7-88c6-4c30-8b86-6b3194a6a225-operator-scripts\") pod \"5c1be9b7-88c6-4c30-8b86-6b3194a6a225\" (UID: \"5c1be9b7-88c6-4c30-8b86-6b3194a6a225\") " Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.799094 4751 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-d4jvs\" (UniqueName: \"kubernetes.io/projected/0d206c76-a64f-42dc-8bc0-554d21e5ebd0-kube-api-access-d4jvs\") pod \"0d206c76-a64f-42dc-8bc0-554d21e5ebd0\" (UID: \"0d206c76-a64f-42dc-8bc0-554d21e5ebd0\") " Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.800679 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d206c76-a64f-42dc-8bc0-554d21e5ebd0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0d206c76-a64f-42dc-8bc0-554d21e5ebd0" (UID: "0d206c76-a64f-42dc-8bc0-554d21e5ebd0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.800894 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c1be9b7-88c6-4c30-8b86-6b3194a6a225-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5c1be9b7-88c6-4c30-8b86-6b3194a6a225" (UID: "5c1be9b7-88c6-4c30-8b86-6b3194a6a225"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.801262 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b90d17b-c6be-47a0-a72f-669ad0c75e18-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8b90d17b-c6be-47a0-a72f-669ad0c75e18" (UID: "8b90d17b-c6be-47a0-a72f-669ad0c75e18"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.812461 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b90d17b-c6be-47a0-a72f-669ad0c75e18-kube-api-access-g2g7r" (OuterVolumeSpecName: "kube-api-access-g2g7r") pod "8b90d17b-c6be-47a0-a72f-669ad0c75e18" (UID: "8b90d17b-c6be-47a0-a72f-669ad0c75e18"). 
InnerVolumeSpecName "kube-api-access-g2g7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.814516 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c1be9b7-88c6-4c30-8b86-6b3194a6a225-kube-api-access-smc9q" (OuterVolumeSpecName: "kube-api-access-smc9q") pod "5c1be9b7-88c6-4c30-8b86-6b3194a6a225" (UID: "5c1be9b7-88c6-4c30-8b86-6b3194a6a225"). InnerVolumeSpecName "kube-api-access-smc9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.814637 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d206c76-a64f-42dc-8bc0-554d21e5ebd0-kube-api-access-d4jvs" (OuterVolumeSpecName: "kube-api-access-d4jvs") pod "0d206c76-a64f-42dc-8bc0-554d21e5ebd0" (UID: "0d206c76-a64f-42dc-8bc0-554d21e5ebd0"). InnerVolumeSpecName "kube-api-access-d4jvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.880704 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-wb89v" Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.880781 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wb89v" event={"ID":"0d206c76-a64f-42dc-8bc0-554d21e5ebd0","Type":"ContainerDied","Data":"62cebd9e9ccd6be118e0d89f3e59e25a68f10b4aa5e7fe6f20ed03be43fceee1"} Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.880828 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62cebd9e9ccd6be118e0d89f3e59e25a68f10b4aa5e7fe6f20ed03be43fceee1" Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.883037 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d629-account-create-update-jjx45" event={"ID":"5c1be9b7-88c6-4c30-8b86-6b3194a6a225","Type":"ContainerDied","Data":"de3ef40fc2645bc8ba10d1409e8e52c6f886e932de5d58cc6354f3d9fb28ff19"} Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.883073 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de3ef40fc2645bc8ba10d1409e8e52c6f886e932de5d58cc6354f3d9fb28ff19" Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.883075 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d629-account-create-update-jjx45" Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.885180 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b6f3-account-create-update-k2rkr" event={"ID":"8b90d17b-c6be-47a0-a72f-669ad0c75e18","Type":"ContainerDied","Data":"3408bb39b5a7e06dceaad85508269f0f1dbdaf069edcdb1bc17d8d496372505a"} Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.885203 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-b6f3-account-create-update-k2rkr" Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.885221 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3408bb39b5a7e06dceaad85508269f0f1dbdaf069edcdb1bc17d8d496372505a" Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.895344 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5677749-e344-42cf-9953-2f8455e5bacc","Type":"ContainerStarted","Data":"30534927c508640ffb69a006bdfb4947051d7419544a169322d10dcceef1d7f0"} Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.898555 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1fbd-account-create-update-wgwqm" event={"ID":"8ac653c7-0be2-4943-ba54-d1e1fed5bfdc","Type":"ContainerDied","Data":"f62212c5a2b677160af35a122b1cfcd62d134cf16afbd4864727290fe1e85cbf"} Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.898645 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f62212c5a2b677160af35a122b1cfcd62d134cf16afbd4864727290fe1e85cbf" Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.898756 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-1fbd-account-create-update-wgwqm" Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.901823 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b90d17b-c6be-47a0-a72f-669ad0c75e18-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.901872 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smc9q\" (UniqueName: \"kubernetes.io/projected/5c1be9b7-88c6-4c30-8b86-6b3194a6a225-kube-api-access-smc9q\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.901889 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2g7r\" (UniqueName: \"kubernetes.io/projected/8b90d17b-c6be-47a0-a72f-669ad0c75e18-kube-api-access-g2g7r\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.901902 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d206c76-a64f-42dc-8bc0-554d21e5ebd0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.901913 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c1be9b7-88c6-4c30-8b86-6b3194a6a225-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:09 crc kubenswrapper[4751]: I1203 14:37:09.901926 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4jvs\" (UniqueName: \"kubernetes.io/projected/0d206c76-a64f-42dc-8bc0-554d21e5ebd0-kube-api-access-d4jvs\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:10 crc kubenswrapper[4751]: I1203 14:37:10.665224 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-84ff798d87-5c96l" Dec 03 14:37:10 crc kubenswrapper[4751]: I1203 14:37:10.666820 4751 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-84ff798d87-5c96l" Dec 03 14:37:11 crc kubenswrapper[4751]: I1203 14:37:11.936306 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5677749-e344-42cf-9953-2f8455e5bacc","Type":"ContainerStarted","Data":"05d5253453438458652023650260f255514ba6d4730c8a11872caedcf77835bb"} Dec 03 14:37:11 crc kubenswrapper[4751]: I1203 14:37:11.936475 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a5677749-e344-42cf-9953-2f8455e5bacc" containerName="ceilometer-central-agent" containerID="cri-o://9975ca1e958cc4bc981e0f5df51a22e3e4cbb50b6083f294d73b7dcda5e275aa" gracePeriod=30 Dec 03 14:37:11 crc kubenswrapper[4751]: I1203 14:37:11.936548 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a5677749-e344-42cf-9953-2f8455e5bacc" containerName="sg-core" containerID="cri-o://30534927c508640ffb69a006bdfb4947051d7419544a169322d10dcceef1d7f0" gracePeriod=30 Dec 03 14:37:11 crc kubenswrapper[4751]: I1203 14:37:11.936584 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a5677749-e344-42cf-9953-2f8455e5bacc" containerName="ceilometer-notification-agent" containerID="cri-o://36f089dd2c917db47de913ff3ada61cc4274bfa92ef031abbb92e8fe6ccd8daa" gracePeriod=30 Dec 03 14:37:11 crc kubenswrapper[4751]: I1203 14:37:11.936607 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a5677749-e344-42cf-9953-2f8455e5bacc" containerName="proxy-httpd" containerID="cri-o://05d5253453438458652023650260f255514ba6d4730c8a11872caedcf77835bb" gracePeriod=30 Dec 03 14:37:11 crc kubenswrapper[4751]: I1203 14:37:11.936927 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 14:37:11 crc 
kubenswrapper[4751]: I1203 14:37:11.967306 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.727549002 podStartE2EDuration="7.967284032s" podCreationTimestamp="2025-12-03 14:37:04 +0000 UTC" firstStartedPulling="2025-12-03 14:37:06.45726643 +0000 UTC m=+1433.445621647" lastFinishedPulling="2025-12-03 14:37:10.69700146 +0000 UTC m=+1437.685356677" observedRunningTime="2025-12-03 14:37:11.956217129 +0000 UTC m=+1438.944572366" watchObservedRunningTime="2025-12-03 14:37:11.967284032 +0000 UTC m=+1438.955639249" Dec 03 14:37:12 crc kubenswrapper[4751]: I1203 14:37:12.810113 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 14:37:12 crc kubenswrapper[4751]: I1203 14:37:12.854896 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-logs\") pod \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\" (UID: \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\") " Dec 03 14:37:12 crc kubenswrapper[4751]: I1203 14:37:12.854969 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpmq6\" (UniqueName: \"kubernetes.io/projected/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-kube-api-access-wpmq6\") pod \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\" (UID: \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\") " Dec 03 14:37:12 crc kubenswrapper[4751]: I1203 14:37:12.855174 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\") pod \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\" (UID: \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\") " Dec 03 14:37:12 crc kubenswrapper[4751]: I1203 14:37:12.855230 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-scripts\") pod \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\" (UID: \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\") " Dec 03 14:37:12 crc kubenswrapper[4751]: I1203 14:37:12.855280 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-public-tls-certs\") pod \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\" (UID: \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\") " Dec 03 14:37:12 crc kubenswrapper[4751]: I1203 14:37:12.855345 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-config-data\") pod \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\" (UID: \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\") " Dec 03 14:37:12 crc kubenswrapper[4751]: I1203 14:37:12.855374 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-combined-ca-bundle\") pod \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\" (UID: \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\") " Dec 03 14:37:12 crc kubenswrapper[4751]: I1203 14:37:12.855465 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-logs" (OuterVolumeSpecName: "logs") pod "0d746a7a-b7c6-47d1-b4cf-6e424fb565d7" (UID: "0d746a7a-b7c6-47d1-b4cf-6e424fb565d7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:37:12 crc kubenswrapper[4751]: I1203 14:37:12.855540 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-httpd-run\") pod \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\" (UID: \"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7\") " Dec 03 14:37:12 crc kubenswrapper[4751]: I1203 14:37:12.856110 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-logs\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:12 crc kubenswrapper[4751]: I1203 14:37:12.856400 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0d746a7a-b7c6-47d1-b4cf-6e424fb565d7" (UID: "0d746a7a-b7c6-47d1-b4cf-6e424fb565d7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:37:12 crc kubenswrapper[4751]: I1203 14:37:12.863760 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-kube-api-access-wpmq6" (OuterVolumeSpecName: "kube-api-access-wpmq6") pod "0d746a7a-b7c6-47d1-b4cf-6e424fb565d7" (UID: "0d746a7a-b7c6-47d1-b4cf-6e424fb565d7"). InnerVolumeSpecName "kube-api-access-wpmq6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:37:12 crc kubenswrapper[4751]: I1203 14:37:12.899774 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6" (OuterVolumeSpecName: "glance") pod "0d746a7a-b7c6-47d1-b4cf-6e424fb565d7" (UID: "0d746a7a-b7c6-47d1-b4cf-6e424fb565d7"). InnerVolumeSpecName "pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 14:37:12 crc kubenswrapper[4751]: I1203 14:37:12.912749 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-scripts" (OuterVolumeSpecName: "scripts") pod "0d746a7a-b7c6-47d1-b4cf-6e424fb565d7" (UID: "0d746a7a-b7c6-47d1-b4cf-6e424fb565d7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:37:12 crc kubenswrapper[4751]: I1203 14:37:12.919658 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d746a7a-b7c6-47d1-b4cf-6e424fb565d7" (UID: "0d746a7a-b7c6-47d1-b4cf-6e424fb565d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:37:12 crc kubenswrapper[4751]: I1203 14:37:12.947439 4751 generic.go:334] "Generic (PLEG): container finished" podID="0d746a7a-b7c6-47d1-b4cf-6e424fb565d7" containerID="213f3abea670d09075a44aaf4b02a870150762baace3817d75c365bae632ca11" exitCode=0 Dec 03 14:37:12 crc kubenswrapper[4751]: I1203 14:37:12.947503 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7","Type":"ContainerDied","Data":"213f3abea670d09075a44aaf4b02a870150762baace3817d75c365bae632ca11"} Dec 03 14:37:12 crc kubenswrapper[4751]: I1203 14:37:12.947529 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0d746a7a-b7c6-47d1-b4cf-6e424fb565d7","Type":"ContainerDied","Data":"cf3d3bba20dc3e8746e50cfb9c67342a389c06688b9bd339e35820fdfd22d702"} Dec 03 14:37:12 crc kubenswrapper[4751]: I1203 14:37:12.947544 4751 scope.go:117] "RemoveContainer" containerID="213f3abea670d09075a44aaf4b02a870150762baace3817d75c365bae632ca11" Dec 03 14:37:12 crc 
kubenswrapper[4751]: I1203 14:37:12.947581 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 14:37:12 crc kubenswrapper[4751]: I1203 14:37:12.961097 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpmq6\" (UniqueName: \"kubernetes.io/projected/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-kube-api-access-wpmq6\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:12 crc kubenswrapper[4751]: I1203 14:37:12.961156 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\") on node \"crc\" " Dec 03 14:37:12 crc kubenswrapper[4751]: I1203 14:37:12.961172 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:12 crc kubenswrapper[4751]: I1203 14:37:12.961184 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:12 crc kubenswrapper[4751]: I1203 14:37:12.961195 4751 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.021733 4751 generic.go:334] "Generic (PLEG): container finished" podID="a5677749-e344-42cf-9953-2f8455e5bacc" containerID="05d5253453438458652023650260f255514ba6d4730c8a11872caedcf77835bb" exitCode=0 Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.021772 4751 generic.go:334] "Generic (PLEG): container finished" podID="a5677749-e344-42cf-9953-2f8455e5bacc" 
containerID="30534927c508640ffb69a006bdfb4947051d7419544a169322d10dcceef1d7f0" exitCode=2 Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.021780 4751 generic.go:334] "Generic (PLEG): container finished" podID="a5677749-e344-42cf-9953-2f8455e5bacc" containerID="36f089dd2c917db47de913ff3ada61cc4274bfa92ef031abbb92e8fe6ccd8daa" exitCode=0 Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.021799 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5677749-e344-42cf-9953-2f8455e5bacc","Type":"ContainerDied","Data":"05d5253453438458652023650260f255514ba6d4730c8a11872caedcf77835bb"} Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.021827 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5677749-e344-42cf-9953-2f8455e5bacc","Type":"ContainerDied","Data":"30534927c508640ffb69a006bdfb4947051d7419544a169322d10dcceef1d7f0"} Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.021859 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5677749-e344-42cf-9953-2f8455e5bacc","Type":"ContainerDied","Data":"36f089dd2c917db47de913ff3ada61cc4274bfa92ef031abbb92e8fe6ccd8daa"} Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.047576 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0d746a7a-b7c6-47d1-b4cf-6e424fb565d7" (UID: "0d746a7a-b7c6-47d1-b4cf-6e424fb565d7"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.090107 4751 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.138592 4751 scope.go:117] "RemoveContainer" containerID="8ecd12696dd1ff10d3b60df57fc95800b104034b7e5b7d8c19398d87bbce3060" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.147752 4751 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.147944 4751 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6") on node "crc" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.190454 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-config-data" (OuterVolumeSpecName: "config-data") pod "0d746a7a-b7c6-47d1-b4cf-6e424fb565d7" (UID: "0d746a7a-b7c6-47d1-b4cf-6e424fb565d7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.191991 4751 reconciler_common.go:293] "Volume detached for volume \"pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.192038 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.193538 4751 scope.go:117] "RemoveContainer" containerID="213f3abea670d09075a44aaf4b02a870150762baace3817d75c365bae632ca11" Dec 03 14:37:13 crc kubenswrapper[4751]: E1203 14:37:13.197495 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"213f3abea670d09075a44aaf4b02a870150762baace3817d75c365bae632ca11\": container with ID starting with 213f3abea670d09075a44aaf4b02a870150762baace3817d75c365bae632ca11 not found: ID does not exist" containerID="213f3abea670d09075a44aaf4b02a870150762baace3817d75c365bae632ca11" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.197544 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"213f3abea670d09075a44aaf4b02a870150762baace3817d75c365bae632ca11"} err="failed to get container status \"213f3abea670d09075a44aaf4b02a870150762baace3817d75c365bae632ca11\": rpc error: code = NotFound desc = could not find container \"213f3abea670d09075a44aaf4b02a870150762baace3817d75c365bae632ca11\": container with ID starting with 213f3abea670d09075a44aaf4b02a870150762baace3817d75c365bae632ca11 not found: ID does not exist" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.197573 4751 scope.go:117] "RemoveContainer" 
containerID="8ecd12696dd1ff10d3b60df57fc95800b104034b7e5b7d8c19398d87bbce3060" Dec 03 14:37:13 crc kubenswrapper[4751]: E1203 14:37:13.205498 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ecd12696dd1ff10d3b60df57fc95800b104034b7e5b7d8c19398d87bbce3060\": container with ID starting with 8ecd12696dd1ff10d3b60df57fc95800b104034b7e5b7d8c19398d87bbce3060 not found: ID does not exist" containerID="8ecd12696dd1ff10d3b60df57fc95800b104034b7e5b7d8c19398d87bbce3060" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.205547 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ecd12696dd1ff10d3b60df57fc95800b104034b7e5b7d8c19398d87bbce3060"} err="failed to get container status \"8ecd12696dd1ff10d3b60df57fc95800b104034b7e5b7d8c19398d87bbce3060\": rpc error: code = NotFound desc = could not find container \"8ecd12696dd1ff10d3b60df57fc95800b104034b7e5b7d8c19398d87bbce3060\": container with ID starting with 8ecd12696dd1ff10d3b60df57fc95800b104034b7e5b7d8c19398d87bbce3060 not found: ID does not exist" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.287537 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.298135 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.338068 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d746a7a-b7c6-47d1-b4cf-6e424fb565d7" path="/var/lib/kubelet/pods/0d746a7a-b7c6-47d1-b4cf-6e424fb565d7/volumes" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.338864 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 14:37:13 crc kubenswrapper[4751]: E1203 14:37:13.339249 4751 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="14804484-7da3-4876-b9f5-a3f996fdca5c" containerName="mariadb-database-create" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.339269 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="14804484-7da3-4876-b9f5-a3f996fdca5c" containerName="mariadb-database-create" Dec 03 14:37:13 crc kubenswrapper[4751]: E1203 14:37:13.339298 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ac653c7-0be2-4943-ba54-d1e1fed5bfdc" containerName="mariadb-account-create-update" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.339307 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ac653c7-0be2-4943-ba54-d1e1fed5bfdc" containerName="mariadb-account-create-update" Dec 03 14:37:13 crc kubenswrapper[4751]: E1203 14:37:13.339341 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c1be9b7-88c6-4c30-8b86-6b3194a6a225" containerName="mariadb-account-create-update" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.339353 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c1be9b7-88c6-4c30-8b86-6b3194a6a225" containerName="mariadb-account-create-update" Dec 03 14:37:13 crc kubenswrapper[4751]: E1203 14:37:13.339379 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d206c76-a64f-42dc-8bc0-554d21e5ebd0" containerName="mariadb-database-create" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.339387 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d206c76-a64f-42dc-8bc0-554d21e5ebd0" containerName="mariadb-database-create" Dec 03 14:37:13 crc kubenswrapper[4751]: E1203 14:37:13.339397 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca" containerName="mariadb-database-create" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.339406 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca" containerName="mariadb-database-create" Dec 03 14:37:13 crc 
kubenswrapper[4751]: E1203 14:37:13.339427 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d746a7a-b7c6-47d1-b4cf-6e424fb565d7" containerName="glance-httpd" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.339435 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d746a7a-b7c6-47d1-b4cf-6e424fb565d7" containerName="glance-httpd" Dec 03 14:37:13 crc kubenswrapper[4751]: E1203 14:37:13.339451 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d746a7a-b7c6-47d1-b4cf-6e424fb565d7" containerName="glance-log" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.339458 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d746a7a-b7c6-47d1-b4cf-6e424fb565d7" containerName="glance-log" Dec 03 14:37:13 crc kubenswrapper[4751]: E1203 14:37:13.339468 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b90d17b-c6be-47a0-a72f-669ad0c75e18" containerName="mariadb-account-create-update" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.339476 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b90d17b-c6be-47a0-a72f-669ad0c75e18" containerName="mariadb-account-create-update" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.339725 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c1be9b7-88c6-4c30-8b86-6b3194a6a225" containerName="mariadb-account-create-update" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.339744 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d206c76-a64f-42dc-8bc0-554d21e5ebd0" containerName="mariadb-database-create" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.339754 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ac653c7-0be2-4943-ba54-d1e1fed5bfdc" containerName="mariadb-account-create-update" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.339768 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b90d17b-c6be-47a0-a72f-669ad0c75e18" 
containerName="mariadb-account-create-update" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.339796 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca" containerName="mariadb-database-create" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.339812 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d746a7a-b7c6-47d1-b4cf-6e424fb565d7" containerName="glance-log" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.339822 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="14804484-7da3-4876-b9f5-a3f996fdca5c" containerName="mariadb-database-create" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.339831 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d746a7a-b7c6-47d1-b4cf-6e424fb565d7" containerName="glance-httpd" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.345464 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.345599 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.348603 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.348657 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.498035 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac247305-666d-4241-b756-88499fd359ad-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ac247305-666d-4241-b756-88499fd359ad\") " pod="openstack/glance-default-external-api-0" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.498214 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddfct\" (UniqueName: \"kubernetes.io/projected/ac247305-666d-4241-b756-88499fd359ad-kube-api-access-ddfct\") pod \"glance-default-external-api-0\" (UID: \"ac247305-666d-4241-b756-88499fd359ad\") " pod="openstack/glance-default-external-api-0" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.498256 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\") pod \"glance-default-external-api-0\" (UID: \"ac247305-666d-4241-b756-88499fd359ad\") " pod="openstack/glance-default-external-api-0" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.498340 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac247305-666d-4241-b756-88499fd359ad-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"ac247305-666d-4241-b756-88499fd359ad\") " pod="openstack/glance-default-external-api-0" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.498377 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac247305-666d-4241-b756-88499fd359ad-config-data\") pod \"glance-default-external-api-0\" (UID: \"ac247305-666d-4241-b756-88499fd359ad\") " pod="openstack/glance-default-external-api-0" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.498692 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac247305-666d-4241-b756-88499fd359ad-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ac247305-666d-4241-b756-88499fd359ad\") " pod="openstack/glance-default-external-api-0" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.498753 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac247305-666d-4241-b756-88499fd359ad-scripts\") pod \"glance-default-external-api-0\" (UID: \"ac247305-666d-4241-b756-88499fd359ad\") " pod="openstack/glance-default-external-api-0" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.498866 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac247305-666d-4241-b756-88499fd359ad-logs\") pod \"glance-default-external-api-0\" (UID: \"ac247305-666d-4241-b756-88499fd359ad\") " pod="openstack/glance-default-external-api-0" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.600427 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddfct\" (UniqueName: \"kubernetes.io/projected/ac247305-666d-4241-b756-88499fd359ad-kube-api-access-ddfct\") 
pod \"glance-default-external-api-0\" (UID: \"ac247305-666d-4241-b756-88499fd359ad\") " pod="openstack/glance-default-external-api-0" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.600481 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\") pod \"glance-default-external-api-0\" (UID: \"ac247305-666d-4241-b756-88499fd359ad\") " pod="openstack/glance-default-external-api-0" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.600543 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac247305-666d-4241-b756-88499fd359ad-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ac247305-666d-4241-b756-88499fd359ad\") " pod="openstack/glance-default-external-api-0" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.600567 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac247305-666d-4241-b756-88499fd359ad-config-data\") pod \"glance-default-external-api-0\" (UID: \"ac247305-666d-4241-b756-88499fd359ad\") " pod="openstack/glance-default-external-api-0" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.600630 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac247305-666d-4241-b756-88499fd359ad-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ac247305-666d-4241-b756-88499fd359ad\") " pod="openstack/glance-default-external-api-0" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.600667 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac247305-666d-4241-b756-88499fd359ad-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"ac247305-666d-4241-b756-88499fd359ad\") " pod="openstack/glance-default-external-api-0" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.600727 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac247305-666d-4241-b756-88499fd359ad-logs\") pod \"glance-default-external-api-0\" (UID: \"ac247305-666d-4241-b756-88499fd359ad\") " pod="openstack/glance-default-external-api-0" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.600768 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac247305-666d-4241-b756-88499fd359ad-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ac247305-666d-4241-b756-88499fd359ad\") " pod="openstack/glance-default-external-api-0" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.601228 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac247305-666d-4241-b756-88499fd359ad-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ac247305-666d-4241-b756-88499fd359ad\") " pod="openstack/glance-default-external-api-0" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.601483 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac247305-666d-4241-b756-88499fd359ad-logs\") pod \"glance-default-external-api-0\" (UID: \"ac247305-666d-4241-b756-88499fd359ad\") " pod="openstack/glance-default-external-api-0" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.605221 4751 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.605260 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\") pod \"glance-default-external-api-0\" (UID: \"ac247305-666d-4241-b756-88499fd359ad\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/683a5c301520fbc15dbc2ee54d0de9b6296d1615a3bef9c1741aa8a387a031dd/globalmount\"" pod="openstack/glance-default-external-api-0" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.605291 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac247305-666d-4241-b756-88499fd359ad-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ac247305-666d-4241-b756-88499fd359ad\") " pod="openstack/glance-default-external-api-0" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.606414 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac247305-666d-4241-b756-88499fd359ad-config-data\") pod \"glance-default-external-api-0\" (UID: \"ac247305-666d-4241-b756-88499fd359ad\") " pod="openstack/glance-default-external-api-0" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.606620 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac247305-666d-4241-b756-88499fd359ad-scripts\") pod \"glance-default-external-api-0\" (UID: \"ac247305-666d-4241-b756-88499fd359ad\") " pod="openstack/glance-default-external-api-0" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.614810 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac247305-666d-4241-b756-88499fd359ad-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"ac247305-666d-4241-b756-88499fd359ad\") " pod="openstack/glance-default-external-api-0" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.621941 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddfct\" (UniqueName: \"kubernetes.io/projected/ac247305-666d-4241-b756-88499fd359ad-kube-api-access-ddfct\") pod \"glance-default-external-api-0\" (UID: \"ac247305-666d-4241-b756-88499fd359ad\") " pod="openstack/glance-default-external-api-0" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.647191 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c7f5eb-6a57-45d0-b73c-50f46df8b1e6\") pod \"glance-default-external-api-0\" (UID: \"ac247305-666d-4241-b756-88499fd359ad\") " pod="openstack/glance-default-external-api-0" Dec 03 14:37:13 crc kubenswrapper[4751]: I1203 14:37:13.669648 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 14:37:14 crc kubenswrapper[4751]: I1203 14:37:14.253897 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 14:37:14 crc kubenswrapper[4751]: I1203 14:37:14.709467 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-h7bpp"] Dec 03 14:37:14 crc kubenswrapper[4751]: I1203 14:37:14.711126 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-h7bpp" Dec 03 14:37:14 crc kubenswrapper[4751]: I1203 14:37:14.714291 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 03 14:37:14 crc kubenswrapper[4751]: I1203 14:37:14.714570 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 03 14:37:14 crc kubenswrapper[4751]: I1203 14:37:14.714692 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-28c95" Dec 03 14:37:14 crc kubenswrapper[4751]: I1203 14:37:14.729713 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-h7bpp"] Dec 03 14:37:14 crc kubenswrapper[4751]: I1203 14:37:14.825807 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7df882c6-828a-4aac-8aa9-811102008952-config-data\") pod \"nova-cell0-conductor-db-sync-h7bpp\" (UID: \"7df882c6-828a-4aac-8aa9-811102008952\") " pod="openstack/nova-cell0-conductor-db-sync-h7bpp" Dec 03 14:37:14 crc kubenswrapper[4751]: I1203 14:37:14.825895 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbsmh\" (UniqueName: \"kubernetes.io/projected/7df882c6-828a-4aac-8aa9-811102008952-kube-api-access-lbsmh\") pod \"nova-cell0-conductor-db-sync-h7bpp\" (UID: \"7df882c6-828a-4aac-8aa9-811102008952\") " pod="openstack/nova-cell0-conductor-db-sync-h7bpp" Dec 03 14:37:14 crc kubenswrapper[4751]: I1203 14:37:14.826018 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7df882c6-828a-4aac-8aa9-811102008952-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-h7bpp\" (UID: \"7df882c6-828a-4aac-8aa9-811102008952\") " 
pod="openstack/nova-cell0-conductor-db-sync-h7bpp" Dec 03 14:37:14 crc kubenswrapper[4751]: I1203 14:37:14.826427 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7df882c6-828a-4aac-8aa9-811102008952-scripts\") pod \"nova-cell0-conductor-db-sync-h7bpp\" (UID: \"7df882c6-828a-4aac-8aa9-811102008952\") " pod="openstack/nova-cell0-conductor-db-sync-h7bpp" Dec 03 14:37:14 crc kubenswrapper[4751]: I1203 14:37:14.928434 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7df882c6-828a-4aac-8aa9-811102008952-config-data\") pod \"nova-cell0-conductor-db-sync-h7bpp\" (UID: \"7df882c6-828a-4aac-8aa9-811102008952\") " pod="openstack/nova-cell0-conductor-db-sync-h7bpp" Dec 03 14:37:14 crc kubenswrapper[4751]: I1203 14:37:14.928828 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbsmh\" (UniqueName: \"kubernetes.io/projected/7df882c6-828a-4aac-8aa9-811102008952-kube-api-access-lbsmh\") pod \"nova-cell0-conductor-db-sync-h7bpp\" (UID: \"7df882c6-828a-4aac-8aa9-811102008952\") " pod="openstack/nova-cell0-conductor-db-sync-h7bpp" Dec 03 14:37:14 crc kubenswrapper[4751]: I1203 14:37:14.928878 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7df882c6-828a-4aac-8aa9-811102008952-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-h7bpp\" (UID: \"7df882c6-828a-4aac-8aa9-811102008952\") " pod="openstack/nova-cell0-conductor-db-sync-h7bpp" Dec 03 14:37:14 crc kubenswrapper[4751]: I1203 14:37:14.929057 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7df882c6-828a-4aac-8aa9-811102008952-scripts\") pod \"nova-cell0-conductor-db-sync-h7bpp\" (UID: \"7df882c6-828a-4aac-8aa9-811102008952\") " 
pod="openstack/nova-cell0-conductor-db-sync-h7bpp" Dec 03 14:37:14 crc kubenswrapper[4751]: I1203 14:37:14.941982 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7df882c6-828a-4aac-8aa9-811102008952-config-data\") pod \"nova-cell0-conductor-db-sync-h7bpp\" (UID: \"7df882c6-828a-4aac-8aa9-811102008952\") " pod="openstack/nova-cell0-conductor-db-sync-h7bpp" Dec 03 14:37:14 crc kubenswrapper[4751]: I1203 14:37:14.943687 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7df882c6-828a-4aac-8aa9-811102008952-scripts\") pod \"nova-cell0-conductor-db-sync-h7bpp\" (UID: \"7df882c6-828a-4aac-8aa9-811102008952\") " pod="openstack/nova-cell0-conductor-db-sync-h7bpp" Dec 03 14:37:14 crc kubenswrapper[4751]: I1203 14:37:14.946232 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7df882c6-828a-4aac-8aa9-811102008952-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-h7bpp\" (UID: \"7df882c6-828a-4aac-8aa9-811102008952\") " pod="openstack/nova-cell0-conductor-db-sync-h7bpp" Dec 03 14:37:14 crc kubenswrapper[4751]: I1203 14:37:14.958066 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbsmh\" (UniqueName: \"kubernetes.io/projected/7df882c6-828a-4aac-8aa9-811102008952-kube-api-access-lbsmh\") pod \"nova-cell0-conductor-db-sync-h7bpp\" (UID: \"7df882c6-828a-4aac-8aa9-811102008952\") " pod="openstack/nova-cell0-conductor-db-sync-h7bpp" Dec 03 14:37:15 crc kubenswrapper[4751]: I1203 14:37:15.059270 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-h7bpp" Dec 03 14:37:15 crc kubenswrapper[4751]: I1203 14:37:15.068945 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac247305-666d-4241-b756-88499fd359ad","Type":"ContainerStarted","Data":"ba6b31c5567566a141805efc005102e204720374abc8b7fc1beca7f3c9625193"} Dec 03 14:37:15 crc kubenswrapper[4751]: I1203 14:37:15.252938 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 14:37:15 crc kubenswrapper[4751]: I1203 14:37:15.253593 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="825fe2b2-0938-4ca3-bf04-c7a59aa3e99a" containerName="glance-log" containerID="cri-o://e1ca1f4fb94804711e025ec4962bd4c9fbddd9e59a5f7453aed42cc7775cf35d" gracePeriod=30 Dec 03 14:37:15 crc kubenswrapper[4751]: I1203 14:37:15.253980 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="825fe2b2-0938-4ca3-bf04-c7a59aa3e99a" containerName="glance-httpd" containerID="cri-o://7b8f252ac2252a13c2b95e00ae25166e9c02cad0ddfa1fb84db0cb35d58a0579" gracePeriod=30 Dec 03 14:37:15 crc kubenswrapper[4751]: I1203 14:37:15.843651 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-h7bpp"] Dec 03 14:37:16 crc kubenswrapper[4751]: I1203 14:37:16.098133 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-h7bpp" event={"ID":"7df882c6-828a-4aac-8aa9-811102008952","Type":"ContainerStarted","Data":"66118c92fd60e0801247e34a9d3db07c662ac238c010d7c69d32041e588bd180"} Dec 03 14:37:16 crc kubenswrapper[4751]: I1203 14:37:16.103354 4751 generic.go:334] "Generic (PLEG): container finished" podID="825fe2b2-0938-4ca3-bf04-c7a59aa3e99a" 
containerID="e1ca1f4fb94804711e025ec4962bd4c9fbddd9e59a5f7453aed42cc7775cf35d" exitCode=143 Dec 03 14:37:16 crc kubenswrapper[4751]: I1203 14:37:16.103397 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a","Type":"ContainerDied","Data":"e1ca1f4fb94804711e025ec4962bd4c9fbddd9e59a5f7453aed42cc7775cf35d"} Dec 03 14:37:16 crc kubenswrapper[4751]: I1203 14:37:16.106414 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac247305-666d-4241-b756-88499fd359ad","Type":"ContainerStarted","Data":"ac16bfbb50a06217593d1a78ce4c3c79f4ab6f213ff7caae6f9dcbe5c29e3636"} Dec 03 14:37:17 crc kubenswrapper[4751]: I1203 14:37:17.126445 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac247305-666d-4241-b756-88499fd359ad","Type":"ContainerStarted","Data":"14795c75e818808326bef9eb4292faf3546916b8fb76fa8e68c67c08b4abd496"} Dec 03 14:37:17 crc kubenswrapper[4751]: I1203 14:37:17.159171 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.159146097 podStartE2EDuration="4.159146097s" podCreationTimestamp="2025-12-03 14:37:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:37:17.152378088 +0000 UTC m=+1444.140733315" watchObservedRunningTime="2025-12-03 14:37:17.159146097 +0000 UTC m=+1444.147501314" Dec 03 14:37:17 crc kubenswrapper[4751]: E1203 14:37:17.639035 4751 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5677749_e344_42cf_9953_2f8455e5bacc.slice/crio-conmon-9975ca1e958cc4bc981e0f5df51a22e3e4cbb50b6083f294d73b7dcda5e275aa.scope\": RecentStats: unable 
to find data in memory cache]" Dec 03 14:37:17 crc kubenswrapper[4751]: I1203 14:37:17.929593 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 03 14:37:18 crc kubenswrapper[4751]: I1203 14:37:18.145483 4751 generic.go:334] "Generic (PLEG): container finished" podID="a5677749-e344-42cf-9953-2f8455e5bacc" containerID="9975ca1e958cc4bc981e0f5df51a22e3e4cbb50b6083f294d73b7dcda5e275aa" exitCode=0 Dec 03 14:37:18 crc kubenswrapper[4751]: I1203 14:37:18.145544 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5677749-e344-42cf-9953-2f8455e5bacc","Type":"ContainerDied","Data":"9975ca1e958cc4bc981e0f5df51a22e3e4cbb50b6083f294d73b7dcda5e275aa"} Dec 03 14:37:19 crc kubenswrapper[4751]: I1203 14:37:19.161908 4751 generic.go:334] "Generic (PLEG): container finished" podID="825fe2b2-0938-4ca3-bf04-c7a59aa3e99a" containerID="7b8f252ac2252a13c2b95e00ae25166e9c02cad0ddfa1fb84db0cb35d58a0579" exitCode=0 Dec 03 14:37:19 crc kubenswrapper[4751]: I1203 14:37:19.162045 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a","Type":"ContainerDied","Data":"7b8f252ac2252a13c2b95e00ae25166e9c02cad0ddfa1fb84db0cb35d58a0579"} Dec 03 14:37:19 crc kubenswrapper[4751]: I1203 14:37:19.166716 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5677749-e344-42cf-9953-2f8455e5bacc","Type":"ContainerDied","Data":"19cc6ede70d1d05785f33e4e336e351e9c8b1ddfed5786b4d0afbd53b1d241a7"} Dec 03 14:37:19 crc kubenswrapper[4751]: I1203 14:37:19.166756 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19cc6ede70d1d05785f33e4e336e351e9c8b1ddfed5786b4d0afbd53b1d241a7" Dec 03 14:37:19 crc kubenswrapper[4751]: I1203 14:37:19.238799 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:37:19 crc kubenswrapper[4751]: I1203 14:37:19.368837 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5677749-e344-42cf-9953-2f8455e5bacc-log-httpd\") pod \"a5677749-e344-42cf-9953-2f8455e5bacc\" (UID: \"a5677749-e344-42cf-9953-2f8455e5bacc\") " Dec 03 14:37:19 crc kubenswrapper[4751]: I1203 14:37:19.368915 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66n8k\" (UniqueName: \"kubernetes.io/projected/a5677749-e344-42cf-9953-2f8455e5bacc-kube-api-access-66n8k\") pod \"a5677749-e344-42cf-9953-2f8455e5bacc\" (UID: \"a5677749-e344-42cf-9953-2f8455e5bacc\") " Dec 03 14:37:19 crc kubenswrapper[4751]: I1203 14:37:19.369093 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5677749-e344-42cf-9953-2f8455e5bacc-run-httpd\") pod \"a5677749-e344-42cf-9953-2f8455e5bacc\" (UID: \"a5677749-e344-42cf-9953-2f8455e5bacc\") " Dec 03 14:37:19 crc kubenswrapper[4751]: I1203 14:37:19.369154 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5677749-e344-42cf-9953-2f8455e5bacc-combined-ca-bundle\") pod \"a5677749-e344-42cf-9953-2f8455e5bacc\" (UID: \"a5677749-e344-42cf-9953-2f8455e5bacc\") " Dec 03 14:37:19 crc kubenswrapper[4751]: I1203 14:37:19.369201 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5677749-e344-42cf-9953-2f8455e5bacc-sg-core-conf-yaml\") pod \"a5677749-e344-42cf-9953-2f8455e5bacc\" (UID: \"a5677749-e344-42cf-9953-2f8455e5bacc\") " Dec 03 14:37:19 crc kubenswrapper[4751]: I1203 14:37:19.369227 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a5677749-e344-42cf-9953-2f8455e5bacc-scripts\") pod \"a5677749-e344-42cf-9953-2f8455e5bacc\" (UID: \"a5677749-e344-42cf-9953-2f8455e5bacc\") " Dec 03 14:37:19 crc kubenswrapper[4751]: I1203 14:37:19.369287 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5677749-e344-42cf-9953-2f8455e5bacc-config-data\") pod \"a5677749-e344-42cf-9953-2f8455e5bacc\" (UID: \"a5677749-e344-42cf-9953-2f8455e5bacc\") " Dec 03 14:37:19 crc kubenswrapper[4751]: I1203 14:37:19.370668 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5677749-e344-42cf-9953-2f8455e5bacc-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a5677749-e344-42cf-9953-2f8455e5bacc" (UID: "a5677749-e344-42cf-9953-2f8455e5bacc"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:37:19 crc kubenswrapper[4751]: I1203 14:37:19.370977 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5677749-e344-42cf-9953-2f8455e5bacc-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a5677749-e344-42cf-9953-2f8455e5bacc" (UID: "a5677749-e344-42cf-9953-2f8455e5bacc"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:37:19 crc kubenswrapper[4751]: I1203 14:37:19.384557 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5677749-e344-42cf-9953-2f8455e5bacc-kube-api-access-66n8k" (OuterVolumeSpecName: "kube-api-access-66n8k") pod "a5677749-e344-42cf-9953-2f8455e5bacc" (UID: "a5677749-e344-42cf-9953-2f8455e5bacc"). InnerVolumeSpecName "kube-api-access-66n8k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:37:19 crc kubenswrapper[4751]: I1203 14:37:19.388170 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5677749-e344-42cf-9953-2f8455e5bacc-scripts" (OuterVolumeSpecName: "scripts") pod "a5677749-e344-42cf-9953-2f8455e5bacc" (UID: "a5677749-e344-42cf-9953-2f8455e5bacc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:37:19 crc kubenswrapper[4751]: I1203 14:37:19.426611 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5677749-e344-42cf-9953-2f8455e5bacc-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a5677749-e344-42cf-9953-2f8455e5bacc" (UID: "a5677749-e344-42cf-9953-2f8455e5bacc"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:37:19 crc kubenswrapper[4751]: I1203 14:37:19.471589 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66n8k\" (UniqueName: \"kubernetes.io/projected/a5677749-e344-42cf-9953-2f8455e5bacc-kube-api-access-66n8k\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:19 crc kubenswrapper[4751]: I1203 14:37:19.471628 4751 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5677749-e344-42cf-9953-2f8455e5bacc-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:19 crc kubenswrapper[4751]: I1203 14:37:19.471641 4751 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5677749-e344-42cf-9953-2f8455e5bacc-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:19 crc kubenswrapper[4751]: I1203 14:37:19.471652 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5677749-e344-42cf-9953-2f8455e5bacc-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:19 crc 
kubenswrapper[4751]: I1203 14:37:19.471663 4751 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5677749-e344-42cf-9953-2f8455e5bacc-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:19 crc kubenswrapper[4751]: I1203 14:37:19.537462 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5677749-e344-42cf-9953-2f8455e5bacc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5677749-e344-42cf-9953-2f8455e5bacc" (UID: "a5677749-e344-42cf-9953-2f8455e5bacc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:37:19 crc kubenswrapper[4751]: I1203 14:37:19.561363 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5677749-e344-42cf-9953-2f8455e5bacc-config-data" (OuterVolumeSpecName: "config-data") pod "a5677749-e344-42cf-9953-2f8455e5bacc" (UID: "a5677749-e344-42cf-9953-2f8455e5bacc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:37:19 crc kubenswrapper[4751]: I1203 14:37:19.573547 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5677749-e344-42cf-9953-2f8455e5bacc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:19 crc kubenswrapper[4751]: I1203 14:37:19.573583 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5677749-e344-42cf-9953-2f8455e5bacc-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:19 crc kubenswrapper[4751]: I1203 14:37:19.909888 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.082467 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-httpd-run\") pod \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\" (UID: \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\") " Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.082592 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-logs\") pod \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\" (UID: \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\") " Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.082672 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66rtc\" (UniqueName: \"kubernetes.io/projected/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-kube-api-access-66rtc\") pod \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\" (UID: \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\") " Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.082728 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-config-data\") pod \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\" (UID: \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\") " Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.082791 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-scripts\") pod \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\" (UID: \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\") " Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.082822 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-internal-tls-certs\") pod \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\" (UID: \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\") " Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.082877 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-combined-ca-bundle\") pod \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\" (UID: \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\") " Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.083025 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd3c1578-235e-47c7-b720-afd92ef00308\") pod \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\" (UID: \"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a\") " Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.084590 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-logs" (OuterVolumeSpecName: "logs") pod "825fe2b2-0938-4ca3-bf04-c7a59aa3e99a" (UID: "825fe2b2-0938-4ca3-bf04-c7a59aa3e99a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.085827 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "825fe2b2-0938-4ca3-bf04-c7a59aa3e99a" (UID: "825fe2b2-0938-4ca3-bf04-c7a59aa3e99a"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.091103 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-scripts" (OuterVolumeSpecName: "scripts") pod "825fe2b2-0938-4ca3-bf04-c7a59aa3e99a" (UID: "825fe2b2-0938-4ca3-bf04-c7a59aa3e99a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.093612 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-kube-api-access-66rtc" (OuterVolumeSpecName: "kube-api-access-66rtc") pod "825fe2b2-0938-4ca3-bf04-c7a59aa3e99a" (UID: "825fe2b2-0938-4ca3-bf04-c7a59aa3e99a"). InnerVolumeSpecName "kube-api-access-66rtc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.108293 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd3c1578-235e-47c7-b720-afd92ef00308" (OuterVolumeSpecName: "glance") pod "825fe2b2-0938-4ca3-bf04-c7a59aa3e99a" (UID: "825fe2b2-0938-4ca3-bf04-c7a59aa3e99a"). InnerVolumeSpecName "pvc-cd3c1578-235e-47c7-b720-afd92ef00308". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.111843 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.122918 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "825fe2b2-0938-4ca3-bf04-c7a59aa3e99a" (UID: "825fe2b2-0938-4ca3-bf04-c7a59aa3e99a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.163564 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "825fe2b2-0938-4ca3-bf04-c7a59aa3e99a" (UID: "825fe2b2-0938-4ca3-bf04-c7a59aa3e99a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.177741 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-config-data" (OuterVolumeSpecName: "config-data") pod "825fe2b2-0938-4ca3-bf04-c7a59aa3e99a" (UID: "825fe2b2-0938-4ca3-bf04-c7a59aa3e99a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.185465 4751 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.185499 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.185532 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-cd3c1578-235e-47c7-b720-afd92ef00308\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd3c1578-235e-47c7-b720-afd92ef00308\") on node \"crc\" " Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.185544 4751 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.185555 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-logs\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.185564 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66rtc\" (UniqueName: \"kubernetes.io/projected/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-kube-api-access-66rtc\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.185573 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.185580 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.200386 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.201248 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.201292 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"825fe2b2-0938-4ca3-bf04-c7a59aa3e99a","Type":"ContainerDied","Data":"e8267190d2ccb384f28b2eb880c31fbb45afb0c63eabbaa7e67326fe03fd2486"} Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.201368 4751 scope.go:117] "RemoveContainer" containerID="7b8f252ac2252a13c2b95e00ae25166e9c02cad0ddfa1fb84db0cb35d58a0579" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.244647 4751 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.244841 4751 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-cd3c1578-235e-47c7-b720-afd92ef00308" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd3c1578-235e-47c7-b720-afd92ef00308") on node "crc" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.265891 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.283644 4751 scope.go:117] "RemoveContainer" containerID="e1ca1f4fb94804711e025ec4962bd4c9fbddd9e59a5f7453aed42cc7775cf35d" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.284167 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.291197 4751 reconciler_common.go:293] "Volume detached for volume \"pvc-cd3c1578-235e-47c7-b720-afd92ef00308\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd3c1578-235e-47c7-b720-afd92ef00308\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.296382 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.313476 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.329153 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:37:20 crc kubenswrapper[4751]: E1203 14:37:20.329678 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5677749-e344-42cf-9953-2f8455e5bacc" containerName="ceilometer-central-agent" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.329699 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5677749-e344-42cf-9953-2f8455e5bacc" containerName="ceilometer-central-agent" Dec 03 14:37:20 crc kubenswrapper[4751]: E1203 14:37:20.329718 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="825fe2b2-0938-4ca3-bf04-c7a59aa3e99a" containerName="glance-log" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.329725 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="825fe2b2-0938-4ca3-bf04-c7a59aa3e99a" containerName="glance-log" Dec 03 14:37:20 crc kubenswrapper[4751]: E1203 14:37:20.329738 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5677749-e344-42cf-9953-2f8455e5bacc" containerName="sg-core" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.329753 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5677749-e344-42cf-9953-2f8455e5bacc" containerName="sg-core" Dec 03 14:37:20 crc kubenswrapper[4751]: E1203 14:37:20.329775 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5677749-e344-42cf-9953-2f8455e5bacc" containerName="ceilometer-notification-agent" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.329781 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5677749-e344-42cf-9953-2f8455e5bacc" containerName="ceilometer-notification-agent" Dec 03 14:37:20 crc 
kubenswrapper[4751]: E1203 14:37:20.329793 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="825fe2b2-0938-4ca3-bf04-c7a59aa3e99a" containerName="glance-httpd" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.329798 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="825fe2b2-0938-4ca3-bf04-c7a59aa3e99a" containerName="glance-httpd" Dec 03 14:37:20 crc kubenswrapper[4751]: E1203 14:37:20.329812 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5677749-e344-42cf-9953-2f8455e5bacc" containerName="proxy-httpd" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.329819 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5677749-e344-42cf-9953-2f8455e5bacc" containerName="proxy-httpd" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.330041 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="825fe2b2-0938-4ca3-bf04-c7a59aa3e99a" containerName="glance-httpd" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.330061 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5677749-e344-42cf-9953-2f8455e5bacc" containerName="proxy-httpd" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.330114 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5677749-e344-42cf-9953-2f8455e5bacc" containerName="sg-core" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.330125 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="825fe2b2-0938-4ca3-bf04-c7a59aa3e99a" containerName="glance-log" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.330137 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5677749-e344-42cf-9953-2f8455e5bacc" containerName="ceilometer-notification-agent" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.330152 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5677749-e344-42cf-9953-2f8455e5bacc" containerName="ceilometer-central-agent" Dec 03 14:37:20 
crc kubenswrapper[4751]: I1203 14:37:20.332256 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.338673 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.338910 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.343435 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.346296 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.349183 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.350096 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.356203 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.410300 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.498668 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15b1a633-766f-41f8-b9e8-22acc97bf4c8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"15b1a633-766f-41f8-b9e8-22acc97bf4c8\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.498752 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15b1a633-766f-41f8-b9e8-22acc97bf4c8-logs\") pod \"glance-default-internal-api-0\" (UID: \"15b1a633-766f-41f8-b9e8-22acc97bf4c8\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.498782 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/15b1a633-766f-41f8-b9e8-22acc97bf4c8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"15b1a633-766f-41f8-b9e8-22acc97bf4c8\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.498807 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ab5def-2bae-4fe3-b44b-d0c4aadca765-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\") " pod="openstack/ceilometer-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.498828 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68ab5def-2bae-4fe3-b44b-d0c4aadca765-run-httpd\") pod \"ceilometer-0\" (UID: \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\") " pod="openstack/ceilometer-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.498845 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68ab5def-2bae-4fe3-b44b-d0c4aadca765-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\") " pod="openstack/ceilometer-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.498862 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fts8z\" (UniqueName: \"kubernetes.io/projected/68ab5def-2bae-4fe3-b44b-d0c4aadca765-kube-api-access-fts8z\") pod \"ceilometer-0\" (UID: \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\") " pod="openstack/ceilometer-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.498898 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68ab5def-2bae-4fe3-b44b-d0c4aadca765-config-data\") pod \"ceilometer-0\" (UID: \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\") " pod="openstack/ceilometer-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.498976 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrbs4\" (UniqueName: \"kubernetes.io/projected/15b1a633-766f-41f8-b9e8-22acc97bf4c8-kube-api-access-rrbs4\") pod \"glance-default-internal-api-0\" (UID: \"15b1a633-766f-41f8-b9e8-22acc97bf4c8\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.499018 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68ab5def-2bae-4fe3-b44b-d0c4aadca765-log-httpd\") pod \"ceilometer-0\" (UID: \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\") " pod="openstack/ceilometer-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.499044 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68ab5def-2bae-4fe3-b44b-d0c4aadca765-scripts\") pod \"ceilometer-0\" (UID: \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\") " pod="openstack/ceilometer-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.499091 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/15b1a633-766f-41f8-b9e8-22acc97bf4c8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"15b1a633-766f-41f8-b9e8-22acc97bf4c8\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.499155 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-cd3c1578-235e-47c7-b720-afd92ef00308\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd3c1578-235e-47c7-b720-afd92ef00308\") pod \"glance-default-internal-api-0\" (UID: \"15b1a633-766f-41f8-b9e8-22acc97bf4c8\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.499181 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15b1a633-766f-41f8-b9e8-22acc97bf4c8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"15b1a633-766f-41f8-b9e8-22acc97bf4c8\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.499253 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15b1a633-766f-41f8-b9e8-22acc97bf4c8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"15b1a633-766f-41f8-b9e8-22acc97bf4c8\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.600896 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68ab5def-2bae-4fe3-b44b-d0c4aadca765-log-httpd\") pod \"ceilometer-0\" (UID: \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\") " pod="openstack/ceilometer-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.600949 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/68ab5def-2bae-4fe3-b44b-d0c4aadca765-scripts\") pod \"ceilometer-0\" (UID: \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\") " pod="openstack/ceilometer-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.600993 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15b1a633-766f-41f8-b9e8-22acc97bf4c8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"15b1a633-766f-41f8-b9e8-22acc97bf4c8\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.601021 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-cd3c1578-235e-47c7-b720-afd92ef00308\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd3c1578-235e-47c7-b720-afd92ef00308\") pod \"glance-default-internal-api-0\" (UID: \"15b1a633-766f-41f8-b9e8-22acc97bf4c8\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.601045 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15b1a633-766f-41f8-b9e8-22acc97bf4c8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"15b1a633-766f-41f8-b9e8-22acc97bf4c8\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.601197 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15b1a633-766f-41f8-b9e8-22acc97bf4c8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"15b1a633-766f-41f8-b9e8-22acc97bf4c8\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.601234 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/15b1a633-766f-41f8-b9e8-22acc97bf4c8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"15b1a633-766f-41f8-b9e8-22acc97bf4c8\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.601726 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15b1a633-766f-41f8-b9e8-22acc97bf4c8-logs\") pod \"glance-default-internal-api-0\" (UID: \"15b1a633-766f-41f8-b9e8-22acc97bf4c8\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.601754 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/15b1a633-766f-41f8-b9e8-22acc97bf4c8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"15b1a633-766f-41f8-b9e8-22acc97bf4c8\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.601777 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ab5def-2bae-4fe3-b44b-d0c4aadca765-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\") " pod="openstack/ceilometer-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.601800 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68ab5def-2bae-4fe3-b44b-d0c4aadca765-run-httpd\") pod \"ceilometer-0\" (UID: \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\") " pod="openstack/ceilometer-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.601814 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68ab5def-2bae-4fe3-b44b-d0c4aadca765-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\") " 
pod="openstack/ceilometer-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.601845 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fts8z\" (UniqueName: \"kubernetes.io/projected/68ab5def-2bae-4fe3-b44b-d0c4aadca765-kube-api-access-fts8z\") pod \"ceilometer-0\" (UID: \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\") " pod="openstack/ceilometer-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.601943 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68ab5def-2bae-4fe3-b44b-d0c4aadca765-config-data\") pod \"ceilometer-0\" (UID: \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\") " pod="openstack/ceilometer-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.602000 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrbs4\" (UniqueName: \"kubernetes.io/projected/15b1a633-766f-41f8-b9e8-22acc97bf4c8-kube-api-access-rrbs4\") pod \"glance-default-internal-api-0\" (UID: \"15b1a633-766f-41f8-b9e8-22acc97bf4c8\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.601588 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68ab5def-2bae-4fe3-b44b-d0c4aadca765-log-httpd\") pod \"ceilometer-0\" (UID: \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\") " pod="openstack/ceilometer-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.603291 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15b1a633-766f-41f8-b9e8-22acc97bf4c8-logs\") pod \"glance-default-internal-api-0\" (UID: \"15b1a633-766f-41f8-b9e8-22acc97bf4c8\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.603665 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68ab5def-2bae-4fe3-b44b-d0c4aadca765-run-httpd\") pod \"ceilometer-0\" (UID: \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\") " pod="openstack/ceilometer-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.603708 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/15b1a633-766f-41f8-b9e8-22acc97bf4c8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"15b1a633-766f-41f8-b9e8-22acc97bf4c8\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.605132 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68ab5def-2bae-4fe3-b44b-d0c4aadca765-scripts\") pod \"ceilometer-0\" (UID: \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\") " pod="openstack/ceilometer-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.607078 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68ab5def-2bae-4fe3-b44b-d0c4aadca765-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\") " pod="openstack/ceilometer-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.608414 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ab5def-2bae-4fe3-b44b-d0c4aadca765-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\") " pod="openstack/ceilometer-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.609190 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68ab5def-2bae-4fe3-b44b-d0c4aadca765-config-data\") pod \"ceilometer-0\" (UID: \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\") " pod="openstack/ceilometer-0" Dec 03 14:37:20 crc kubenswrapper[4751]: 
I1203 14:37:20.610583 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15b1a633-766f-41f8-b9e8-22acc97bf4c8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"15b1a633-766f-41f8-b9e8-22acc97bf4c8\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.612951 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15b1a633-766f-41f8-b9e8-22acc97bf4c8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"15b1a633-766f-41f8-b9e8-22acc97bf4c8\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.614563 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15b1a633-766f-41f8-b9e8-22acc97bf4c8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"15b1a633-766f-41f8-b9e8-22acc97bf4c8\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.618117 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15b1a633-766f-41f8-b9e8-22acc97bf4c8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"15b1a633-766f-41f8-b9e8-22acc97bf4c8\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.618632 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fts8z\" (UniqueName: \"kubernetes.io/projected/68ab5def-2bae-4fe3-b44b-d0c4aadca765-kube-api-access-fts8z\") pod \"ceilometer-0\" (UID: \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\") " pod="openstack/ceilometer-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.619949 4751 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability 
not set. Skipping MountDevice... Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.619994 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-cd3c1578-235e-47c7-b720-afd92ef00308\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd3c1578-235e-47c7-b720-afd92ef00308\") pod \"glance-default-internal-api-0\" (UID: \"15b1a633-766f-41f8-b9e8-22acc97bf4c8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d3b26c55de8c52fd1f3bf024792f2005b72c5292706e21196d4a11b13179d08d/globalmount\"" pod="openstack/glance-default-internal-api-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.632141 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrbs4\" (UniqueName: \"kubernetes.io/projected/15b1a633-766f-41f8-b9e8-22acc97bf4c8-kube-api-access-rrbs4\") pod \"glance-default-internal-api-0\" (UID: \"15b1a633-766f-41f8-b9e8-22acc97bf4c8\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.662839 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.708174 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-cd3c1578-235e-47c7-b720-afd92ef00308\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd3c1578-235e-47c7-b720-afd92ef00308\") pod \"glance-default-internal-api-0\" (UID: \"15b1a633-766f-41f8-b9e8-22acc97bf4c8\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:37:20 crc kubenswrapper[4751]: I1203 14:37:20.986979 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 14:37:21 crc kubenswrapper[4751]: I1203 14:37:21.246621 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:37:21 crc kubenswrapper[4751]: W1203 14:37:21.257470 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68ab5def_2bae_4fe3_b44b_d0c4aadca765.slice/crio-0ceb011c1d8c58348a392bb58639b611ebc014d6af3c032f279daa311f0b96a8 WatchSource:0}: Error finding container 0ceb011c1d8c58348a392bb58639b611ebc014d6af3c032f279daa311f0b96a8: Status 404 returned error can't find the container with id 0ceb011c1d8c58348a392bb58639b611ebc014d6af3c032f279daa311f0b96a8 Dec 03 14:37:21 crc kubenswrapper[4751]: I1203 14:37:21.388775 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="825fe2b2-0938-4ca3-bf04-c7a59aa3e99a" path="/var/lib/kubelet/pods/825fe2b2-0938-4ca3-bf04-c7a59aa3e99a/volumes" Dec 03 14:37:21 crc kubenswrapper[4751]: I1203 14:37:21.392393 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5677749-e344-42cf-9953-2f8455e5bacc" path="/var/lib/kubelet/pods/a5677749-e344-42cf-9953-2f8455e5bacc/volumes" Dec 03 14:37:21 crc kubenswrapper[4751]: I1203 14:37:21.655015 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 14:37:22 crc kubenswrapper[4751]: I1203 14:37:22.269091 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"15b1a633-766f-41f8-b9e8-22acc97bf4c8","Type":"ContainerStarted","Data":"147dc7f7c7bcba9d7797ddcb925188780dc034e5c99a3e151d63f4c367f82544"} Dec 03 14:37:22 crc kubenswrapper[4751]: I1203 14:37:22.271943 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"68ab5def-2bae-4fe3-b44b-d0c4aadca765","Type":"ContainerStarted","Data":"1ba7c83cc6fcae2a6de327eb53ffa20bbea6d19c1be4e78d949dc8a5d4d21393"} Dec 03 14:37:22 crc kubenswrapper[4751]: I1203 14:37:22.271984 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68ab5def-2bae-4fe3-b44b-d0c4aadca765","Type":"ContainerStarted","Data":"0ceb011c1d8c58348a392bb58639b611ebc014d6af3c032f279daa311f0b96a8"} Dec 03 14:37:22 crc kubenswrapper[4751]: I1203 14:37:22.408142 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:37:23 crc kubenswrapper[4751]: I1203 14:37:23.285561 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68ab5def-2bae-4fe3-b44b-d0c4aadca765","Type":"ContainerStarted","Data":"1d1cec24e3ad2c6a8d6216ff4797836ae68fa9b1a682e26ea519c0e7e65b3652"} Dec 03 14:37:23 crc kubenswrapper[4751]: I1203 14:37:23.296828 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"15b1a633-766f-41f8-b9e8-22acc97bf4c8","Type":"ContainerStarted","Data":"78cb5745d076927033b5ecfacaab8ca5be06aa2335bd6dfabd939403ec7c8215"} Dec 03 14:37:23 crc kubenswrapper[4751]: I1203 14:37:23.296872 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"15b1a633-766f-41f8-b9e8-22acc97bf4c8","Type":"ContainerStarted","Data":"62e6c8365e716ef2930966944b8a37757de1d0430d5d12ebff565553a2818e16"} Dec 03 14:37:23 crc kubenswrapper[4751]: I1203 14:37:23.330788 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.330772298 podStartE2EDuration="3.330772298s" podCreationTimestamp="2025-12-03 14:37:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:37:23.31652011 +0000 UTC 
m=+1450.304875347" watchObservedRunningTime="2025-12-03 14:37:23.330772298 +0000 UTC m=+1450.319127515" Dec 03 14:37:23 crc kubenswrapper[4751]: I1203 14:37:23.670351 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 14:37:23 crc kubenswrapper[4751]: I1203 14:37:23.670678 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 14:37:23 crc kubenswrapper[4751]: I1203 14:37:23.727465 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 14:37:23 crc kubenswrapper[4751]: I1203 14:37:23.746260 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 14:37:24 crc kubenswrapper[4751]: I1203 14:37:24.352683 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68ab5def-2bae-4fe3-b44b-d0c4aadca765","Type":"ContainerStarted","Data":"5111d494711d3fc952353cff34682b84c22433e9c6b8f15b6005aa2dbc34034f"} Dec 03 14:37:24 crc kubenswrapper[4751]: I1203 14:37:24.353845 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 14:37:24 crc kubenswrapper[4751]: I1203 14:37:24.353869 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 14:37:25 crc kubenswrapper[4751]: I1203 14:37:25.194473 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ll7p5"] Dec 03 14:37:25 crc kubenswrapper[4751]: I1203 14:37:25.196992 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ll7p5" Dec 03 14:37:25 crc kubenswrapper[4751]: I1203 14:37:25.209474 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ll7p5"] Dec 03 14:37:25 crc kubenswrapper[4751]: I1203 14:37:25.315980 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqgpv\" (UniqueName: \"kubernetes.io/projected/229c80e6-2c05-4430-9093-420b6b9aa241-kube-api-access-tqgpv\") pod \"redhat-operators-ll7p5\" (UID: \"229c80e6-2c05-4430-9093-420b6b9aa241\") " pod="openshift-marketplace/redhat-operators-ll7p5" Dec 03 14:37:25 crc kubenswrapper[4751]: I1203 14:37:25.316420 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/229c80e6-2c05-4430-9093-420b6b9aa241-utilities\") pod \"redhat-operators-ll7p5\" (UID: \"229c80e6-2c05-4430-9093-420b6b9aa241\") " pod="openshift-marketplace/redhat-operators-ll7p5" Dec 03 14:37:25 crc kubenswrapper[4751]: I1203 14:37:25.316513 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/229c80e6-2c05-4430-9093-420b6b9aa241-catalog-content\") pod \"redhat-operators-ll7p5\" (UID: \"229c80e6-2c05-4430-9093-420b6b9aa241\") " pod="openshift-marketplace/redhat-operators-ll7p5" Dec 03 14:37:25 crc kubenswrapper[4751]: I1203 14:37:25.418652 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/229c80e6-2c05-4430-9093-420b6b9aa241-utilities\") pod \"redhat-operators-ll7p5\" (UID: \"229c80e6-2c05-4430-9093-420b6b9aa241\") " pod="openshift-marketplace/redhat-operators-ll7p5" Dec 03 14:37:25 crc kubenswrapper[4751]: I1203 14:37:25.419109 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/229c80e6-2c05-4430-9093-420b6b9aa241-utilities\") pod \"redhat-operators-ll7p5\" (UID: \"229c80e6-2c05-4430-9093-420b6b9aa241\") " pod="openshift-marketplace/redhat-operators-ll7p5" Dec 03 14:37:25 crc kubenswrapper[4751]: I1203 14:37:25.420131 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/229c80e6-2c05-4430-9093-420b6b9aa241-catalog-content\") pod \"redhat-operators-ll7p5\" (UID: \"229c80e6-2c05-4430-9093-420b6b9aa241\") " pod="openshift-marketplace/redhat-operators-ll7p5" Dec 03 14:37:25 crc kubenswrapper[4751]: I1203 14:37:25.420247 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqgpv\" (UniqueName: \"kubernetes.io/projected/229c80e6-2c05-4430-9093-420b6b9aa241-kube-api-access-tqgpv\") pod \"redhat-operators-ll7p5\" (UID: \"229c80e6-2c05-4430-9093-420b6b9aa241\") " pod="openshift-marketplace/redhat-operators-ll7p5" Dec 03 14:37:25 crc kubenswrapper[4751]: I1203 14:37:25.420867 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/229c80e6-2c05-4430-9093-420b6b9aa241-catalog-content\") pod \"redhat-operators-ll7p5\" (UID: \"229c80e6-2c05-4430-9093-420b6b9aa241\") " pod="openshift-marketplace/redhat-operators-ll7p5" Dec 03 14:37:25 crc kubenswrapper[4751]: I1203 14:37:25.440649 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqgpv\" (UniqueName: \"kubernetes.io/projected/229c80e6-2c05-4430-9093-420b6b9aa241-kube-api-access-tqgpv\") pod \"redhat-operators-ll7p5\" (UID: \"229c80e6-2c05-4430-9093-420b6b9aa241\") " pod="openshift-marketplace/redhat-operators-ll7p5" Dec 03 14:37:25 crc kubenswrapper[4751]: I1203 14:37:25.522131 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ll7p5" Dec 03 14:37:26 crc kubenswrapper[4751]: I1203 14:37:26.372607 4751 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 14:37:26 crc kubenswrapper[4751]: I1203 14:37:26.372671 4751 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 14:37:26 crc kubenswrapper[4751]: I1203 14:37:26.643497 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 14:37:26 crc kubenswrapper[4751]: I1203 14:37:26.657516 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 14:37:30 crc kubenswrapper[4751]: I1203 14:37:30.987829 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 14:37:30 crc kubenswrapper[4751]: I1203 14:37:30.988387 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 14:37:31 crc kubenswrapper[4751]: I1203 14:37:31.035463 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 14:37:31 crc kubenswrapper[4751]: I1203 14:37:31.044575 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 14:37:31 crc kubenswrapper[4751]: I1203 14:37:31.471347 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 14:37:31 crc kubenswrapper[4751]: I1203 14:37:31.472025 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 14:37:31 crc kubenswrapper[4751]: I1203 14:37:31.762363 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ll7p5"] Dec 03 
14:37:31 crc kubenswrapper[4751]: W1203 14:37:31.767257 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod229c80e6_2c05_4430_9093_420b6b9aa241.slice/crio-8596fc79e8a1489a7cb7db331dce15565f463bf596b03ca08eb0039c849a2686 WatchSource:0}: Error finding container 8596fc79e8a1489a7cb7db331dce15565f463bf596b03ca08eb0039c849a2686: Status 404 returned error can't find the container with id 8596fc79e8a1489a7cb7db331dce15565f463bf596b03ca08eb0039c849a2686 Dec 03 14:37:32 crc kubenswrapper[4751]: I1203 14:37:32.483995 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68ab5def-2bae-4fe3-b44b-d0c4aadca765","Type":"ContainerStarted","Data":"a884400999616e264f0d5bd2debc788723a0602f4baed37efb42d9f33c14d9f7"} Dec 03 14:37:32 crc kubenswrapper[4751]: I1203 14:37:32.484182 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="68ab5def-2bae-4fe3-b44b-d0c4aadca765" containerName="ceilometer-central-agent" containerID="cri-o://1ba7c83cc6fcae2a6de327eb53ffa20bbea6d19c1be4e78d949dc8a5d4d21393" gracePeriod=30 Dec 03 14:37:32 crc kubenswrapper[4751]: I1203 14:37:32.484422 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 14:37:32 crc kubenswrapper[4751]: I1203 14:37:32.484446 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="68ab5def-2bae-4fe3-b44b-d0c4aadca765" containerName="proxy-httpd" containerID="cri-o://a884400999616e264f0d5bd2debc788723a0602f4baed37efb42d9f33c14d9f7" gracePeriod=30 Dec 03 14:37:32 crc kubenswrapper[4751]: I1203 14:37:32.484476 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="68ab5def-2bae-4fe3-b44b-d0c4aadca765" containerName="ceilometer-notification-agent" 
containerID="cri-o://1d1cec24e3ad2c6a8d6216ff4797836ae68fa9b1a682e26ea519c0e7e65b3652" gracePeriod=30 Dec 03 14:37:32 crc kubenswrapper[4751]: I1203 14:37:32.484544 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="68ab5def-2bae-4fe3-b44b-d0c4aadca765" containerName="sg-core" containerID="cri-o://5111d494711d3fc952353cff34682b84c22433e9c6b8f15b6005aa2dbc34034f" gracePeriod=30 Dec 03 14:37:32 crc kubenswrapper[4751]: I1203 14:37:32.488765 4751 generic.go:334] "Generic (PLEG): container finished" podID="229c80e6-2c05-4430-9093-420b6b9aa241" containerID="ea787abbf1b2856a2fa2fd57a7eb2c8c7e8b9189389de451af50f671597be3d6" exitCode=0 Dec 03 14:37:32 crc kubenswrapper[4751]: I1203 14:37:32.488979 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ll7p5" event={"ID":"229c80e6-2c05-4430-9093-420b6b9aa241","Type":"ContainerDied","Data":"ea787abbf1b2856a2fa2fd57a7eb2c8c7e8b9189389de451af50f671597be3d6"} Dec 03 14:37:32 crc kubenswrapper[4751]: I1203 14:37:32.489009 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ll7p5" event={"ID":"229c80e6-2c05-4430-9093-420b6b9aa241","Type":"ContainerStarted","Data":"8596fc79e8a1489a7cb7db331dce15565f463bf596b03ca08eb0039c849a2686"} Dec 03 14:37:32 crc kubenswrapper[4751]: I1203 14:37:32.494925 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-h7bpp" event={"ID":"7df882c6-828a-4aac-8aa9-811102008952","Type":"ContainerStarted","Data":"22704b894a6936c8545cf27f710a115b5cf9c5d5092f3977d94ab8a65ac2ab36"} Dec 03 14:37:32 crc kubenswrapper[4751]: I1203 14:37:32.517540 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=8.874853994 podStartE2EDuration="12.517515069s" podCreationTimestamp="2025-12-03 14:37:20 +0000 UTC" firstStartedPulling="2025-12-03 14:37:21.269936998 +0000 
UTC m=+1448.258292215" lastFinishedPulling="2025-12-03 14:37:24.912598073 +0000 UTC m=+1451.900953290" observedRunningTime="2025-12-03 14:37:32.50585879 +0000 UTC m=+1459.494213997" watchObservedRunningTime="2025-12-03 14:37:32.517515069 +0000 UTC m=+1459.505870286" Dec 03 14:37:32 crc kubenswrapper[4751]: I1203 14:37:32.547047 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-h7bpp" podStartSLOduration=3.09633983 podStartE2EDuration="18.547028822s" podCreationTimestamp="2025-12-03 14:37:14 +0000 UTC" firstStartedPulling="2025-12-03 14:37:15.88867089 +0000 UTC m=+1442.877026107" lastFinishedPulling="2025-12-03 14:37:31.339359882 +0000 UTC m=+1458.327715099" observedRunningTime="2025-12-03 14:37:32.539910643 +0000 UTC m=+1459.528265860" watchObservedRunningTime="2025-12-03 14:37:32.547028822 +0000 UTC m=+1459.535384039" Dec 03 14:37:33 crc kubenswrapper[4751]: I1203 14:37:33.526673 4751 generic.go:334] "Generic (PLEG): container finished" podID="68ab5def-2bae-4fe3-b44b-d0c4aadca765" containerID="a884400999616e264f0d5bd2debc788723a0602f4baed37efb42d9f33c14d9f7" exitCode=0 Dec 03 14:37:33 crc kubenswrapper[4751]: I1203 14:37:33.527213 4751 generic.go:334] "Generic (PLEG): container finished" podID="68ab5def-2bae-4fe3-b44b-d0c4aadca765" containerID="5111d494711d3fc952353cff34682b84c22433e9c6b8f15b6005aa2dbc34034f" exitCode=2 Dec 03 14:37:33 crc kubenswrapper[4751]: I1203 14:37:33.527224 4751 generic.go:334] "Generic (PLEG): container finished" podID="68ab5def-2bae-4fe3-b44b-d0c4aadca765" containerID="1d1cec24e3ad2c6a8d6216ff4797836ae68fa9b1a682e26ea519c0e7e65b3652" exitCode=0 Dec 03 14:37:33 crc kubenswrapper[4751]: I1203 14:37:33.527233 4751 generic.go:334] "Generic (PLEG): container finished" podID="68ab5def-2bae-4fe3-b44b-d0c4aadca765" containerID="1ba7c83cc6fcae2a6de327eb53ffa20bbea6d19c1be4e78d949dc8a5d4d21393" exitCode=0 Dec 03 14:37:33 crc kubenswrapper[4751]: I1203 14:37:33.527148 4751 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68ab5def-2bae-4fe3-b44b-d0c4aadca765","Type":"ContainerDied","Data":"a884400999616e264f0d5bd2debc788723a0602f4baed37efb42d9f33c14d9f7"} Dec 03 14:37:33 crc kubenswrapper[4751]: I1203 14:37:33.527305 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68ab5def-2bae-4fe3-b44b-d0c4aadca765","Type":"ContainerDied","Data":"5111d494711d3fc952353cff34682b84c22433e9c6b8f15b6005aa2dbc34034f"} Dec 03 14:37:33 crc kubenswrapper[4751]: I1203 14:37:33.527347 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68ab5def-2bae-4fe3-b44b-d0c4aadca765","Type":"ContainerDied","Data":"1d1cec24e3ad2c6a8d6216ff4797836ae68fa9b1a682e26ea519c0e7e65b3652"} Dec 03 14:37:33 crc kubenswrapper[4751]: I1203 14:37:33.527362 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68ab5def-2bae-4fe3-b44b-d0c4aadca765","Type":"ContainerDied","Data":"1ba7c83cc6fcae2a6de327eb53ffa20bbea6d19c1be4e78d949dc8a5d4d21393"} Dec 03 14:37:33 crc kubenswrapper[4751]: I1203 14:37:33.527375 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68ab5def-2bae-4fe3-b44b-d0c4aadca765","Type":"ContainerDied","Data":"0ceb011c1d8c58348a392bb58639b611ebc014d6af3c032f279daa311f0b96a8"} Dec 03 14:37:33 crc kubenswrapper[4751]: I1203 14:37:33.527386 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ceb011c1d8c58348a392bb58639b611ebc014d6af3c032f279daa311f0b96a8" Dec 03 14:37:33 crc kubenswrapper[4751]: I1203 14:37:33.530908 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ll7p5" event={"ID":"229c80e6-2c05-4430-9093-420b6b9aa241","Type":"ContainerStarted","Data":"13b4c5af6b7843b0f8b0f66f05a34e193bb1f00984b4894606c2b7c03aaaaf66"} Dec 03 14:37:33 crc kubenswrapper[4751]: I1203 
14:37:33.596219 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:37:33 crc kubenswrapper[4751]: I1203 14:37:33.707940 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68ab5def-2bae-4fe3-b44b-d0c4aadca765-scripts\") pod \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\" (UID: \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\") " Dec 03 14:37:33 crc kubenswrapper[4751]: I1203 14:37:33.708017 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68ab5def-2bae-4fe3-b44b-d0c4aadca765-config-data\") pod \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\" (UID: \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\") " Dec 03 14:37:33 crc kubenswrapper[4751]: I1203 14:37:33.708094 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68ab5def-2bae-4fe3-b44b-d0c4aadca765-run-httpd\") pod \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\" (UID: \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\") " Dec 03 14:37:33 crc kubenswrapper[4751]: I1203 14:37:33.708194 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ab5def-2bae-4fe3-b44b-d0c4aadca765-combined-ca-bundle\") pod \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\" (UID: \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\") " Dec 03 14:37:33 crc kubenswrapper[4751]: I1203 14:37:33.708251 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68ab5def-2bae-4fe3-b44b-d0c4aadca765-sg-core-conf-yaml\") pod \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\" (UID: \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\") " Dec 03 14:37:33 crc kubenswrapper[4751]: I1203 14:37:33.708345 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-fts8z\" (UniqueName: \"kubernetes.io/projected/68ab5def-2bae-4fe3-b44b-d0c4aadca765-kube-api-access-fts8z\") pod \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\" (UID: \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\") " Dec 03 14:37:33 crc kubenswrapper[4751]: I1203 14:37:33.708373 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68ab5def-2bae-4fe3-b44b-d0c4aadca765-log-httpd\") pod \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\" (UID: \"68ab5def-2bae-4fe3-b44b-d0c4aadca765\") " Dec 03 14:37:33 crc kubenswrapper[4751]: I1203 14:37:33.708449 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68ab5def-2bae-4fe3-b44b-d0c4aadca765-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "68ab5def-2bae-4fe3-b44b-d0c4aadca765" (UID: "68ab5def-2bae-4fe3-b44b-d0c4aadca765"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:37:33 crc kubenswrapper[4751]: I1203 14:37:33.708916 4751 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68ab5def-2bae-4fe3-b44b-d0c4aadca765-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:33 crc kubenswrapper[4751]: I1203 14:37:33.709044 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68ab5def-2bae-4fe3-b44b-d0c4aadca765-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "68ab5def-2bae-4fe3-b44b-d0c4aadca765" (UID: "68ab5def-2bae-4fe3-b44b-d0c4aadca765"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:37:33 crc kubenswrapper[4751]: I1203 14:37:33.719579 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68ab5def-2bae-4fe3-b44b-d0c4aadca765-kube-api-access-fts8z" (OuterVolumeSpecName: "kube-api-access-fts8z") pod "68ab5def-2bae-4fe3-b44b-d0c4aadca765" (UID: "68ab5def-2bae-4fe3-b44b-d0c4aadca765"). InnerVolumeSpecName "kube-api-access-fts8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:37:33 crc kubenswrapper[4751]: I1203 14:37:33.733534 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68ab5def-2bae-4fe3-b44b-d0c4aadca765-scripts" (OuterVolumeSpecName: "scripts") pod "68ab5def-2bae-4fe3-b44b-d0c4aadca765" (UID: "68ab5def-2bae-4fe3-b44b-d0c4aadca765"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:37:33 crc kubenswrapper[4751]: I1203 14:37:33.811484 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fts8z\" (UniqueName: \"kubernetes.io/projected/68ab5def-2bae-4fe3-b44b-d0c4aadca765-kube-api-access-fts8z\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:33 crc kubenswrapper[4751]: I1203 14:37:33.812022 4751 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68ab5def-2bae-4fe3-b44b-d0c4aadca765-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:33 crc kubenswrapper[4751]: I1203 14:37:33.812037 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68ab5def-2bae-4fe3-b44b-d0c4aadca765-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:33 crc kubenswrapper[4751]: I1203 14:37:33.830222 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68ab5def-2bae-4fe3-b44b-d0c4aadca765-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod 
"68ab5def-2bae-4fe3-b44b-d0c4aadca765" (UID: "68ab5def-2bae-4fe3-b44b-d0c4aadca765"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:37:33 crc kubenswrapper[4751]: I1203 14:37:33.831473 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68ab5def-2bae-4fe3-b44b-d0c4aadca765-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68ab5def-2bae-4fe3-b44b-d0c4aadca765" (UID: "68ab5def-2bae-4fe3-b44b-d0c4aadca765"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:37:33 crc kubenswrapper[4751]: I1203 14:37:33.873369 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68ab5def-2bae-4fe3-b44b-d0c4aadca765-config-data" (OuterVolumeSpecName: "config-data") pod "68ab5def-2bae-4fe3-b44b-d0c4aadca765" (UID: "68ab5def-2bae-4fe3-b44b-d0c4aadca765"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:37:33 crc kubenswrapper[4751]: I1203 14:37:33.913700 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68ab5def-2bae-4fe3-b44b-d0c4aadca765-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:33 crc kubenswrapper[4751]: I1203 14:37:33.913748 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ab5def-2bae-4fe3-b44b-d0c4aadca765-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:33 crc kubenswrapper[4751]: I1203 14:37:33.913758 4751 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68ab5def-2bae-4fe3-b44b-d0c4aadca765-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.529791 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.529884 4751 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.540019 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.620096 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.621168 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.631520 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.650636 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:37:34 crc kubenswrapper[4751]: E1203 14:37:34.651209 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68ab5def-2bae-4fe3-b44b-d0c4aadca765" containerName="ceilometer-notification-agent" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.651255 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="68ab5def-2bae-4fe3-b44b-d0c4aadca765" containerName="ceilometer-notification-agent" Dec 03 14:37:34 crc kubenswrapper[4751]: E1203 14:37:34.651292 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68ab5def-2bae-4fe3-b44b-d0c4aadca765" containerName="ceilometer-central-agent" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.651300 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="68ab5def-2bae-4fe3-b44b-d0c4aadca765" containerName="ceilometer-central-agent" Dec 03 14:37:34 crc kubenswrapper[4751]: E1203 14:37:34.651336 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68ab5def-2bae-4fe3-b44b-d0c4aadca765" containerName="sg-core" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.651343 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="68ab5def-2bae-4fe3-b44b-d0c4aadca765" containerName="sg-core" Dec 03 14:37:34 crc kubenswrapper[4751]: E1203 14:37:34.651376 4751 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="68ab5def-2bae-4fe3-b44b-d0c4aadca765" containerName="proxy-httpd" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.651383 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="68ab5def-2bae-4fe3-b44b-d0c4aadca765" containerName="proxy-httpd" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.651714 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="68ab5def-2bae-4fe3-b44b-d0c4aadca765" containerName="ceilometer-notification-agent" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.651735 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="68ab5def-2bae-4fe3-b44b-d0c4aadca765" containerName="sg-core" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.651746 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="68ab5def-2bae-4fe3-b44b-d0c4aadca765" containerName="proxy-httpd" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.651759 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="68ab5def-2bae-4fe3-b44b-d0c4aadca765" containerName="ceilometer-central-agent" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.654099 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.663682 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.664030 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.705681 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.731290 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/940ea3c8-16d0-42ff-85a8-ec71601553bd-config-data\") pod \"ceilometer-0\" (UID: \"940ea3c8-16d0-42ff-85a8-ec71601553bd\") " pod="openstack/ceilometer-0" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.731348 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940ea3c8-16d0-42ff-85a8-ec71601553bd-run-httpd\") pod \"ceilometer-0\" (UID: \"940ea3c8-16d0-42ff-85a8-ec71601553bd\") " pod="openstack/ceilometer-0" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.731591 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940ea3c8-16d0-42ff-85a8-ec71601553bd-log-httpd\") pod \"ceilometer-0\" (UID: \"940ea3c8-16d0-42ff-85a8-ec71601553bd\") " pod="openstack/ceilometer-0" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.731776 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/940ea3c8-16d0-42ff-85a8-ec71601553bd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"940ea3c8-16d0-42ff-85a8-ec71601553bd\") " 
pod="openstack/ceilometer-0" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.731839 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/940ea3c8-16d0-42ff-85a8-ec71601553bd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"940ea3c8-16d0-42ff-85a8-ec71601553bd\") " pod="openstack/ceilometer-0" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.731912 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/940ea3c8-16d0-42ff-85a8-ec71601553bd-scripts\") pod \"ceilometer-0\" (UID: \"940ea3c8-16d0-42ff-85a8-ec71601553bd\") " pod="openstack/ceilometer-0" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.731983 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnwhn\" (UniqueName: \"kubernetes.io/projected/940ea3c8-16d0-42ff-85a8-ec71601553bd-kube-api-access-vnwhn\") pod \"ceilometer-0\" (UID: \"940ea3c8-16d0-42ff-85a8-ec71601553bd\") " pod="openstack/ceilometer-0" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.834049 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/940ea3c8-16d0-42ff-85a8-ec71601553bd-scripts\") pod \"ceilometer-0\" (UID: \"940ea3c8-16d0-42ff-85a8-ec71601553bd\") " pod="openstack/ceilometer-0" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.834470 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnwhn\" (UniqueName: \"kubernetes.io/projected/940ea3c8-16d0-42ff-85a8-ec71601553bd-kube-api-access-vnwhn\") pod \"ceilometer-0\" (UID: \"940ea3c8-16d0-42ff-85a8-ec71601553bd\") " pod="openstack/ceilometer-0" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.835041 4751 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/940ea3c8-16d0-42ff-85a8-ec71601553bd-config-data\") pod \"ceilometer-0\" (UID: \"940ea3c8-16d0-42ff-85a8-ec71601553bd\") " pod="openstack/ceilometer-0" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.835095 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940ea3c8-16d0-42ff-85a8-ec71601553bd-run-httpd\") pod \"ceilometer-0\" (UID: \"940ea3c8-16d0-42ff-85a8-ec71601553bd\") " pod="openstack/ceilometer-0" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.835226 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940ea3c8-16d0-42ff-85a8-ec71601553bd-log-httpd\") pod \"ceilometer-0\" (UID: \"940ea3c8-16d0-42ff-85a8-ec71601553bd\") " pod="openstack/ceilometer-0" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.835384 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/940ea3c8-16d0-42ff-85a8-ec71601553bd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"940ea3c8-16d0-42ff-85a8-ec71601553bd\") " pod="openstack/ceilometer-0" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.835444 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/940ea3c8-16d0-42ff-85a8-ec71601553bd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"940ea3c8-16d0-42ff-85a8-ec71601553bd\") " pod="openstack/ceilometer-0" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.835798 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940ea3c8-16d0-42ff-85a8-ec71601553bd-run-httpd\") pod \"ceilometer-0\" (UID: \"940ea3c8-16d0-42ff-85a8-ec71601553bd\") " pod="openstack/ceilometer-0" Dec 03 14:37:34 crc 
kubenswrapper[4751]: I1203 14:37:34.836082 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940ea3c8-16d0-42ff-85a8-ec71601553bd-log-httpd\") pod \"ceilometer-0\" (UID: \"940ea3c8-16d0-42ff-85a8-ec71601553bd\") " pod="openstack/ceilometer-0" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.846759 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/940ea3c8-16d0-42ff-85a8-ec71601553bd-scripts\") pod \"ceilometer-0\" (UID: \"940ea3c8-16d0-42ff-85a8-ec71601553bd\") " pod="openstack/ceilometer-0" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.851579 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/940ea3c8-16d0-42ff-85a8-ec71601553bd-config-data\") pod \"ceilometer-0\" (UID: \"940ea3c8-16d0-42ff-85a8-ec71601553bd\") " pod="openstack/ceilometer-0" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.853590 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/940ea3c8-16d0-42ff-85a8-ec71601553bd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"940ea3c8-16d0-42ff-85a8-ec71601553bd\") " pod="openstack/ceilometer-0" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.854830 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnwhn\" (UniqueName: \"kubernetes.io/projected/940ea3c8-16d0-42ff-85a8-ec71601553bd-kube-api-access-vnwhn\") pod \"ceilometer-0\" (UID: \"940ea3c8-16d0-42ff-85a8-ec71601553bd\") " pod="openstack/ceilometer-0" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.866991 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/940ea3c8-16d0-42ff-85a8-ec71601553bd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"940ea3c8-16d0-42ff-85a8-ec71601553bd\") " pod="openstack/ceilometer-0" Dec 03 14:37:34 crc kubenswrapper[4751]: I1203 14:37:34.979498 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:37:35 crc kubenswrapper[4751]: I1203 14:37:35.332141 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68ab5def-2bae-4fe3-b44b-d0c4aadca765" path="/var/lib/kubelet/pods/68ab5def-2bae-4fe3-b44b-d0c4aadca765/volumes" Dec 03 14:37:35 crc kubenswrapper[4751]: I1203 14:37:35.820046 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:37:35 crc kubenswrapper[4751]: I1203 14:37:35.821004 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:37:36 crc kubenswrapper[4751]: I1203 14:37:36.784866 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:37:36 crc kubenswrapper[4751]: W1203 14:37:36.791956 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod940ea3c8_16d0_42ff_85a8_ec71601553bd.slice/crio-0ba19e4cbbebc7fdba56ba6fda2bb69fe771319a98e54e20e457c9c99787521c WatchSource:0}: Error finding container 0ba19e4cbbebc7fdba56ba6fda2bb69fe771319a98e54e20e457c9c99787521c: Status 404 returned error can't find the container with id 0ba19e4cbbebc7fdba56ba6fda2bb69fe771319a98e54e20e457c9c99787521c Dec 03 14:37:37 crc kubenswrapper[4751]: I1203 14:37:37.567443 4751 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"940ea3c8-16d0-42ff-85a8-ec71601553bd","Type":"ContainerStarted","Data":"0ba19e4cbbebc7fdba56ba6fda2bb69fe771319a98e54e20e457c9c99787521c"} Dec 03 14:37:37 crc kubenswrapper[4751]: I1203 14:37:37.570680 4751 generic.go:334] "Generic (PLEG): container finished" podID="229c80e6-2c05-4430-9093-420b6b9aa241" containerID="13b4c5af6b7843b0f8b0f66f05a34e193bb1f00984b4894606c2b7c03aaaaf66" exitCode=0 Dec 03 14:37:37 crc kubenswrapper[4751]: I1203 14:37:37.570773 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ll7p5" event={"ID":"229c80e6-2c05-4430-9093-420b6b9aa241","Type":"ContainerDied","Data":"13b4c5af6b7843b0f8b0f66f05a34e193bb1f00984b4894606c2b7c03aaaaf66"} Dec 03 14:37:38 crc kubenswrapper[4751]: I1203 14:37:38.585141 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ll7p5" event={"ID":"229c80e6-2c05-4430-9093-420b6b9aa241","Type":"ContainerStarted","Data":"1150f50cd2179fda606f9cfb6ea0270aefcbf0aec879407916f2da7bcee17d59"} Dec 03 14:37:38 crc kubenswrapper[4751]: I1203 14:37:38.587169 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"940ea3c8-16d0-42ff-85a8-ec71601553bd","Type":"ContainerStarted","Data":"350dff996aed65de6f00b9f96191848ceb1ad334b3ba55adb3a36b8c3ec5ad46"} Dec 03 14:37:38 crc kubenswrapper[4751]: I1203 14:37:38.610989 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ll7p5" podStartSLOduration=7.745840496 podStartE2EDuration="13.610970368s" podCreationTimestamp="2025-12-03 14:37:25 +0000 UTC" firstStartedPulling="2025-12-03 14:37:32.491808678 +0000 UTC m=+1459.480163895" lastFinishedPulling="2025-12-03 14:37:38.35693853 +0000 UTC m=+1465.345293767" observedRunningTime="2025-12-03 14:37:38.603958232 +0000 UTC m=+1465.592313459" 
watchObservedRunningTime="2025-12-03 14:37:38.610970368 +0000 UTC m=+1465.599325585" Dec 03 14:37:39 crc kubenswrapper[4751]: I1203 14:37:39.600679 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"940ea3c8-16d0-42ff-85a8-ec71601553bd","Type":"ContainerStarted","Data":"1754a241b6d04f5ccf0cd2d25a252ac9f2894eac5924c1322050213369e4fecf"} Dec 03 14:37:40 crc kubenswrapper[4751]: I1203 14:37:40.613610 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"940ea3c8-16d0-42ff-85a8-ec71601553bd","Type":"ContainerStarted","Data":"3e77b278793a320df7446e9be7253f3f5545e7e2524a20744d95747ad0edab5c"} Dec 03 14:37:41 crc kubenswrapper[4751]: I1203 14:37:41.627265 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"940ea3c8-16d0-42ff-85a8-ec71601553bd","Type":"ContainerStarted","Data":"46498fc88431327bd7ba26b77001c22c733e3cd66724afea08a8f4d2bbfdb184"} Dec 03 14:37:41 crc kubenswrapper[4751]: I1203 14:37:41.627641 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 14:37:41 crc kubenswrapper[4751]: I1203 14:37:41.679719 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.614838797 podStartE2EDuration="7.67969064s" podCreationTimestamp="2025-12-03 14:37:34 +0000 UTC" firstStartedPulling="2025-12-03 14:37:36.794399216 +0000 UTC m=+1463.782754433" lastFinishedPulling="2025-12-03 14:37:40.859251059 +0000 UTC m=+1467.847606276" observedRunningTime="2025-12-03 14:37:41.64728579 +0000 UTC m=+1468.635641007" watchObservedRunningTime="2025-12-03 14:37:41.67969064 +0000 UTC m=+1468.668045867" Dec 03 14:37:45 crc kubenswrapper[4751]: I1203 14:37:45.523403 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ll7p5" Dec 03 14:37:45 crc kubenswrapper[4751]: I1203 
14:37:45.524286 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ll7p5" Dec 03 14:37:46 crc kubenswrapper[4751]: I1203 14:37:46.577712 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ll7p5" podUID="229c80e6-2c05-4430-9093-420b6b9aa241" containerName="registry-server" probeResult="failure" output=< Dec 03 14:37:46 crc kubenswrapper[4751]: timeout: failed to connect service ":50051" within 1s Dec 03 14:37:46 crc kubenswrapper[4751]: > Dec 03 14:37:48 crc kubenswrapper[4751]: I1203 14:37:48.717743 4751 generic.go:334] "Generic (PLEG): container finished" podID="7df882c6-828a-4aac-8aa9-811102008952" containerID="22704b894a6936c8545cf27f710a115b5cf9c5d5092f3977d94ab8a65ac2ab36" exitCode=0 Dec 03 14:37:48 crc kubenswrapper[4751]: I1203 14:37:48.717808 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-h7bpp" event={"ID":"7df882c6-828a-4aac-8aa9-811102008952","Type":"ContainerDied","Data":"22704b894a6936c8545cf27f710a115b5cf9c5d5092f3977d94ab8a65ac2ab36"} Dec 03 14:37:49 crc kubenswrapper[4751]: I1203 14:37:49.410009 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:37:49 crc kubenswrapper[4751]: I1203 14:37:49.410290 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="940ea3c8-16d0-42ff-85a8-ec71601553bd" containerName="ceilometer-central-agent" containerID="cri-o://350dff996aed65de6f00b9f96191848ceb1ad334b3ba55adb3a36b8c3ec5ad46" gracePeriod=30 Dec 03 14:37:49 crc kubenswrapper[4751]: I1203 14:37:49.410312 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="940ea3c8-16d0-42ff-85a8-ec71601553bd" containerName="proxy-httpd" containerID="cri-o://46498fc88431327bd7ba26b77001c22c733e3cd66724afea08a8f4d2bbfdb184" gracePeriod=30 Dec 03 14:37:49 crc 
kubenswrapper[4751]: I1203 14:37:49.410446 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="940ea3c8-16d0-42ff-85a8-ec71601553bd" containerName="ceilometer-notification-agent" containerID="cri-o://1754a241b6d04f5ccf0cd2d25a252ac9f2894eac5924c1322050213369e4fecf" gracePeriod=30 Dec 03 14:37:49 crc kubenswrapper[4751]: I1203 14:37:49.410438 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="940ea3c8-16d0-42ff-85a8-ec71601553bd" containerName="sg-core" containerID="cri-o://3e77b278793a320df7446e9be7253f3f5545e7e2524a20744d95747ad0edab5c" gracePeriod=30 Dec 03 14:37:49 crc kubenswrapper[4751]: I1203 14:37:49.765138 4751 generic.go:334] "Generic (PLEG): container finished" podID="940ea3c8-16d0-42ff-85a8-ec71601553bd" containerID="46498fc88431327bd7ba26b77001c22c733e3cd66724afea08a8f4d2bbfdb184" exitCode=0 Dec 03 14:37:49 crc kubenswrapper[4751]: I1203 14:37:49.765457 4751 generic.go:334] "Generic (PLEG): container finished" podID="940ea3c8-16d0-42ff-85a8-ec71601553bd" containerID="3e77b278793a320df7446e9be7253f3f5545e7e2524a20744d95747ad0edab5c" exitCode=2 Dec 03 14:37:49 crc kubenswrapper[4751]: I1203 14:37:49.765253 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"940ea3c8-16d0-42ff-85a8-ec71601553bd","Type":"ContainerDied","Data":"46498fc88431327bd7ba26b77001c22c733e3cd66724afea08a8f4d2bbfdb184"} Dec 03 14:37:49 crc kubenswrapper[4751]: I1203 14:37:49.765680 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"940ea3c8-16d0-42ff-85a8-ec71601553bd","Type":"ContainerDied","Data":"3e77b278793a320df7446e9be7253f3f5545e7e2524a20744d95747ad0edab5c"} Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.191487 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-h7bpp" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.360707 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.364884 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7df882c6-828a-4aac-8aa9-811102008952-config-data\") pod \"7df882c6-828a-4aac-8aa9-811102008952\" (UID: \"7df882c6-828a-4aac-8aa9-811102008952\") " Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.365058 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7df882c6-828a-4aac-8aa9-811102008952-combined-ca-bundle\") pod \"7df882c6-828a-4aac-8aa9-811102008952\" (UID: \"7df882c6-828a-4aac-8aa9-811102008952\") " Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.365155 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7df882c6-828a-4aac-8aa9-811102008952-scripts\") pod \"7df882c6-828a-4aac-8aa9-811102008952\" (UID: \"7df882c6-828a-4aac-8aa9-811102008952\") " Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.365290 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbsmh\" (UniqueName: \"kubernetes.io/projected/7df882c6-828a-4aac-8aa9-811102008952-kube-api-access-lbsmh\") pod \"7df882c6-828a-4aac-8aa9-811102008952\" (UID: \"7df882c6-828a-4aac-8aa9-811102008952\") " Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.371190 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7df882c6-828a-4aac-8aa9-811102008952-scripts" (OuterVolumeSpecName: "scripts") pod "7df882c6-828a-4aac-8aa9-811102008952" (UID: "7df882c6-828a-4aac-8aa9-811102008952"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.371660 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7df882c6-828a-4aac-8aa9-811102008952-kube-api-access-lbsmh" (OuterVolumeSpecName: "kube-api-access-lbsmh") pod "7df882c6-828a-4aac-8aa9-811102008952" (UID: "7df882c6-828a-4aac-8aa9-811102008952"). InnerVolumeSpecName "kube-api-access-lbsmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.414917 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7df882c6-828a-4aac-8aa9-811102008952-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7df882c6-828a-4aac-8aa9-811102008952" (UID: "7df882c6-828a-4aac-8aa9-811102008952"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.415575 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7df882c6-828a-4aac-8aa9-811102008952-config-data" (OuterVolumeSpecName: "config-data") pod "7df882c6-828a-4aac-8aa9-811102008952" (UID: "7df882c6-828a-4aac-8aa9-811102008952"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.466793 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnwhn\" (UniqueName: \"kubernetes.io/projected/940ea3c8-16d0-42ff-85a8-ec71601553bd-kube-api-access-vnwhn\") pod \"940ea3c8-16d0-42ff-85a8-ec71601553bd\" (UID: \"940ea3c8-16d0-42ff-85a8-ec71601553bd\") " Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.466866 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940ea3c8-16d0-42ff-85a8-ec71601553bd-log-httpd\") pod \"940ea3c8-16d0-42ff-85a8-ec71601553bd\" (UID: \"940ea3c8-16d0-42ff-85a8-ec71601553bd\") " Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.466980 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/940ea3c8-16d0-42ff-85a8-ec71601553bd-sg-core-conf-yaml\") pod \"940ea3c8-16d0-42ff-85a8-ec71601553bd\" (UID: \"940ea3c8-16d0-42ff-85a8-ec71601553bd\") " Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.467038 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940ea3c8-16d0-42ff-85a8-ec71601553bd-run-httpd\") pod \"940ea3c8-16d0-42ff-85a8-ec71601553bd\" (UID: \"940ea3c8-16d0-42ff-85a8-ec71601553bd\") " Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.467147 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/940ea3c8-16d0-42ff-85a8-ec71601553bd-scripts\") pod \"940ea3c8-16d0-42ff-85a8-ec71601553bd\" (UID: \"940ea3c8-16d0-42ff-85a8-ec71601553bd\") " Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.467203 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/940ea3c8-16d0-42ff-85a8-ec71601553bd-config-data\") pod \"940ea3c8-16d0-42ff-85a8-ec71601553bd\" (UID: \"940ea3c8-16d0-42ff-85a8-ec71601553bd\") " Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.467235 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/940ea3c8-16d0-42ff-85a8-ec71601553bd-combined-ca-bundle\") pod \"940ea3c8-16d0-42ff-85a8-ec71601553bd\" (UID: \"940ea3c8-16d0-42ff-85a8-ec71601553bd\") " Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.467771 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7df882c6-828a-4aac-8aa9-811102008952-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.467790 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbsmh\" (UniqueName: \"kubernetes.io/projected/7df882c6-828a-4aac-8aa9-811102008952-kube-api-access-lbsmh\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.467800 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7df882c6-828a-4aac-8aa9-811102008952-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.467808 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7df882c6-828a-4aac-8aa9-811102008952-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.468930 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/940ea3c8-16d0-42ff-85a8-ec71601553bd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "940ea3c8-16d0-42ff-85a8-ec71601553bd" (UID: "940ea3c8-16d0-42ff-85a8-ec71601553bd"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.469502 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/940ea3c8-16d0-42ff-85a8-ec71601553bd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "940ea3c8-16d0-42ff-85a8-ec71601553bd" (UID: "940ea3c8-16d0-42ff-85a8-ec71601553bd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.472013 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/940ea3c8-16d0-42ff-85a8-ec71601553bd-scripts" (OuterVolumeSpecName: "scripts") pod "940ea3c8-16d0-42ff-85a8-ec71601553bd" (UID: "940ea3c8-16d0-42ff-85a8-ec71601553bd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.472130 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/940ea3c8-16d0-42ff-85a8-ec71601553bd-kube-api-access-vnwhn" (OuterVolumeSpecName: "kube-api-access-vnwhn") pod "940ea3c8-16d0-42ff-85a8-ec71601553bd" (UID: "940ea3c8-16d0-42ff-85a8-ec71601553bd"). InnerVolumeSpecName "kube-api-access-vnwhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.521957 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/940ea3c8-16d0-42ff-85a8-ec71601553bd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "940ea3c8-16d0-42ff-85a8-ec71601553bd" (UID: "940ea3c8-16d0-42ff-85a8-ec71601553bd"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.562101 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/940ea3c8-16d0-42ff-85a8-ec71601553bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "940ea3c8-16d0-42ff-85a8-ec71601553bd" (UID: "940ea3c8-16d0-42ff-85a8-ec71601553bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.569740 4751 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/940ea3c8-16d0-42ff-85a8-ec71601553bd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.569775 4751 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940ea3c8-16d0-42ff-85a8-ec71601553bd-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.569788 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/940ea3c8-16d0-42ff-85a8-ec71601553bd-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.569798 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/940ea3c8-16d0-42ff-85a8-ec71601553bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.569810 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnwhn\" (UniqueName: \"kubernetes.io/projected/940ea3c8-16d0-42ff-85a8-ec71601553bd-kube-api-access-vnwhn\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.569824 4751 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/940ea3c8-16d0-42ff-85a8-ec71601553bd-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.579935 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/940ea3c8-16d0-42ff-85a8-ec71601553bd-config-data" (OuterVolumeSpecName: "config-data") pod "940ea3c8-16d0-42ff-85a8-ec71601553bd" (UID: "940ea3c8-16d0-42ff-85a8-ec71601553bd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.671898 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/940ea3c8-16d0-42ff-85a8-ec71601553bd-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.781739 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-h7bpp" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.781754 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-h7bpp" event={"ID":"7df882c6-828a-4aac-8aa9-811102008952","Type":"ContainerDied","Data":"66118c92fd60e0801247e34a9d3db07c662ac238c010d7c69d32041e588bd180"} Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.782485 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66118c92fd60e0801247e34a9d3db07c662ac238c010d7c69d32041e588bd180" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.785577 4751 generic.go:334] "Generic (PLEG): container finished" podID="940ea3c8-16d0-42ff-85a8-ec71601553bd" containerID="1754a241b6d04f5ccf0cd2d25a252ac9f2894eac5924c1322050213369e4fecf" exitCode=0 Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.785607 4751 generic.go:334] "Generic (PLEG): container finished" podID="940ea3c8-16d0-42ff-85a8-ec71601553bd" 
containerID="350dff996aed65de6f00b9f96191848ceb1ad334b3ba55adb3a36b8c3ec5ad46" exitCode=0 Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.785630 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"940ea3c8-16d0-42ff-85a8-ec71601553bd","Type":"ContainerDied","Data":"1754a241b6d04f5ccf0cd2d25a252ac9f2894eac5924c1322050213369e4fecf"} Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.785663 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"940ea3c8-16d0-42ff-85a8-ec71601553bd","Type":"ContainerDied","Data":"350dff996aed65de6f00b9f96191848ceb1ad334b3ba55adb3a36b8c3ec5ad46"} Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.785675 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"940ea3c8-16d0-42ff-85a8-ec71601553bd","Type":"ContainerDied","Data":"0ba19e4cbbebc7fdba56ba6fda2bb69fe771319a98e54e20e457c9c99787521c"} Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.785669 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.785726 4751 scope.go:117] "RemoveContainer" containerID="46498fc88431327bd7ba26b77001c22c733e3cd66724afea08a8f4d2bbfdb184" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.808114 4751 scope.go:117] "RemoveContainer" containerID="3e77b278793a320df7446e9be7253f3f5545e7e2524a20744d95747ad0edab5c" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.835924 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.836023 4751 scope.go:117] "RemoveContainer" containerID="1754a241b6d04f5ccf0cd2d25a252ac9f2894eac5924c1322050213369e4fecf" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.852704 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.863635 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:37:50 crc kubenswrapper[4751]: E1203 14:37:50.864121 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="940ea3c8-16d0-42ff-85a8-ec71601553bd" containerName="ceilometer-notification-agent" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.864137 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="940ea3c8-16d0-42ff-85a8-ec71601553bd" containerName="ceilometer-notification-agent" Dec 03 14:37:50 crc kubenswrapper[4751]: E1203 14:37:50.864146 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="940ea3c8-16d0-42ff-85a8-ec71601553bd" containerName="proxy-httpd" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.864154 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="940ea3c8-16d0-42ff-85a8-ec71601553bd" containerName="proxy-httpd" Dec 03 14:37:50 crc kubenswrapper[4751]: E1203 14:37:50.864191 4751 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="940ea3c8-16d0-42ff-85a8-ec71601553bd" containerName="sg-core" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.864199 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="940ea3c8-16d0-42ff-85a8-ec71601553bd" containerName="sg-core" Dec 03 14:37:50 crc kubenswrapper[4751]: E1203 14:37:50.864237 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7df882c6-828a-4aac-8aa9-811102008952" containerName="nova-cell0-conductor-db-sync" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.864243 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="7df882c6-828a-4aac-8aa9-811102008952" containerName="nova-cell0-conductor-db-sync" Dec 03 14:37:50 crc kubenswrapper[4751]: E1203 14:37:50.864258 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="940ea3c8-16d0-42ff-85a8-ec71601553bd" containerName="ceilometer-central-agent" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.864265 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="940ea3c8-16d0-42ff-85a8-ec71601553bd" containerName="ceilometer-central-agent" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.864460 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="940ea3c8-16d0-42ff-85a8-ec71601553bd" containerName="sg-core" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.864481 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="940ea3c8-16d0-42ff-85a8-ec71601553bd" containerName="ceilometer-central-agent" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.864493 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="940ea3c8-16d0-42ff-85a8-ec71601553bd" containerName="ceilometer-notification-agent" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.864506 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="7df882c6-828a-4aac-8aa9-811102008952" containerName="nova-cell0-conductor-db-sync" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.864516 4751 
memory_manager.go:354] "RemoveStaleState removing state" podUID="940ea3c8-16d0-42ff-85a8-ec71601553bd" containerName="proxy-httpd" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.866588 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.870503 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.870653 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.879264 4751 scope.go:117] "RemoveContainer" containerID="350dff996aed65de6f00b9f96191848ceb1ad334b3ba55adb3a36b8c3ec5ad46" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.895258 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.897023 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.900970 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-28c95" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.901251 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.909419 4751 scope.go:117] "RemoveContainer" containerID="46498fc88431327bd7ba26b77001c22c733e3cd66724afea08a8f4d2bbfdb184" Dec 03 14:37:50 crc kubenswrapper[4751]: E1203 14:37:50.910019 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46498fc88431327bd7ba26b77001c22c733e3cd66724afea08a8f4d2bbfdb184\": container with ID starting with 46498fc88431327bd7ba26b77001c22c733e3cd66724afea08a8f4d2bbfdb184 not found: ID does not exist" containerID="46498fc88431327bd7ba26b77001c22c733e3cd66724afea08a8f4d2bbfdb184" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.910056 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46498fc88431327bd7ba26b77001c22c733e3cd66724afea08a8f4d2bbfdb184"} err="failed to get container status \"46498fc88431327bd7ba26b77001c22c733e3cd66724afea08a8f4d2bbfdb184\": rpc error: code = NotFound desc = could not find container \"46498fc88431327bd7ba26b77001c22c733e3cd66724afea08a8f4d2bbfdb184\": container with ID starting with 46498fc88431327bd7ba26b77001c22c733e3cd66724afea08a8f4d2bbfdb184 not found: ID does not exist" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.910082 4751 scope.go:117] "RemoveContainer" containerID="3e77b278793a320df7446e9be7253f3f5545e7e2524a20744d95747ad0edab5c" Dec 03 14:37:50 crc kubenswrapper[4751]: E1203 14:37:50.910614 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"3e77b278793a320df7446e9be7253f3f5545e7e2524a20744d95747ad0edab5c\": container with ID starting with 3e77b278793a320df7446e9be7253f3f5545e7e2524a20744d95747ad0edab5c not found: ID does not exist" containerID="3e77b278793a320df7446e9be7253f3f5545e7e2524a20744d95747ad0edab5c" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.910651 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e77b278793a320df7446e9be7253f3f5545e7e2524a20744d95747ad0edab5c"} err="failed to get container status \"3e77b278793a320df7446e9be7253f3f5545e7e2524a20744d95747ad0edab5c\": rpc error: code = NotFound desc = could not find container \"3e77b278793a320df7446e9be7253f3f5545e7e2524a20744d95747ad0edab5c\": container with ID starting with 3e77b278793a320df7446e9be7253f3f5545e7e2524a20744d95747ad0edab5c not found: ID does not exist" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.910672 4751 scope.go:117] "RemoveContainer" containerID="1754a241b6d04f5ccf0cd2d25a252ac9f2894eac5924c1322050213369e4fecf" Dec 03 14:37:50 crc kubenswrapper[4751]: E1203 14:37:50.910951 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1754a241b6d04f5ccf0cd2d25a252ac9f2894eac5924c1322050213369e4fecf\": container with ID starting with 1754a241b6d04f5ccf0cd2d25a252ac9f2894eac5924c1322050213369e4fecf not found: ID does not exist" containerID="1754a241b6d04f5ccf0cd2d25a252ac9f2894eac5924c1322050213369e4fecf" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.910982 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1754a241b6d04f5ccf0cd2d25a252ac9f2894eac5924c1322050213369e4fecf"} err="failed to get container status \"1754a241b6d04f5ccf0cd2d25a252ac9f2894eac5924c1322050213369e4fecf\": rpc error: code = NotFound desc = could not find container \"1754a241b6d04f5ccf0cd2d25a252ac9f2894eac5924c1322050213369e4fecf\": 
container with ID starting with 1754a241b6d04f5ccf0cd2d25a252ac9f2894eac5924c1322050213369e4fecf not found: ID does not exist" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.910998 4751 scope.go:117] "RemoveContainer" containerID="350dff996aed65de6f00b9f96191848ceb1ad334b3ba55adb3a36b8c3ec5ad46" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.911272 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 14:37:50 crc kubenswrapper[4751]: E1203 14:37:50.912511 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"350dff996aed65de6f00b9f96191848ceb1ad334b3ba55adb3a36b8c3ec5ad46\": container with ID starting with 350dff996aed65de6f00b9f96191848ceb1ad334b3ba55adb3a36b8c3ec5ad46 not found: ID does not exist" containerID="350dff996aed65de6f00b9f96191848ceb1ad334b3ba55adb3a36b8c3ec5ad46" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.912550 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"350dff996aed65de6f00b9f96191848ceb1ad334b3ba55adb3a36b8c3ec5ad46"} err="failed to get container status \"350dff996aed65de6f00b9f96191848ceb1ad334b3ba55adb3a36b8c3ec5ad46\": rpc error: code = NotFound desc = could not find container \"350dff996aed65de6f00b9f96191848ceb1ad334b3ba55adb3a36b8c3ec5ad46\": container with ID starting with 350dff996aed65de6f00b9f96191848ceb1ad334b3ba55adb3a36b8c3ec5ad46 not found: ID does not exist" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.912573 4751 scope.go:117] "RemoveContainer" containerID="46498fc88431327bd7ba26b77001c22c733e3cd66724afea08a8f4d2bbfdb184" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.913054 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46498fc88431327bd7ba26b77001c22c733e3cd66724afea08a8f4d2bbfdb184"} err="failed to get container status 
\"46498fc88431327bd7ba26b77001c22c733e3cd66724afea08a8f4d2bbfdb184\": rpc error: code = NotFound desc = could not find container \"46498fc88431327bd7ba26b77001c22c733e3cd66724afea08a8f4d2bbfdb184\": container with ID starting with 46498fc88431327bd7ba26b77001c22c733e3cd66724afea08a8f4d2bbfdb184 not found: ID does not exist" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.913106 4751 scope.go:117] "RemoveContainer" containerID="3e77b278793a320df7446e9be7253f3f5545e7e2524a20744d95747ad0edab5c" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.913403 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e77b278793a320df7446e9be7253f3f5545e7e2524a20744d95747ad0edab5c"} err="failed to get container status \"3e77b278793a320df7446e9be7253f3f5545e7e2524a20744d95747ad0edab5c\": rpc error: code = NotFound desc = could not find container \"3e77b278793a320df7446e9be7253f3f5545e7e2524a20744d95747ad0edab5c\": container with ID starting with 3e77b278793a320df7446e9be7253f3f5545e7e2524a20744d95747ad0edab5c not found: ID does not exist" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.913448 4751 scope.go:117] "RemoveContainer" containerID="1754a241b6d04f5ccf0cd2d25a252ac9f2894eac5924c1322050213369e4fecf" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.913705 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1754a241b6d04f5ccf0cd2d25a252ac9f2894eac5924c1322050213369e4fecf"} err="failed to get container status \"1754a241b6d04f5ccf0cd2d25a252ac9f2894eac5924c1322050213369e4fecf\": rpc error: code = NotFound desc = could not find container \"1754a241b6d04f5ccf0cd2d25a252ac9f2894eac5924c1322050213369e4fecf\": container with ID starting with 1754a241b6d04f5ccf0cd2d25a252ac9f2894eac5924c1322050213369e4fecf not found: ID does not exist" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.913728 4751 scope.go:117] "RemoveContainer" 
containerID="350dff996aed65de6f00b9f96191848ceb1ad334b3ba55adb3a36b8c3ec5ad46" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.913913 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"350dff996aed65de6f00b9f96191848ceb1ad334b3ba55adb3a36b8c3ec5ad46"} err="failed to get container status \"350dff996aed65de6f00b9f96191848ceb1ad334b3ba55adb3a36b8c3ec5ad46\": rpc error: code = NotFound desc = could not find container \"350dff996aed65de6f00b9f96191848ceb1ad334b3ba55adb3a36b8c3ec5ad46\": container with ID starting with 350dff996aed65de6f00b9f96191848ceb1ad334b3ba55adb3a36b8c3ec5ad46 not found: ID does not exist" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.925137 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.977722 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj5cc\" (UniqueName: \"kubernetes.io/projected/a9f00240-53f7-417a-9403-112cb396c30f-kube-api-access-lj5cc\") pod \"ceilometer-0\" (UID: \"a9f00240-53f7-417a-9403-112cb396c30f\") " pod="openstack/ceilometer-0" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.977777 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9f00240-53f7-417a-9403-112cb396c30f-run-httpd\") pod \"ceilometer-0\" (UID: \"a9f00240-53f7-417a-9403-112cb396c30f\") " pod="openstack/ceilometer-0" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.977866 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9f00240-53f7-417a-9403-112cb396c30f-log-httpd\") pod \"ceilometer-0\" (UID: \"a9f00240-53f7-417a-9403-112cb396c30f\") " pod="openstack/ceilometer-0" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 
14:37:50.977913 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9f00240-53f7-417a-9403-112cb396c30f-scripts\") pod \"ceilometer-0\" (UID: \"a9f00240-53f7-417a-9403-112cb396c30f\") " pod="openstack/ceilometer-0" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.977954 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f00240-53f7-417a-9403-112cb396c30f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a9f00240-53f7-417a-9403-112cb396c30f\") " pod="openstack/ceilometer-0" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.978011 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a9f00240-53f7-417a-9403-112cb396c30f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a9f00240-53f7-417a-9403-112cb396c30f\") " pod="openstack/ceilometer-0" Dec 03 14:37:50 crc kubenswrapper[4751]: I1203 14:37:50.978055 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9f00240-53f7-417a-9403-112cb396c30f-config-data\") pod \"ceilometer-0\" (UID: \"a9f00240-53f7-417a-9403-112cb396c30f\") " pod="openstack/ceilometer-0" Dec 03 14:37:51 crc kubenswrapper[4751]: I1203 14:37:51.079823 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9f00240-53f7-417a-9403-112cb396c30f-run-httpd\") pod \"ceilometer-0\" (UID: \"a9f00240-53f7-417a-9403-112cb396c30f\") " pod="openstack/ceilometer-0" Dec 03 14:37:51 crc kubenswrapper[4751]: I1203 14:37:51.079929 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a9f00240-53f7-417a-9403-112cb396c30f-log-httpd\") pod \"ceilometer-0\" (UID: \"a9f00240-53f7-417a-9403-112cb396c30f\") " pod="openstack/ceilometer-0" Dec 03 14:37:51 crc kubenswrapper[4751]: I1203 14:37:51.079963 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bac48c68-2c8f-47ff-8f11-7974913dbac1-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bac48c68-2c8f-47ff-8f11-7974913dbac1\") " pod="openstack/nova-cell0-conductor-0" Dec 03 14:37:51 crc kubenswrapper[4751]: I1203 14:37:51.079989 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9f00240-53f7-417a-9403-112cb396c30f-scripts\") pod \"ceilometer-0\" (UID: \"a9f00240-53f7-417a-9403-112cb396c30f\") " pod="openstack/ceilometer-0" Dec 03 14:37:51 crc kubenswrapper[4751]: I1203 14:37:51.080025 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f00240-53f7-417a-9403-112cb396c30f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a9f00240-53f7-417a-9403-112cb396c30f\") " pod="openstack/ceilometer-0" Dec 03 14:37:51 crc kubenswrapper[4751]: I1203 14:37:51.080043 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bac48c68-2c8f-47ff-8f11-7974913dbac1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bac48c68-2c8f-47ff-8f11-7974913dbac1\") " pod="openstack/nova-cell0-conductor-0" Dec 03 14:37:51 crc kubenswrapper[4751]: I1203 14:37:51.080088 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klhlf\" (UniqueName: \"kubernetes.io/projected/bac48c68-2c8f-47ff-8f11-7974913dbac1-kube-api-access-klhlf\") pod \"nova-cell0-conductor-0\" (UID: 
\"bac48c68-2c8f-47ff-8f11-7974913dbac1\") " pod="openstack/nova-cell0-conductor-0" Dec 03 14:37:51 crc kubenswrapper[4751]: I1203 14:37:51.080117 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a9f00240-53f7-417a-9403-112cb396c30f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a9f00240-53f7-417a-9403-112cb396c30f\") " pod="openstack/ceilometer-0" Dec 03 14:37:51 crc kubenswrapper[4751]: I1203 14:37:51.080178 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9f00240-53f7-417a-9403-112cb396c30f-config-data\") pod \"ceilometer-0\" (UID: \"a9f00240-53f7-417a-9403-112cb396c30f\") " pod="openstack/ceilometer-0" Dec 03 14:37:51 crc kubenswrapper[4751]: I1203 14:37:51.080213 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj5cc\" (UniqueName: \"kubernetes.io/projected/a9f00240-53f7-417a-9403-112cb396c30f-kube-api-access-lj5cc\") pod \"ceilometer-0\" (UID: \"a9f00240-53f7-417a-9403-112cb396c30f\") " pod="openstack/ceilometer-0" Dec 03 14:37:51 crc kubenswrapper[4751]: I1203 14:37:51.080456 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9f00240-53f7-417a-9403-112cb396c30f-run-httpd\") pod \"ceilometer-0\" (UID: \"a9f00240-53f7-417a-9403-112cb396c30f\") " pod="openstack/ceilometer-0" Dec 03 14:37:51 crc kubenswrapper[4751]: I1203 14:37:51.080564 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9f00240-53f7-417a-9403-112cb396c30f-log-httpd\") pod \"ceilometer-0\" (UID: \"a9f00240-53f7-417a-9403-112cb396c30f\") " pod="openstack/ceilometer-0" Dec 03 14:37:51 crc kubenswrapper[4751]: I1203 14:37:51.085849 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/a9f00240-53f7-417a-9403-112cb396c30f-scripts\") pod \"ceilometer-0\" (UID: \"a9f00240-53f7-417a-9403-112cb396c30f\") " pod="openstack/ceilometer-0" Dec 03 14:37:51 crc kubenswrapper[4751]: I1203 14:37:51.086284 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9f00240-53f7-417a-9403-112cb396c30f-config-data\") pod \"ceilometer-0\" (UID: \"a9f00240-53f7-417a-9403-112cb396c30f\") " pod="openstack/ceilometer-0" Dec 03 14:37:51 crc kubenswrapper[4751]: I1203 14:37:51.086842 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f00240-53f7-417a-9403-112cb396c30f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a9f00240-53f7-417a-9403-112cb396c30f\") " pod="openstack/ceilometer-0" Dec 03 14:37:51 crc kubenswrapper[4751]: I1203 14:37:51.087264 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a9f00240-53f7-417a-9403-112cb396c30f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a9f00240-53f7-417a-9403-112cb396c30f\") " pod="openstack/ceilometer-0" Dec 03 14:37:51 crc kubenswrapper[4751]: I1203 14:37:51.102038 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj5cc\" (UniqueName: \"kubernetes.io/projected/a9f00240-53f7-417a-9403-112cb396c30f-kube-api-access-lj5cc\") pod \"ceilometer-0\" (UID: \"a9f00240-53f7-417a-9403-112cb396c30f\") " pod="openstack/ceilometer-0" Dec 03 14:37:51 crc kubenswrapper[4751]: I1203 14:37:51.181742 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bac48c68-2c8f-47ff-8f11-7974913dbac1-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bac48c68-2c8f-47ff-8f11-7974913dbac1\") " pod="openstack/nova-cell0-conductor-0" Dec 03 14:37:51 crc 
kubenswrapper[4751]: I1203 14:37:51.182184 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bac48c68-2c8f-47ff-8f11-7974913dbac1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bac48c68-2c8f-47ff-8f11-7974913dbac1\") " pod="openstack/nova-cell0-conductor-0" Dec 03 14:37:51 crc kubenswrapper[4751]: I1203 14:37:51.182810 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klhlf\" (UniqueName: \"kubernetes.io/projected/bac48c68-2c8f-47ff-8f11-7974913dbac1-kube-api-access-klhlf\") pod \"nova-cell0-conductor-0\" (UID: \"bac48c68-2c8f-47ff-8f11-7974913dbac1\") " pod="openstack/nova-cell0-conductor-0" Dec 03 14:37:51 crc kubenswrapper[4751]: I1203 14:37:51.188051 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bac48c68-2c8f-47ff-8f11-7974913dbac1-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bac48c68-2c8f-47ff-8f11-7974913dbac1\") " pod="openstack/nova-cell0-conductor-0" Dec 03 14:37:51 crc kubenswrapper[4751]: I1203 14:37:51.188059 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bac48c68-2c8f-47ff-8f11-7974913dbac1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bac48c68-2c8f-47ff-8f11-7974913dbac1\") " pod="openstack/nova-cell0-conductor-0" Dec 03 14:37:51 crc kubenswrapper[4751]: I1203 14:37:51.197529 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:37:51 crc kubenswrapper[4751]: I1203 14:37:51.198523 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klhlf\" (UniqueName: \"kubernetes.io/projected/bac48c68-2c8f-47ff-8f11-7974913dbac1-kube-api-access-klhlf\") pod \"nova-cell0-conductor-0\" (UID: \"bac48c68-2c8f-47ff-8f11-7974913dbac1\") " pod="openstack/nova-cell0-conductor-0" Dec 03 14:37:51 crc kubenswrapper[4751]: I1203 14:37:51.225838 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 03 14:37:51 crc kubenswrapper[4751]: I1203 14:37:51.328220 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="940ea3c8-16d0-42ff-85a8-ec71601553bd" path="/var/lib/kubelet/pods/940ea3c8-16d0-42ff-85a8-ec71601553bd/volumes" Dec 03 14:37:51 crc kubenswrapper[4751]: I1203 14:37:51.764891 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:37:51 crc kubenswrapper[4751]: I1203 14:37:51.801856 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9f00240-53f7-417a-9403-112cb396c30f","Type":"ContainerStarted","Data":"c9e74ff95a2e2aebc531245f00b55d46376011c5dbfc60a40394eb657fc82328"} Dec 03 14:37:51 crc kubenswrapper[4751]: I1203 14:37:51.866514 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 14:37:52 crc kubenswrapper[4751]: I1203 14:37:52.816346 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bac48c68-2c8f-47ff-8f11-7974913dbac1","Type":"ContainerStarted","Data":"78a2455249ea725169e38603e573a3431cc15cb5e9a5a5e107aec8a2c65cff8d"} Dec 03 14:37:52 crc kubenswrapper[4751]: I1203 14:37:52.817065 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" 
event={"ID":"bac48c68-2c8f-47ff-8f11-7974913dbac1","Type":"ContainerStarted","Data":"a65c8569c6310a3e32896a2dde08cfd15c564cc83cec6bf97e189d4d075b7c70"} Dec 03 14:37:52 crc kubenswrapper[4751]: I1203 14:37:52.820996 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 03 14:37:52 crc kubenswrapper[4751]: I1203 14:37:52.876195 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.876167266 podStartE2EDuration="2.876167266s" podCreationTimestamp="2025-12-03 14:37:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:37:52.860162362 +0000 UTC m=+1479.848517579" watchObservedRunningTime="2025-12-03 14:37:52.876167266 +0000 UTC m=+1479.864522483" Dec 03 14:37:53 crc kubenswrapper[4751]: I1203 14:37:53.828949 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9f00240-53f7-417a-9403-112cb396c30f","Type":"ContainerStarted","Data":"f82b1ad2e53c84f607abc2d7b971da8c649a376bb7eff688caf503be547a9562"} Dec 03 14:37:53 crc kubenswrapper[4751]: I1203 14:37:53.829296 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9f00240-53f7-417a-9403-112cb396c30f","Type":"ContainerStarted","Data":"59b012f07e2d23c2347b9c40b118fbf7ee292515a7834146b402892452c0f290"} Dec 03 14:37:54 crc kubenswrapper[4751]: I1203 14:37:54.883060 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9f00240-53f7-417a-9403-112cb396c30f","Type":"ContainerStarted","Data":"e80516920620e85189f41b0187e811d4afb78d0b1069cc32925dff381ea420e2"} Dec 03 14:37:55 crc kubenswrapper[4751]: I1203 14:37:55.601740 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ll7p5" Dec 03 14:37:55 crc 
kubenswrapper[4751]: I1203 14:37:55.689880 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ll7p5" Dec 03 14:37:55 crc kubenswrapper[4751]: I1203 14:37:55.846565 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ll7p5"] Dec 03 14:37:55 crc kubenswrapper[4751]: I1203 14:37:55.890962 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9f00240-53f7-417a-9403-112cb396c30f","Type":"ContainerStarted","Data":"e18f4b082bed69fa32bb02d4bb5ba672a38457262e78969ced37986923174f40"} Dec 03 14:37:55 crc kubenswrapper[4751]: I1203 14:37:55.930065 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.203737161 podStartE2EDuration="5.930039164s" podCreationTimestamp="2025-12-03 14:37:50 +0000 UTC" firstStartedPulling="2025-12-03 14:37:51.763967147 +0000 UTC m=+1478.752322354" lastFinishedPulling="2025-12-03 14:37:55.49026915 +0000 UTC m=+1482.478624357" observedRunningTime="2025-12-03 14:37:55.917009419 +0000 UTC m=+1482.905364646" watchObservedRunningTime="2025-12-03 14:37:55.930039164 +0000 UTC m=+1482.918394381" Dec 03 14:37:56 crc kubenswrapper[4751]: I1203 14:37:56.901961 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ll7p5" podUID="229c80e6-2c05-4430-9093-420b6b9aa241" containerName="registry-server" containerID="cri-o://1150f50cd2179fda606f9cfb6ea0270aefcbf0aec879407916f2da7bcee17d59" gracePeriod=2 Dec 03 14:37:56 crc kubenswrapper[4751]: I1203 14:37:56.902381 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 14:37:57 crc kubenswrapper[4751]: I1203 14:37:57.477890 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ll7p5" Dec 03 14:37:57 crc kubenswrapper[4751]: I1203 14:37:57.529151 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqgpv\" (UniqueName: \"kubernetes.io/projected/229c80e6-2c05-4430-9093-420b6b9aa241-kube-api-access-tqgpv\") pod \"229c80e6-2c05-4430-9093-420b6b9aa241\" (UID: \"229c80e6-2c05-4430-9093-420b6b9aa241\") " Dec 03 14:37:57 crc kubenswrapper[4751]: I1203 14:37:57.529239 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/229c80e6-2c05-4430-9093-420b6b9aa241-catalog-content\") pod \"229c80e6-2c05-4430-9093-420b6b9aa241\" (UID: \"229c80e6-2c05-4430-9093-420b6b9aa241\") " Dec 03 14:37:57 crc kubenswrapper[4751]: I1203 14:37:57.529485 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/229c80e6-2c05-4430-9093-420b6b9aa241-utilities\") pod \"229c80e6-2c05-4430-9093-420b6b9aa241\" (UID: \"229c80e6-2c05-4430-9093-420b6b9aa241\") " Dec 03 14:37:57 crc kubenswrapper[4751]: I1203 14:37:57.530615 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/229c80e6-2c05-4430-9093-420b6b9aa241-utilities" (OuterVolumeSpecName: "utilities") pod "229c80e6-2c05-4430-9093-420b6b9aa241" (UID: "229c80e6-2c05-4430-9093-420b6b9aa241"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:37:57 crc kubenswrapper[4751]: I1203 14:37:57.537090 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/229c80e6-2c05-4430-9093-420b6b9aa241-kube-api-access-tqgpv" (OuterVolumeSpecName: "kube-api-access-tqgpv") pod "229c80e6-2c05-4430-9093-420b6b9aa241" (UID: "229c80e6-2c05-4430-9093-420b6b9aa241"). InnerVolumeSpecName "kube-api-access-tqgpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:37:57 crc kubenswrapper[4751]: I1203 14:37:57.631864 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/229c80e6-2c05-4430-9093-420b6b9aa241-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:57 crc kubenswrapper[4751]: I1203 14:37:57.631904 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqgpv\" (UniqueName: \"kubernetes.io/projected/229c80e6-2c05-4430-9093-420b6b9aa241-kube-api-access-tqgpv\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:57 crc kubenswrapper[4751]: I1203 14:37:57.634898 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/229c80e6-2c05-4430-9093-420b6b9aa241-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "229c80e6-2c05-4430-9093-420b6b9aa241" (UID: "229c80e6-2c05-4430-9093-420b6b9aa241"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:37:57 crc kubenswrapper[4751]: I1203 14:37:57.734438 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/229c80e6-2c05-4430-9093-420b6b9aa241-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:57 crc kubenswrapper[4751]: I1203 14:37:57.915792 4751 generic.go:334] "Generic (PLEG): container finished" podID="229c80e6-2c05-4430-9093-420b6b9aa241" containerID="1150f50cd2179fda606f9cfb6ea0270aefcbf0aec879407916f2da7bcee17d59" exitCode=0 Dec 03 14:37:57 crc kubenswrapper[4751]: I1203 14:37:57.915874 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ll7p5" Dec 03 14:37:57 crc kubenswrapper[4751]: I1203 14:37:57.915874 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ll7p5" event={"ID":"229c80e6-2c05-4430-9093-420b6b9aa241","Type":"ContainerDied","Data":"1150f50cd2179fda606f9cfb6ea0270aefcbf0aec879407916f2da7bcee17d59"} Dec 03 14:37:57 crc kubenswrapper[4751]: I1203 14:37:57.916116 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ll7p5" event={"ID":"229c80e6-2c05-4430-9093-420b6b9aa241","Type":"ContainerDied","Data":"8596fc79e8a1489a7cb7db331dce15565f463bf596b03ca08eb0039c849a2686"} Dec 03 14:37:57 crc kubenswrapper[4751]: I1203 14:37:57.916208 4751 scope.go:117] "RemoveContainer" containerID="1150f50cd2179fda606f9cfb6ea0270aefcbf0aec879407916f2da7bcee17d59" Dec 03 14:37:57 crc kubenswrapper[4751]: I1203 14:37:57.939873 4751 scope.go:117] "RemoveContainer" containerID="13b4c5af6b7843b0f8b0f66f05a34e193bb1f00984b4894606c2b7c03aaaaf66" Dec 03 14:37:57 crc kubenswrapper[4751]: I1203 14:37:57.962698 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ll7p5"] Dec 03 14:37:57 crc kubenswrapper[4751]: I1203 14:37:57.973899 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ll7p5"] Dec 03 14:37:57 crc kubenswrapper[4751]: I1203 14:37:57.977451 4751 scope.go:117] "RemoveContainer" containerID="ea787abbf1b2856a2fa2fd57a7eb2c8c7e8b9189389de451af50f671597be3d6" Dec 03 14:37:58 crc kubenswrapper[4751]: I1203 14:37:58.036913 4751 scope.go:117] "RemoveContainer" containerID="1150f50cd2179fda606f9cfb6ea0270aefcbf0aec879407916f2da7bcee17d59" Dec 03 14:37:58 crc kubenswrapper[4751]: E1203 14:37:58.038211 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1150f50cd2179fda606f9cfb6ea0270aefcbf0aec879407916f2da7bcee17d59\": container with ID starting with 1150f50cd2179fda606f9cfb6ea0270aefcbf0aec879407916f2da7bcee17d59 not found: ID does not exist" containerID="1150f50cd2179fda606f9cfb6ea0270aefcbf0aec879407916f2da7bcee17d59" Dec 03 14:37:58 crc kubenswrapper[4751]: I1203 14:37:58.038245 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1150f50cd2179fda606f9cfb6ea0270aefcbf0aec879407916f2da7bcee17d59"} err="failed to get container status \"1150f50cd2179fda606f9cfb6ea0270aefcbf0aec879407916f2da7bcee17d59\": rpc error: code = NotFound desc = could not find container \"1150f50cd2179fda606f9cfb6ea0270aefcbf0aec879407916f2da7bcee17d59\": container with ID starting with 1150f50cd2179fda606f9cfb6ea0270aefcbf0aec879407916f2da7bcee17d59 not found: ID does not exist" Dec 03 14:37:58 crc kubenswrapper[4751]: I1203 14:37:58.038269 4751 scope.go:117] "RemoveContainer" containerID="13b4c5af6b7843b0f8b0f66f05a34e193bb1f00984b4894606c2b7c03aaaaf66" Dec 03 14:37:58 crc kubenswrapper[4751]: E1203 14:37:58.038552 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13b4c5af6b7843b0f8b0f66f05a34e193bb1f00984b4894606c2b7c03aaaaf66\": container with ID starting with 13b4c5af6b7843b0f8b0f66f05a34e193bb1f00984b4894606c2b7c03aaaaf66 not found: ID does not exist" containerID="13b4c5af6b7843b0f8b0f66f05a34e193bb1f00984b4894606c2b7c03aaaaf66" Dec 03 14:37:58 crc kubenswrapper[4751]: I1203 14:37:58.038578 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13b4c5af6b7843b0f8b0f66f05a34e193bb1f00984b4894606c2b7c03aaaaf66"} err="failed to get container status \"13b4c5af6b7843b0f8b0f66f05a34e193bb1f00984b4894606c2b7c03aaaaf66\": rpc error: code = NotFound desc = could not find container \"13b4c5af6b7843b0f8b0f66f05a34e193bb1f00984b4894606c2b7c03aaaaf66\": container with ID 
starting with 13b4c5af6b7843b0f8b0f66f05a34e193bb1f00984b4894606c2b7c03aaaaf66 not found: ID does not exist" Dec 03 14:37:58 crc kubenswrapper[4751]: I1203 14:37:58.038594 4751 scope.go:117] "RemoveContainer" containerID="ea787abbf1b2856a2fa2fd57a7eb2c8c7e8b9189389de451af50f671597be3d6" Dec 03 14:37:58 crc kubenswrapper[4751]: E1203 14:37:58.038843 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea787abbf1b2856a2fa2fd57a7eb2c8c7e8b9189389de451af50f671597be3d6\": container with ID starting with ea787abbf1b2856a2fa2fd57a7eb2c8c7e8b9189389de451af50f671597be3d6 not found: ID does not exist" containerID="ea787abbf1b2856a2fa2fd57a7eb2c8c7e8b9189389de451af50f671597be3d6" Dec 03 14:37:58 crc kubenswrapper[4751]: I1203 14:37:58.038864 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea787abbf1b2856a2fa2fd57a7eb2c8c7e8b9189389de451af50f671597be3d6"} err="failed to get container status \"ea787abbf1b2856a2fa2fd57a7eb2c8c7e8b9189389de451af50f671597be3d6\": rpc error: code = NotFound desc = could not find container \"ea787abbf1b2856a2fa2fd57a7eb2c8c7e8b9189389de451af50f671597be3d6\": container with ID starting with ea787abbf1b2856a2fa2fd57a7eb2c8c7e8b9189389de451af50f671597be3d6 not found: ID does not exist" Dec 03 14:37:59 crc kubenswrapper[4751]: I1203 14:37:59.325531 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="229c80e6-2c05-4430-9093-420b6b9aa241" path="/var/lib/kubelet/pods/229c80e6-2c05-4430-9093-420b6b9aa241/volumes" Dec 03 14:38:01 crc kubenswrapper[4751]: I1203 14:38:01.253880 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 03 14:38:01 crc kubenswrapper[4751]: I1203 14:38:01.789492 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-ccbpw"] Dec 03 14:38:01 crc kubenswrapper[4751]: E1203 14:38:01.790012 
4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="229c80e6-2c05-4430-9093-420b6b9aa241" containerName="registry-server" Dec 03 14:38:01 crc kubenswrapper[4751]: I1203 14:38:01.790039 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="229c80e6-2c05-4430-9093-420b6b9aa241" containerName="registry-server" Dec 03 14:38:01 crc kubenswrapper[4751]: E1203 14:38:01.790058 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="229c80e6-2c05-4430-9093-420b6b9aa241" containerName="extract-utilities" Dec 03 14:38:01 crc kubenswrapper[4751]: I1203 14:38:01.790068 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="229c80e6-2c05-4430-9093-420b6b9aa241" containerName="extract-utilities" Dec 03 14:38:01 crc kubenswrapper[4751]: E1203 14:38:01.790087 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="229c80e6-2c05-4430-9093-420b6b9aa241" containerName="extract-content" Dec 03 14:38:01 crc kubenswrapper[4751]: I1203 14:38:01.790095 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="229c80e6-2c05-4430-9093-420b6b9aa241" containerName="extract-content" Dec 03 14:38:01 crc kubenswrapper[4751]: I1203 14:38:01.790400 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="229c80e6-2c05-4430-9093-420b6b9aa241" containerName="registry-server" Dec 03 14:38:01 crc kubenswrapper[4751]: I1203 14:38:01.791368 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ccbpw" Dec 03 14:38:01 crc kubenswrapper[4751]: I1203 14:38:01.795737 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 03 14:38:01 crc kubenswrapper[4751]: I1203 14:38:01.796499 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 03 14:38:01 crc kubenswrapper[4751]: I1203 14:38:01.807927 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-ccbpw"] Dec 03 14:38:01 crc kubenswrapper[4751]: I1203 14:38:01.848522 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57bcc78c-1540-47ee-82f6-664aff4f6216-config-data\") pod \"nova-cell0-cell-mapping-ccbpw\" (UID: \"57bcc78c-1540-47ee-82f6-664aff4f6216\") " pod="openstack/nova-cell0-cell-mapping-ccbpw" Dec 03 14:38:01 crc kubenswrapper[4751]: I1203 14:38:01.848603 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57bcc78c-1540-47ee-82f6-664aff4f6216-scripts\") pod \"nova-cell0-cell-mapping-ccbpw\" (UID: \"57bcc78c-1540-47ee-82f6-664aff4f6216\") " pod="openstack/nova-cell0-cell-mapping-ccbpw" Dec 03 14:38:01 crc kubenswrapper[4751]: I1203 14:38:01.848654 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57bcc78c-1540-47ee-82f6-664aff4f6216-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-ccbpw\" (UID: \"57bcc78c-1540-47ee-82f6-664aff4f6216\") " pod="openstack/nova-cell0-cell-mapping-ccbpw" Dec 03 14:38:01 crc kubenswrapper[4751]: I1203 14:38:01.848953 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg2c4\" (UniqueName: 
\"kubernetes.io/projected/57bcc78c-1540-47ee-82f6-664aff4f6216-kube-api-access-xg2c4\") pod \"nova-cell0-cell-mapping-ccbpw\" (UID: \"57bcc78c-1540-47ee-82f6-664aff4f6216\") " pod="openstack/nova-cell0-cell-mapping-ccbpw" Dec 03 14:38:01 crc kubenswrapper[4751]: I1203 14:38:01.955733 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg2c4\" (UniqueName: \"kubernetes.io/projected/57bcc78c-1540-47ee-82f6-664aff4f6216-kube-api-access-xg2c4\") pod \"nova-cell0-cell-mapping-ccbpw\" (UID: \"57bcc78c-1540-47ee-82f6-664aff4f6216\") " pod="openstack/nova-cell0-cell-mapping-ccbpw" Dec 03 14:38:01 crc kubenswrapper[4751]: I1203 14:38:01.955995 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57bcc78c-1540-47ee-82f6-664aff4f6216-config-data\") pod \"nova-cell0-cell-mapping-ccbpw\" (UID: \"57bcc78c-1540-47ee-82f6-664aff4f6216\") " pod="openstack/nova-cell0-cell-mapping-ccbpw" Dec 03 14:38:01 crc kubenswrapper[4751]: I1203 14:38:01.956085 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57bcc78c-1540-47ee-82f6-664aff4f6216-scripts\") pod \"nova-cell0-cell-mapping-ccbpw\" (UID: \"57bcc78c-1540-47ee-82f6-664aff4f6216\") " pod="openstack/nova-cell0-cell-mapping-ccbpw" Dec 03 14:38:01 crc kubenswrapper[4751]: I1203 14:38:01.956162 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57bcc78c-1540-47ee-82f6-664aff4f6216-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-ccbpw\" (UID: \"57bcc78c-1540-47ee-82f6-664aff4f6216\") " pod="openstack/nova-cell0-cell-mapping-ccbpw" Dec 03 14:38:01 crc kubenswrapper[4751]: I1203 14:38:01.971519 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/57bcc78c-1540-47ee-82f6-664aff4f6216-config-data\") pod \"nova-cell0-cell-mapping-ccbpw\" (UID: \"57bcc78c-1540-47ee-82f6-664aff4f6216\") " pod="openstack/nova-cell0-cell-mapping-ccbpw" Dec 03 14:38:01 crc kubenswrapper[4751]: I1203 14:38:01.971538 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57bcc78c-1540-47ee-82f6-664aff4f6216-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-ccbpw\" (UID: \"57bcc78c-1540-47ee-82f6-664aff4f6216\") " pod="openstack/nova-cell0-cell-mapping-ccbpw" Dec 03 14:38:01 crc kubenswrapper[4751]: I1203 14:38:01.972120 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57bcc78c-1540-47ee-82f6-664aff4f6216-scripts\") pod \"nova-cell0-cell-mapping-ccbpw\" (UID: \"57bcc78c-1540-47ee-82f6-664aff4f6216\") " pod="openstack/nova-cell0-cell-mapping-ccbpw" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.005245 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg2c4\" (UniqueName: \"kubernetes.io/projected/57bcc78c-1540-47ee-82f6-664aff4f6216-kube-api-access-xg2c4\") pod \"nova-cell0-cell-mapping-ccbpw\" (UID: \"57bcc78c-1540-47ee-82f6-664aff4f6216\") " pod="openstack/nova-cell0-cell-mapping-ccbpw" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.012774 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.015272 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.022304 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.025208 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.117480 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ccbpw" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.144554 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.145970 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.152819 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.165269 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a4ea42-438e-40ad-b3fa-71db5808ff98-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"14a4ea42-438e-40ad-b3fa-71db5808ff98\") " pod="openstack/nova-api-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.165393 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a4ea42-438e-40ad-b3fa-71db5808ff98-config-data\") pod \"nova-api-0\" (UID: \"14a4ea42-438e-40ad-b3fa-71db5808ff98\") " pod="openstack/nova-api-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.165431 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v4lg\" (UniqueName: 
\"kubernetes.io/projected/14a4ea42-438e-40ad-b3fa-71db5808ff98-kube-api-access-2v4lg\") pod \"nova-api-0\" (UID: \"14a4ea42-438e-40ad-b3fa-71db5808ff98\") " pod="openstack/nova-api-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.165489 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14a4ea42-438e-40ad-b3fa-71db5808ff98-logs\") pod \"nova-api-0\" (UID: \"14a4ea42-438e-40ad-b3fa-71db5808ff98\") " pod="openstack/nova-api-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.175059 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.202659 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.204610 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.207689 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.243294 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.271751 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a4ea42-438e-40ad-b3fa-71db5808ff98-config-data\") pod \"nova-api-0\" (UID: \"14a4ea42-438e-40ad-b3fa-71db5808ff98\") " pod="openstack/nova-api-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.271838 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v4lg\" (UniqueName: \"kubernetes.io/projected/14a4ea42-438e-40ad-b3fa-71db5808ff98-kube-api-access-2v4lg\") pod \"nova-api-0\" (UID: 
\"14a4ea42-438e-40ad-b3fa-71db5808ff98\") " pod="openstack/nova-api-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.271953 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c57ba59a-8492-439e-8383-89298aa3c6ed-config-data\") pod \"nova-metadata-0\" (UID: \"c57ba59a-8492-439e-8383-89298aa3c6ed\") " pod="openstack/nova-metadata-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.272009 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14a4ea42-438e-40ad-b3fa-71db5808ff98-logs\") pod \"nova-api-0\" (UID: \"14a4ea42-438e-40ad-b3fa-71db5808ff98\") " pod="openstack/nova-api-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.272029 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c57ba59a-8492-439e-8383-89298aa3c6ed-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c57ba59a-8492-439e-8383-89298aa3c6ed\") " pod="openstack/nova-metadata-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.272046 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lx6l\" (UniqueName: \"kubernetes.io/projected/c57ba59a-8492-439e-8383-89298aa3c6ed-kube-api-access-5lx6l\") pod \"nova-metadata-0\" (UID: \"c57ba59a-8492-439e-8383-89298aa3c6ed\") " pod="openstack/nova-metadata-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.272115 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft7xk\" (UniqueName: \"kubernetes.io/projected/b6ba00e8-4494-44a7-9042-8286d9a0d6b1-kube-api-access-ft7xk\") pod \"nova-scheduler-0\" (UID: \"b6ba00e8-4494-44a7-9042-8286d9a0d6b1\") " pod="openstack/nova-scheduler-0" Dec 03 14:38:02 crc kubenswrapper[4751]: 
I1203 14:38:02.272145 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6ba00e8-4494-44a7-9042-8286d9a0d6b1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b6ba00e8-4494-44a7-9042-8286d9a0d6b1\") " pod="openstack/nova-scheduler-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.272162 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6ba00e8-4494-44a7-9042-8286d9a0d6b1-config-data\") pod \"nova-scheduler-0\" (UID: \"b6ba00e8-4494-44a7-9042-8286d9a0d6b1\") " pod="openstack/nova-scheduler-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.272197 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a4ea42-438e-40ad-b3fa-71db5808ff98-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"14a4ea42-438e-40ad-b3fa-71db5808ff98\") " pod="openstack/nova-api-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.274031 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.275399 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.277746 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c57ba59a-8492-439e-8383-89298aa3c6ed-logs\") pod \"nova-metadata-0\" (UID: \"c57ba59a-8492-439e-8383-89298aa3c6ed\") " pod="openstack/nova-metadata-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.279052 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14a4ea42-438e-40ad-b3fa-71db5808ff98-logs\") pod \"nova-api-0\" (UID: \"14a4ea42-438e-40ad-b3fa-71db5808ff98\") " pod="openstack/nova-api-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.280656 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.283611 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a4ea42-438e-40ad-b3fa-71db5808ff98-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"14a4ea42-438e-40ad-b3fa-71db5808ff98\") " pod="openstack/nova-api-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.305557 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a4ea42-438e-40ad-b3fa-71db5808ff98-config-data\") pod \"nova-api-0\" (UID: \"14a4ea42-438e-40ad-b3fa-71db5808ff98\") " pod="openstack/nova-api-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.322378 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.336316 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v4lg\" (UniqueName: 
\"kubernetes.io/projected/14a4ea42-438e-40ad-b3fa-71db5808ff98-kube-api-access-2v4lg\") pod \"nova-api-0\" (UID: \"14a4ea42-438e-40ad-b3fa-71db5808ff98\") " pod="openstack/nova-api-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.379573 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cff73d8-de68-4c3c-9784-15f5ac91acae-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2cff73d8-de68-4c3c-9784-15f5ac91acae\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.379914 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c57ba59a-8492-439e-8383-89298aa3c6ed-config-data\") pod \"nova-metadata-0\" (UID: \"c57ba59a-8492-439e-8383-89298aa3c6ed\") " pod="openstack/nova-metadata-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.379968 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c57ba59a-8492-439e-8383-89298aa3c6ed-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c57ba59a-8492-439e-8383-89298aa3c6ed\") " pod="openstack/nova-metadata-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.379995 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lx6l\" (UniqueName: \"kubernetes.io/projected/c57ba59a-8492-439e-8383-89298aa3c6ed-kube-api-access-5lx6l\") pod \"nova-metadata-0\" (UID: \"c57ba59a-8492-439e-8383-89298aa3c6ed\") " pod="openstack/nova-metadata-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.380059 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft7xk\" (UniqueName: \"kubernetes.io/projected/b6ba00e8-4494-44a7-9042-8286d9a0d6b1-kube-api-access-ft7xk\") pod \"nova-scheduler-0\" (UID: 
\"b6ba00e8-4494-44a7-9042-8286d9a0d6b1\") " pod="openstack/nova-scheduler-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.380087 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6ba00e8-4494-44a7-9042-8286d9a0d6b1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b6ba00e8-4494-44a7-9042-8286d9a0d6b1\") " pod="openstack/nova-scheduler-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.380114 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6ba00e8-4494-44a7-9042-8286d9a0d6b1-config-data\") pod \"nova-scheduler-0\" (UID: \"b6ba00e8-4494-44a7-9042-8286d9a0d6b1\") " pod="openstack/nova-scheduler-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.380181 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c57ba59a-8492-439e-8383-89298aa3c6ed-logs\") pod \"nova-metadata-0\" (UID: \"c57ba59a-8492-439e-8383-89298aa3c6ed\") " pod="openstack/nova-metadata-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.380289 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwlqx\" (UniqueName: \"kubernetes.io/projected/2cff73d8-de68-4c3c-9784-15f5ac91acae-kube-api-access-hwlqx\") pod \"nova-cell1-novncproxy-0\" (UID: \"2cff73d8-de68-4c3c-9784-15f5ac91acae\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.380349 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cff73d8-de68-4c3c-9784-15f5ac91acae-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2cff73d8-de68-4c3c-9784-15f5ac91acae\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 
14:38:02.381685 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c57ba59a-8492-439e-8383-89298aa3c6ed-logs\") pod \"nova-metadata-0\" (UID: \"c57ba59a-8492-439e-8383-89298aa3c6ed\") " pod="openstack/nova-metadata-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.384459 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c57ba59a-8492-439e-8383-89298aa3c6ed-config-data\") pod \"nova-metadata-0\" (UID: \"c57ba59a-8492-439e-8383-89298aa3c6ed\") " pod="openstack/nova-metadata-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.386891 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c57ba59a-8492-439e-8383-89298aa3c6ed-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c57ba59a-8492-439e-8383-89298aa3c6ed\") " pod="openstack/nova-metadata-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.387974 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6ba00e8-4494-44a7-9042-8286d9a0d6b1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b6ba00e8-4494-44a7-9042-8286d9a0d6b1\") " pod="openstack/nova-scheduler-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.397986 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6ba00e8-4494-44a7-9042-8286d9a0d6b1-config-data\") pod \"nova-scheduler-0\" (UID: \"b6ba00e8-4494-44a7-9042-8286d9a0d6b1\") " pod="openstack/nova-scheduler-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.405062 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lx6l\" (UniqueName: \"kubernetes.io/projected/c57ba59a-8492-439e-8383-89298aa3c6ed-kube-api-access-5lx6l\") pod \"nova-metadata-0\" (UID: 
\"c57ba59a-8492-439e-8383-89298aa3c6ed\") " pod="openstack/nova-metadata-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.405582 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft7xk\" (UniqueName: \"kubernetes.io/projected/b6ba00e8-4494-44a7-9042-8286d9a0d6b1-kube-api-access-ft7xk\") pod \"nova-scheduler-0\" (UID: \"b6ba00e8-4494-44a7-9042-8286d9a0d6b1\") " pod="openstack/nova-scheduler-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.411304 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.445723 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-884c8b8f5-5szns"] Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.471843 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-884c8b8f5-5szns"] Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.471962 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-884c8b8f5-5szns" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.491637 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cff73d8-de68-4c3c-9784-15f5ac91acae-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2cff73d8-de68-4c3c-9784-15f5ac91acae\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.491927 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwlqx\" (UniqueName: \"kubernetes.io/projected/2cff73d8-de68-4c3c-9784-15f5ac91acae-kube-api-access-hwlqx\") pod \"nova-cell1-novncproxy-0\" (UID: \"2cff73d8-de68-4c3c-9784-15f5ac91acae\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.491978 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cff73d8-de68-4c3c-9784-15f5ac91acae-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2cff73d8-de68-4c3c-9784-15f5ac91acae\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.510442 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cff73d8-de68-4c3c-9784-15f5ac91acae-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2cff73d8-de68-4c3c-9784-15f5ac91acae\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.522083 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwlqx\" (UniqueName: \"kubernetes.io/projected/2cff73d8-de68-4c3c-9784-15f5ac91acae-kube-api-access-hwlqx\") pod \"nova-cell1-novncproxy-0\" (UID: \"2cff73d8-de68-4c3c-9784-15f5ac91acae\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:02 crc 
kubenswrapper[4751]: I1203 14:38:02.537077 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cff73d8-de68-4c3c-9784-15f5ac91acae-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2cff73d8-de68-4c3c-9784-15f5ac91acae\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.577932 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.598758 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/267c2a24-af2e-48c4-9101-fbf9bba26e67-ovsdbserver-sb\") pod \"dnsmasq-dns-884c8b8f5-5szns\" (UID: \"267c2a24-af2e-48c4-9101-fbf9bba26e67\") " pod="openstack/dnsmasq-dns-884c8b8f5-5szns" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.598841 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/267c2a24-af2e-48c4-9101-fbf9bba26e67-dns-swift-storage-0\") pod \"dnsmasq-dns-884c8b8f5-5szns\" (UID: \"267c2a24-af2e-48c4-9101-fbf9bba26e67\") " pod="openstack/dnsmasq-dns-884c8b8f5-5szns" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.598947 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/267c2a24-af2e-48c4-9101-fbf9bba26e67-config\") pod \"dnsmasq-dns-884c8b8f5-5szns\" (UID: \"267c2a24-af2e-48c4-9101-fbf9bba26e67\") " pod="openstack/dnsmasq-dns-884c8b8f5-5szns" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.599008 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/267c2a24-af2e-48c4-9101-fbf9bba26e67-ovsdbserver-nb\") pod \"dnsmasq-dns-884c8b8f5-5szns\" (UID: \"267c2a24-af2e-48c4-9101-fbf9bba26e67\") " pod="openstack/dnsmasq-dns-884c8b8f5-5szns" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.599065 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqt4t\" (UniqueName: \"kubernetes.io/projected/267c2a24-af2e-48c4-9101-fbf9bba26e67-kube-api-access-zqt4t\") pod \"dnsmasq-dns-884c8b8f5-5szns\" (UID: \"267c2a24-af2e-48c4-9101-fbf9bba26e67\") " pod="openstack/dnsmasq-dns-884c8b8f5-5szns" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.599092 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/267c2a24-af2e-48c4-9101-fbf9bba26e67-dns-svc\") pod \"dnsmasq-dns-884c8b8f5-5szns\" (UID: \"267c2a24-af2e-48c4-9101-fbf9bba26e67\") " pod="openstack/dnsmasq-dns-884c8b8f5-5szns" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.657075 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.702132 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.703910 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/267c2a24-af2e-48c4-9101-fbf9bba26e67-config\") pod \"dnsmasq-dns-884c8b8f5-5szns\" (UID: \"267c2a24-af2e-48c4-9101-fbf9bba26e67\") " pod="openstack/dnsmasq-dns-884c8b8f5-5szns" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.703977 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/267c2a24-af2e-48c4-9101-fbf9bba26e67-ovsdbserver-nb\") pod \"dnsmasq-dns-884c8b8f5-5szns\" (UID: \"267c2a24-af2e-48c4-9101-fbf9bba26e67\") " pod="openstack/dnsmasq-dns-884c8b8f5-5szns" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.704022 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqt4t\" (UniqueName: \"kubernetes.io/projected/267c2a24-af2e-48c4-9101-fbf9bba26e67-kube-api-access-zqt4t\") pod \"dnsmasq-dns-884c8b8f5-5szns\" (UID: \"267c2a24-af2e-48c4-9101-fbf9bba26e67\") " pod="openstack/dnsmasq-dns-884c8b8f5-5szns" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.704042 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/267c2a24-af2e-48c4-9101-fbf9bba26e67-dns-svc\") pod \"dnsmasq-dns-884c8b8f5-5szns\" (UID: \"267c2a24-af2e-48c4-9101-fbf9bba26e67\") " pod="openstack/dnsmasq-dns-884c8b8f5-5szns" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.704114 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/267c2a24-af2e-48c4-9101-fbf9bba26e67-ovsdbserver-sb\") pod \"dnsmasq-dns-884c8b8f5-5szns\" (UID: \"267c2a24-af2e-48c4-9101-fbf9bba26e67\") " pod="openstack/dnsmasq-dns-884c8b8f5-5szns" Dec 03 14:38:02 
crc kubenswrapper[4751]: I1203 14:38:02.704157 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/267c2a24-af2e-48c4-9101-fbf9bba26e67-dns-swift-storage-0\") pod \"dnsmasq-dns-884c8b8f5-5szns\" (UID: \"267c2a24-af2e-48c4-9101-fbf9bba26e67\") " pod="openstack/dnsmasq-dns-884c8b8f5-5szns" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.705695 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/267c2a24-af2e-48c4-9101-fbf9bba26e67-ovsdbserver-nb\") pod \"dnsmasq-dns-884c8b8f5-5szns\" (UID: \"267c2a24-af2e-48c4-9101-fbf9bba26e67\") " pod="openstack/dnsmasq-dns-884c8b8f5-5szns" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.705794 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/267c2a24-af2e-48c4-9101-fbf9bba26e67-dns-svc\") pod \"dnsmasq-dns-884c8b8f5-5szns\" (UID: \"267c2a24-af2e-48c4-9101-fbf9bba26e67\") " pod="openstack/dnsmasq-dns-884c8b8f5-5szns" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.706258 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/267c2a24-af2e-48c4-9101-fbf9bba26e67-config\") pod \"dnsmasq-dns-884c8b8f5-5szns\" (UID: \"267c2a24-af2e-48c4-9101-fbf9bba26e67\") " pod="openstack/dnsmasq-dns-884c8b8f5-5szns" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.706886 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/267c2a24-af2e-48c4-9101-fbf9bba26e67-ovsdbserver-sb\") pod \"dnsmasq-dns-884c8b8f5-5szns\" (UID: \"267c2a24-af2e-48c4-9101-fbf9bba26e67\") " pod="openstack/dnsmasq-dns-884c8b8f5-5szns" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.707350 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/267c2a24-af2e-48c4-9101-fbf9bba26e67-dns-swift-storage-0\") pod \"dnsmasq-dns-884c8b8f5-5szns\" (UID: \"267c2a24-af2e-48c4-9101-fbf9bba26e67\") " pod="openstack/dnsmasq-dns-884c8b8f5-5szns" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.753991 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqt4t\" (UniqueName: \"kubernetes.io/projected/267c2a24-af2e-48c4-9101-fbf9bba26e67-kube-api-access-zqt4t\") pod \"dnsmasq-dns-884c8b8f5-5szns\" (UID: \"267c2a24-af2e-48c4-9101-fbf9bba26e67\") " pod="openstack/dnsmasq-dns-884c8b8f5-5szns" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.842935 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-ccbpw"] Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.854352 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-884c8b8f5-5szns" Dec 03 14:38:02 crc kubenswrapper[4751]: I1203 14:38:02.986834 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ccbpw" event={"ID":"57bcc78c-1540-47ee-82f6-664aff4f6216","Type":"ContainerStarted","Data":"2e815fbb08a821b6c1eb56d0aec40a285817f6be3e6d05d48107740f171b9c94"} Dec 03 14:38:03 crc kubenswrapper[4751]: I1203 14:38:03.033260 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 14:38:03 crc kubenswrapper[4751]: W1203 14:38:03.112073 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14a4ea42_438e_40ad_b3fa_71db5808ff98.slice/crio-ae2a1b3ae3412712ba16a8daf8111d6c65de92b252ea83a58595c2001d621b2d WatchSource:0}: Error finding container ae2a1b3ae3412712ba16a8daf8111d6c65de92b252ea83a58595c2001d621b2d: Status 404 returned error can't find the container with id 
ae2a1b3ae3412712ba16a8daf8111d6c65de92b252ea83a58595c2001d621b2d Dec 03 14:38:03 crc kubenswrapper[4751]: I1203 14:38:03.138737 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lhljr"] Dec 03 14:38:03 crc kubenswrapper[4751]: I1203 14:38:03.140393 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-lhljr" Dec 03 14:38:03 crc kubenswrapper[4751]: I1203 14:38:03.149559 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 03 14:38:03 crc kubenswrapper[4751]: I1203 14:38:03.149688 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 03 14:38:03 crc kubenswrapper[4751]: I1203 14:38:03.155491 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lhljr"] Dec 03 14:38:03 crc kubenswrapper[4751]: I1203 14:38:03.220776 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/528661c5-7b80-48d3-b8fd-7d20c23932f7-config-data\") pod \"nova-cell1-conductor-db-sync-lhljr\" (UID: \"528661c5-7b80-48d3-b8fd-7d20c23932f7\") " pod="openstack/nova-cell1-conductor-db-sync-lhljr" Dec 03 14:38:03 crc kubenswrapper[4751]: I1203 14:38:03.220861 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwqrq\" (UniqueName: \"kubernetes.io/projected/528661c5-7b80-48d3-b8fd-7d20c23932f7-kube-api-access-nwqrq\") pod \"nova-cell1-conductor-db-sync-lhljr\" (UID: \"528661c5-7b80-48d3-b8fd-7d20c23932f7\") " pod="openstack/nova-cell1-conductor-db-sync-lhljr" Dec 03 14:38:03 crc kubenswrapper[4751]: I1203 14:38:03.220888 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/528661c5-7b80-48d3-b8fd-7d20c23932f7-scripts\") pod \"nova-cell1-conductor-db-sync-lhljr\" (UID: \"528661c5-7b80-48d3-b8fd-7d20c23932f7\") " pod="openstack/nova-cell1-conductor-db-sync-lhljr" Dec 03 14:38:03 crc kubenswrapper[4751]: I1203 14:38:03.221517 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/528661c5-7b80-48d3-b8fd-7d20c23932f7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-lhljr\" (UID: \"528661c5-7b80-48d3-b8fd-7d20c23932f7\") " pod="openstack/nova-cell1-conductor-db-sync-lhljr" Dec 03 14:38:03 crc kubenswrapper[4751]: I1203 14:38:03.259724 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 14:38:03 crc kubenswrapper[4751]: I1203 14:38:03.323213 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/528661c5-7b80-48d3-b8fd-7d20c23932f7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-lhljr\" (UID: \"528661c5-7b80-48d3-b8fd-7d20c23932f7\") " pod="openstack/nova-cell1-conductor-db-sync-lhljr" Dec 03 14:38:03 crc kubenswrapper[4751]: I1203 14:38:03.323303 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/528661c5-7b80-48d3-b8fd-7d20c23932f7-config-data\") pod \"nova-cell1-conductor-db-sync-lhljr\" (UID: \"528661c5-7b80-48d3-b8fd-7d20c23932f7\") " pod="openstack/nova-cell1-conductor-db-sync-lhljr" Dec 03 14:38:03 crc kubenswrapper[4751]: I1203 14:38:03.323392 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwqrq\" (UniqueName: \"kubernetes.io/projected/528661c5-7b80-48d3-b8fd-7d20c23932f7-kube-api-access-nwqrq\") pod \"nova-cell1-conductor-db-sync-lhljr\" (UID: \"528661c5-7b80-48d3-b8fd-7d20c23932f7\") " 
pod="openstack/nova-cell1-conductor-db-sync-lhljr" Dec 03 14:38:03 crc kubenswrapper[4751]: I1203 14:38:03.323431 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/528661c5-7b80-48d3-b8fd-7d20c23932f7-scripts\") pod \"nova-cell1-conductor-db-sync-lhljr\" (UID: \"528661c5-7b80-48d3-b8fd-7d20c23932f7\") " pod="openstack/nova-cell1-conductor-db-sync-lhljr" Dec 03 14:38:03 crc kubenswrapper[4751]: I1203 14:38:03.332149 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/528661c5-7b80-48d3-b8fd-7d20c23932f7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-lhljr\" (UID: \"528661c5-7b80-48d3-b8fd-7d20c23932f7\") " pod="openstack/nova-cell1-conductor-db-sync-lhljr" Dec 03 14:38:03 crc kubenswrapper[4751]: I1203 14:38:03.344885 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/528661c5-7b80-48d3-b8fd-7d20c23932f7-scripts\") pod \"nova-cell1-conductor-db-sync-lhljr\" (UID: \"528661c5-7b80-48d3-b8fd-7d20c23932f7\") " pod="openstack/nova-cell1-conductor-db-sync-lhljr" Dec 03 14:38:03 crc kubenswrapper[4751]: I1203 14:38:03.358193 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/528661c5-7b80-48d3-b8fd-7d20c23932f7-config-data\") pod \"nova-cell1-conductor-db-sync-lhljr\" (UID: \"528661c5-7b80-48d3-b8fd-7d20c23932f7\") " pod="openstack/nova-cell1-conductor-db-sync-lhljr" Dec 03 14:38:03 crc kubenswrapper[4751]: I1203 14:38:03.370257 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwqrq\" (UniqueName: \"kubernetes.io/projected/528661c5-7b80-48d3-b8fd-7d20c23932f7-kube-api-access-nwqrq\") pod \"nova-cell1-conductor-db-sync-lhljr\" (UID: \"528661c5-7b80-48d3-b8fd-7d20c23932f7\") " pod="openstack/nova-cell1-conductor-db-sync-lhljr" 
Dec 03 14:38:03 crc kubenswrapper[4751]: I1203 14:38:03.513014 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 14:38:03 crc kubenswrapper[4751]: W1203 14:38:03.537758 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc57ba59a_8492_439e_8383_89298aa3c6ed.slice/crio-7acf59737fd22a2cc5d98a1f7beb16c08599f77b00af1efa346b282ef9da99ad WatchSource:0}: Error finding container 7acf59737fd22a2cc5d98a1f7beb16c08599f77b00af1efa346b282ef9da99ad: Status 404 returned error can't find the container with id 7acf59737fd22a2cc5d98a1f7beb16c08599f77b00af1efa346b282ef9da99ad Dec 03 14:38:03 crc kubenswrapper[4751]: I1203 14:38:03.537802 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 14:38:03 crc kubenswrapper[4751]: I1203 14:38:03.558840 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-lhljr" Dec 03 14:38:03 crc kubenswrapper[4751]: I1203 14:38:03.770795 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-884c8b8f5-5szns"] Dec 03 14:38:04 crc kubenswrapper[4751]: I1203 14:38:04.009663 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2cff73d8-de68-4c3c-9784-15f5ac91acae","Type":"ContainerStarted","Data":"cec2e953087a184ce1c1099a530e60847655fb52a2cae5f8caebd10a4b253728"} Dec 03 14:38:04 crc kubenswrapper[4751]: I1203 14:38:04.015397 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b6ba00e8-4494-44a7-9042-8286d9a0d6b1","Type":"ContainerStarted","Data":"635eea8707f0a4ad6419b5c5fd556faa1b3f15de6a26202427370809f230d6c7"} Dec 03 14:38:04 crc kubenswrapper[4751]: I1203 14:38:04.022935 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ccbpw" 
event={"ID":"57bcc78c-1540-47ee-82f6-664aff4f6216","Type":"ContainerStarted","Data":"1f0e9aff22382cad550073805ca2c207dcafb0ce675cc850c7b00be3dd9b6a89"} Dec 03 14:38:04 crc kubenswrapper[4751]: I1203 14:38:04.057004 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c57ba59a-8492-439e-8383-89298aa3c6ed","Type":"ContainerStarted","Data":"7acf59737fd22a2cc5d98a1f7beb16c08599f77b00af1efa346b282ef9da99ad"} Dec 03 14:38:04 crc kubenswrapper[4751]: I1203 14:38:04.064477 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-ccbpw" podStartSLOduration=3.064453555 podStartE2EDuration="3.064453555s" podCreationTimestamp="2025-12-03 14:38:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:38:04.048991855 +0000 UTC m=+1491.037347072" watchObservedRunningTime="2025-12-03 14:38:04.064453555 +0000 UTC m=+1491.052808772" Dec 03 14:38:04 crc kubenswrapper[4751]: I1203 14:38:04.067138 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-884c8b8f5-5szns" event={"ID":"267c2a24-af2e-48c4-9101-fbf9bba26e67","Type":"ContainerStarted","Data":"0ab0346052c754a8dbdec9fba3c53bd576e0b0035e2dc135e6b034f7beb35d92"} Dec 03 14:38:04 crc kubenswrapper[4751]: I1203 14:38:04.070203 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"14a4ea42-438e-40ad-b3fa-71db5808ff98","Type":"ContainerStarted","Data":"ae2a1b3ae3412712ba16a8daf8111d6c65de92b252ea83a58595c2001d621b2d"} Dec 03 14:38:04 crc kubenswrapper[4751]: I1203 14:38:04.120293 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lhljr"] Dec 03 14:38:04 crc kubenswrapper[4751]: W1203 14:38:04.124182 4751 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod528661c5_7b80_48d3_b8fd_7d20c23932f7.slice/crio-972159480fb5a96a65b66f2d0f274f39ecb1cd5ee2297c706fe71bbcedde13bb WatchSource:0}: Error finding container 972159480fb5a96a65b66f2d0f274f39ecb1cd5ee2297c706fe71bbcedde13bb: Status 404 returned error can't find the container with id 972159480fb5a96a65b66f2d0f274f39ecb1cd5ee2297c706fe71bbcedde13bb Dec 03 14:38:05 crc kubenswrapper[4751]: I1203 14:38:05.087033 4751 generic.go:334] "Generic (PLEG): container finished" podID="267c2a24-af2e-48c4-9101-fbf9bba26e67" containerID="5254aef6d7a3c7ec10fed09dcced6a5e88f35fb97688379f002044ccb3d7cc69" exitCode=0 Dec 03 14:38:05 crc kubenswrapper[4751]: I1203 14:38:05.087591 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-884c8b8f5-5szns" event={"ID":"267c2a24-af2e-48c4-9101-fbf9bba26e67","Type":"ContainerDied","Data":"5254aef6d7a3c7ec10fed09dcced6a5e88f35fb97688379f002044ccb3d7cc69"} Dec 03 14:38:05 crc kubenswrapper[4751]: I1203 14:38:05.105575 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-lhljr" event={"ID":"528661c5-7b80-48d3-b8fd-7d20c23932f7","Type":"ContainerStarted","Data":"74931be11a9b42eb7d9b58dba9ad8007cc1df42f646f017355650aac7bedb0ba"} Dec 03 14:38:05 crc kubenswrapper[4751]: I1203 14:38:05.105618 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-lhljr" event={"ID":"528661c5-7b80-48d3-b8fd-7d20c23932f7","Type":"ContainerStarted","Data":"972159480fb5a96a65b66f2d0f274f39ecb1cd5ee2297c706fe71bbcedde13bb"} Dec 03 14:38:05 crc kubenswrapper[4751]: I1203 14:38:05.134119 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-lhljr" podStartSLOduration=2.134097915 podStartE2EDuration="2.134097915s" podCreationTimestamp="2025-12-03 14:38:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:38:05.126321729 +0000 UTC m=+1492.114676956" watchObservedRunningTime="2025-12-03 14:38:05.134097915 +0000 UTC m=+1492.122453132" Dec 03 14:38:05 crc kubenswrapper[4751]: I1203 14:38:05.820233 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:38:05 crc kubenswrapper[4751]: I1203 14:38:05.820683 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:38:05 crc kubenswrapper[4751]: I1203 14:38:05.820846 4751 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-djf67" Dec 03 14:38:05 crc kubenswrapper[4751]: I1203 14:38:05.822865 4751 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8513ef227e39ef06a8d05cad17c9635fc3ec8cf5ec5acd20288a621754b77ca6"} pod="openshift-machine-config-operator/machine-config-daemon-djf67" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 14:38:05 crc kubenswrapper[4751]: I1203 14:38:05.823023 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" containerID="cri-o://8513ef227e39ef06a8d05cad17c9635fc3ec8cf5ec5acd20288a621754b77ca6" gracePeriod=600 
Dec 03 14:38:06 crc kubenswrapper[4751]: I1203 14:38:06.000766 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 14:38:06 crc kubenswrapper[4751]: I1203 14:38:06.015312 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 14:38:06 crc kubenswrapper[4751]: I1203 14:38:06.127093 4751 generic.go:334] "Generic (PLEG): container finished" podID="385620eb-744d-423e-b02b-1274f3075689" containerID="8513ef227e39ef06a8d05cad17c9635fc3ec8cf5ec5acd20288a621754b77ca6" exitCode=0 Dec 03 14:38:06 crc kubenswrapper[4751]: I1203 14:38:06.128824 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerDied","Data":"8513ef227e39ef06a8d05cad17c9635fc3ec8cf5ec5acd20288a621754b77ca6"} Dec 03 14:38:06 crc kubenswrapper[4751]: I1203 14:38:06.128866 4751 scope.go:117] "RemoveContainer" containerID="013b499465da11b00f7b510304fcaff215703026384eae17787f3651933e4e4f" Dec 03 14:38:09 crc kubenswrapper[4751]: I1203 14:38:09.178861 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2cff73d8-de68-4c3c-9784-15f5ac91acae","Type":"ContainerStarted","Data":"e88e7315f3ff98ccb6ed19c00cefc421079d511b3d2d93213c0529db0ab18883"} Dec 03 14:38:09 crc kubenswrapper[4751]: I1203 14:38:09.179709 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="2cff73d8-de68-4c3c-9784-15f5ac91acae" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://e88e7315f3ff98ccb6ed19c00cefc421079d511b3d2d93213c0529db0ab18883" gracePeriod=30 Dec 03 14:38:09 crc kubenswrapper[4751]: I1203 14:38:09.184718 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"b6ba00e8-4494-44a7-9042-8286d9a0d6b1","Type":"ContainerStarted","Data":"f1150d0b4a2de00be8031fc5ac97d6bf2fd2427c1e564a0fcf8018d0f3bcc837"} Dec 03 14:38:09 crc kubenswrapper[4751]: I1203 14:38:09.193247 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c57ba59a-8492-439e-8383-89298aa3c6ed","Type":"ContainerStarted","Data":"ff9949487992251806468d6ba9a06a297ca6f9b4e82778c1a4c2823d4f03734f"} Dec 03 14:38:09 crc kubenswrapper[4751]: I1203 14:38:09.200959 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.344273188 podStartE2EDuration="7.200939991s" podCreationTimestamp="2025-12-03 14:38:02 +0000 UTC" firstStartedPulling="2025-12-03 14:38:03.50887619 +0000 UTC m=+1490.497231407" lastFinishedPulling="2025-12-03 14:38:08.365542993 +0000 UTC m=+1495.353898210" observedRunningTime="2025-12-03 14:38:09.196557465 +0000 UTC m=+1496.184912672" watchObservedRunningTime="2025-12-03 14:38:09.200939991 +0000 UTC m=+1496.189295208" Dec 03 14:38:09 crc kubenswrapper[4751]: I1203 14:38:09.203108 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerStarted","Data":"7d4fc1d83b8c3695ce824cacf39e20506d412e0d667bfed74287b81a6f8f3e45"} Dec 03 14:38:09 crc kubenswrapper[4751]: I1203 14:38:09.208251 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-884c8b8f5-5szns" event={"ID":"267c2a24-af2e-48c4-9101-fbf9bba26e67","Type":"ContainerStarted","Data":"a73accb674ecb494bc063d0cd0b4ea9e5abbb9f44b3344cc9a13110378ba6e51"} Dec 03 14:38:09 crc kubenswrapper[4751]: I1203 14:38:09.208472 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-884c8b8f5-5szns" Dec 03 14:38:09 crc kubenswrapper[4751]: I1203 14:38:09.209865 4751 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"14a4ea42-438e-40ad-b3fa-71db5808ff98","Type":"ContainerStarted","Data":"36eb2c88ba5c1064fa1bb53585e7e861e35b0aa036ef84a6105a593a6f481952"} Dec 03 14:38:09 crc kubenswrapper[4751]: I1203 14:38:09.258258 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.165461715 podStartE2EDuration="7.258238671s" podCreationTimestamp="2025-12-03 14:38:02 +0000 UTC" firstStartedPulling="2025-12-03 14:38:03.256990889 +0000 UTC m=+1490.245346106" lastFinishedPulling="2025-12-03 14:38:08.349767845 +0000 UTC m=+1495.338123062" observedRunningTime="2025-12-03 14:38:09.237755607 +0000 UTC m=+1496.226110824" watchObservedRunningTime="2025-12-03 14:38:09.258238671 +0000 UTC m=+1496.246593888" Dec 03 14:38:09 crc kubenswrapper[4751]: I1203 14:38:09.276280 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-884c8b8f5-5szns" podStartSLOduration=7.276255169 podStartE2EDuration="7.276255169s" podCreationTimestamp="2025-12-03 14:38:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:38:09.272387916 +0000 UTC m=+1496.260743143" watchObservedRunningTime="2025-12-03 14:38:09.276255169 +0000 UTC m=+1496.264610386" Dec 03 14:38:10 crc kubenswrapper[4751]: I1203 14:38:10.220402 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c57ba59a-8492-439e-8383-89298aa3c6ed","Type":"ContainerStarted","Data":"9363faffe35965fa771db4038c4d40012319abae43c708754565e71efa670248"} Dec 03 14:38:10 crc kubenswrapper[4751]: I1203 14:38:10.220817 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c57ba59a-8492-439e-8383-89298aa3c6ed" containerName="nova-metadata-log" 
containerID="cri-o://ff9949487992251806468d6ba9a06a297ca6f9b4e82778c1a4c2823d4f03734f" gracePeriod=30 Dec 03 14:38:10 crc kubenswrapper[4751]: I1203 14:38:10.221273 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c57ba59a-8492-439e-8383-89298aa3c6ed" containerName="nova-metadata-metadata" containerID="cri-o://9363faffe35965fa771db4038c4d40012319abae43c708754565e71efa670248" gracePeriod=30 Dec 03 14:38:10 crc kubenswrapper[4751]: I1203 14:38:10.225442 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"14a4ea42-438e-40ad-b3fa-71db5808ff98","Type":"ContainerStarted","Data":"fb6dc7ade032c94c94351904e8badbdf803c28db6fd66fb458b2afb68038c023"} Dec 03 14:38:10 crc kubenswrapper[4751]: I1203 14:38:10.262577 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.450777336 podStartE2EDuration="8.262553139s" podCreationTimestamp="2025-12-03 14:38:02 +0000 UTC" firstStartedPulling="2025-12-03 14:38:03.565485411 +0000 UTC m=+1490.553840628" lastFinishedPulling="2025-12-03 14:38:08.377261204 +0000 UTC m=+1495.365616431" observedRunningTime="2025-12-03 14:38:10.253272973 +0000 UTC m=+1497.241628190" watchObservedRunningTime="2025-12-03 14:38:10.262553139 +0000 UTC m=+1497.250908356" Dec 03 14:38:10 crc kubenswrapper[4751]: I1203 14:38:10.284676 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.042464306 podStartE2EDuration="9.284649745s" podCreationTimestamp="2025-12-03 14:38:01 +0000 UTC" firstStartedPulling="2025-12-03 14:38:03.134102949 +0000 UTC m=+1490.122458166" lastFinishedPulling="2025-12-03 14:38:08.376288388 +0000 UTC m=+1495.364643605" observedRunningTime="2025-12-03 14:38:10.271778373 +0000 UTC m=+1497.260133590" watchObservedRunningTime="2025-12-03 14:38:10.284649745 +0000 UTC m=+1497.273004962" Dec 03 14:38:10 crc 
kubenswrapper[4751]: I1203 14:38:10.898269 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 14:38:10 crc kubenswrapper[4751]: I1203 14:38:10.948019 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c57ba59a-8492-439e-8383-89298aa3c6ed-config-data\") pod \"c57ba59a-8492-439e-8383-89298aa3c6ed\" (UID: \"c57ba59a-8492-439e-8383-89298aa3c6ed\") " Dec 03 14:38:10 crc kubenswrapper[4751]: I1203 14:38:10.948604 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lx6l\" (UniqueName: \"kubernetes.io/projected/c57ba59a-8492-439e-8383-89298aa3c6ed-kube-api-access-5lx6l\") pod \"c57ba59a-8492-439e-8383-89298aa3c6ed\" (UID: \"c57ba59a-8492-439e-8383-89298aa3c6ed\") " Dec 03 14:38:10 crc kubenswrapper[4751]: I1203 14:38:10.948697 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c57ba59a-8492-439e-8383-89298aa3c6ed-combined-ca-bundle\") pod \"c57ba59a-8492-439e-8383-89298aa3c6ed\" (UID: \"c57ba59a-8492-439e-8383-89298aa3c6ed\") " Dec 03 14:38:10 crc kubenswrapper[4751]: I1203 14:38:10.948775 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c57ba59a-8492-439e-8383-89298aa3c6ed-logs\") pod \"c57ba59a-8492-439e-8383-89298aa3c6ed\" (UID: \"c57ba59a-8492-439e-8383-89298aa3c6ed\") " Dec 03 14:38:10 crc kubenswrapper[4751]: I1203 14:38:10.949158 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c57ba59a-8492-439e-8383-89298aa3c6ed-logs" (OuterVolumeSpecName: "logs") pod "c57ba59a-8492-439e-8383-89298aa3c6ed" (UID: "c57ba59a-8492-439e-8383-89298aa3c6ed"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:38:10 crc kubenswrapper[4751]: I1203 14:38:10.954630 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c57ba59a-8492-439e-8383-89298aa3c6ed-kube-api-access-5lx6l" (OuterVolumeSpecName: "kube-api-access-5lx6l") pod "c57ba59a-8492-439e-8383-89298aa3c6ed" (UID: "c57ba59a-8492-439e-8383-89298aa3c6ed"). InnerVolumeSpecName "kube-api-access-5lx6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:38:10 crc kubenswrapper[4751]: I1203 14:38:10.985485 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c57ba59a-8492-439e-8383-89298aa3c6ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c57ba59a-8492-439e-8383-89298aa3c6ed" (UID: "c57ba59a-8492-439e-8383-89298aa3c6ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.013494 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c57ba59a-8492-439e-8383-89298aa3c6ed-config-data" (OuterVolumeSpecName: "config-data") pod "c57ba59a-8492-439e-8383-89298aa3c6ed" (UID: "c57ba59a-8492-439e-8383-89298aa3c6ed"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.050685 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c57ba59a-8492-439e-8383-89298aa3c6ed-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.050730 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lx6l\" (UniqueName: \"kubernetes.io/projected/c57ba59a-8492-439e-8383-89298aa3c6ed-kube-api-access-5lx6l\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.050763 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c57ba59a-8492-439e-8383-89298aa3c6ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.050773 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c57ba59a-8492-439e-8383-89298aa3c6ed-logs\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.237135 4751 generic.go:334] "Generic (PLEG): container finished" podID="c57ba59a-8492-439e-8383-89298aa3c6ed" containerID="9363faffe35965fa771db4038c4d40012319abae43c708754565e71efa670248" exitCode=0 Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.237173 4751 generic.go:334] "Generic (PLEG): container finished" podID="c57ba59a-8492-439e-8383-89298aa3c6ed" containerID="ff9949487992251806468d6ba9a06a297ca6f9b4e82778c1a4c2823d4f03734f" exitCode=143 Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.237182 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c57ba59a-8492-439e-8383-89298aa3c6ed","Type":"ContainerDied","Data":"9363faffe35965fa771db4038c4d40012319abae43c708754565e71efa670248"} Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.237229 4751 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c57ba59a-8492-439e-8383-89298aa3c6ed","Type":"ContainerDied","Data":"ff9949487992251806468d6ba9a06a297ca6f9b4e82778c1a4c2823d4f03734f"} Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.237241 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c57ba59a-8492-439e-8383-89298aa3c6ed","Type":"ContainerDied","Data":"7acf59737fd22a2cc5d98a1f7beb16c08599f77b00af1efa346b282ef9da99ad"} Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.237236 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.237260 4751 scope.go:117] "RemoveContainer" containerID="9363faffe35965fa771db4038c4d40012319abae43c708754565e71efa670248" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.277079 4751 scope.go:117] "RemoveContainer" containerID="ff9949487992251806468d6ba9a06a297ca6f9b4e82778c1a4c2823d4f03734f" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.280742 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.292503 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.300662 4751 scope.go:117] "RemoveContainer" containerID="9363faffe35965fa771db4038c4d40012319abae43c708754565e71efa670248" Dec 03 14:38:11 crc kubenswrapper[4751]: E1203 14:38:11.301202 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9363faffe35965fa771db4038c4d40012319abae43c708754565e71efa670248\": container with ID starting with 9363faffe35965fa771db4038c4d40012319abae43c708754565e71efa670248 not found: ID does not exist" 
containerID="9363faffe35965fa771db4038c4d40012319abae43c708754565e71efa670248" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.301239 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9363faffe35965fa771db4038c4d40012319abae43c708754565e71efa670248"} err="failed to get container status \"9363faffe35965fa771db4038c4d40012319abae43c708754565e71efa670248\": rpc error: code = NotFound desc = could not find container \"9363faffe35965fa771db4038c4d40012319abae43c708754565e71efa670248\": container with ID starting with 9363faffe35965fa771db4038c4d40012319abae43c708754565e71efa670248 not found: ID does not exist" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.301267 4751 scope.go:117] "RemoveContainer" containerID="ff9949487992251806468d6ba9a06a297ca6f9b4e82778c1a4c2823d4f03734f" Dec 03 14:38:11 crc kubenswrapper[4751]: E1203 14:38:11.301620 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff9949487992251806468d6ba9a06a297ca6f9b4e82778c1a4c2823d4f03734f\": container with ID starting with ff9949487992251806468d6ba9a06a297ca6f9b4e82778c1a4c2823d4f03734f not found: ID does not exist" containerID="ff9949487992251806468d6ba9a06a297ca6f9b4e82778c1a4c2823d4f03734f" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.301668 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff9949487992251806468d6ba9a06a297ca6f9b4e82778c1a4c2823d4f03734f"} err="failed to get container status \"ff9949487992251806468d6ba9a06a297ca6f9b4e82778c1a4c2823d4f03734f\": rpc error: code = NotFound desc = could not find container \"ff9949487992251806468d6ba9a06a297ca6f9b4e82778c1a4c2823d4f03734f\": container with ID starting with ff9949487992251806468d6ba9a06a297ca6f9b4e82778c1a4c2823d4f03734f not found: ID does not exist" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.301702 4751 scope.go:117] 
"RemoveContainer" containerID="9363faffe35965fa771db4038c4d40012319abae43c708754565e71efa670248" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.302059 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9363faffe35965fa771db4038c4d40012319abae43c708754565e71efa670248"} err="failed to get container status \"9363faffe35965fa771db4038c4d40012319abae43c708754565e71efa670248\": rpc error: code = NotFound desc = could not find container \"9363faffe35965fa771db4038c4d40012319abae43c708754565e71efa670248\": container with ID starting with 9363faffe35965fa771db4038c4d40012319abae43c708754565e71efa670248 not found: ID does not exist" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.302108 4751 scope.go:117] "RemoveContainer" containerID="ff9949487992251806468d6ba9a06a297ca6f9b4e82778c1a4c2823d4f03734f" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.302431 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff9949487992251806468d6ba9a06a297ca6f9b4e82778c1a4c2823d4f03734f"} err="failed to get container status \"ff9949487992251806468d6ba9a06a297ca6f9b4e82778c1a4c2823d4f03734f\": rpc error: code = NotFound desc = could not find container \"ff9949487992251806468d6ba9a06a297ca6f9b4e82778c1a4c2823d4f03734f\": container with ID starting with ff9949487992251806468d6ba9a06a297ca6f9b4e82778c1a4c2823d4f03734f not found: ID does not exist" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.353953 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c57ba59a-8492-439e-8383-89298aa3c6ed" path="/var/lib/kubelet/pods/c57ba59a-8492-439e-8383-89298aa3c6ed/volumes" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.356581 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 14:38:11 crc kubenswrapper[4751]: E1203 14:38:11.357062 4751 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c57ba59a-8492-439e-8383-89298aa3c6ed" containerName="nova-metadata-metadata" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.357083 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="c57ba59a-8492-439e-8383-89298aa3c6ed" containerName="nova-metadata-metadata" Dec 03 14:38:11 crc kubenswrapper[4751]: E1203 14:38:11.357115 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c57ba59a-8492-439e-8383-89298aa3c6ed" containerName="nova-metadata-log" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.357122 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="c57ba59a-8492-439e-8383-89298aa3c6ed" containerName="nova-metadata-log" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.359551 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="c57ba59a-8492-439e-8383-89298aa3c6ed" containerName="nova-metadata-log" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.359588 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="c57ba59a-8492-439e-8383-89298aa3c6ed" containerName="nova-metadata-metadata" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.363412 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.363575 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.365973 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.366173 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.457859 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7e03c08-ba92-441c-a040-fa1e498c9c4d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f7e03c08-ba92-441c-a040-fa1e498c9c4d\") " pod="openstack/nova-metadata-0" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.457918 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7e03c08-ba92-441c-a040-fa1e498c9c4d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f7e03c08-ba92-441c-a040-fa1e498c9c4d\") " pod="openstack/nova-metadata-0" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.457940 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7e03c08-ba92-441c-a040-fa1e498c9c4d-logs\") pod \"nova-metadata-0\" (UID: \"f7e03c08-ba92-441c-a040-fa1e498c9c4d\") " pod="openstack/nova-metadata-0" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.458086 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7e03c08-ba92-441c-a040-fa1e498c9c4d-config-data\") pod \"nova-metadata-0\" (UID: 
\"f7e03c08-ba92-441c-a040-fa1e498c9c4d\") " pod="openstack/nova-metadata-0" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.458176 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w22t\" (UniqueName: \"kubernetes.io/projected/f7e03c08-ba92-441c-a040-fa1e498c9c4d-kube-api-access-4w22t\") pod \"nova-metadata-0\" (UID: \"f7e03c08-ba92-441c-a040-fa1e498c9c4d\") " pod="openstack/nova-metadata-0" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.560378 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7e03c08-ba92-441c-a040-fa1e498c9c4d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f7e03c08-ba92-441c-a040-fa1e498c9c4d\") " pod="openstack/nova-metadata-0" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.560695 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7e03c08-ba92-441c-a040-fa1e498c9c4d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f7e03c08-ba92-441c-a040-fa1e498c9c4d\") " pod="openstack/nova-metadata-0" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.560719 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7e03c08-ba92-441c-a040-fa1e498c9c4d-logs\") pod \"nova-metadata-0\" (UID: \"f7e03c08-ba92-441c-a040-fa1e498c9c4d\") " pod="openstack/nova-metadata-0" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.560772 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7e03c08-ba92-441c-a040-fa1e498c9c4d-config-data\") pod \"nova-metadata-0\" (UID: \"f7e03c08-ba92-441c-a040-fa1e498c9c4d\") " pod="openstack/nova-metadata-0" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.560807 4751 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w22t\" (UniqueName: \"kubernetes.io/projected/f7e03c08-ba92-441c-a040-fa1e498c9c4d-kube-api-access-4w22t\") pod \"nova-metadata-0\" (UID: \"f7e03c08-ba92-441c-a040-fa1e498c9c4d\") " pod="openstack/nova-metadata-0" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.561489 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7e03c08-ba92-441c-a040-fa1e498c9c4d-logs\") pod \"nova-metadata-0\" (UID: \"f7e03c08-ba92-441c-a040-fa1e498c9c4d\") " pod="openstack/nova-metadata-0" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.565550 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7e03c08-ba92-441c-a040-fa1e498c9c4d-config-data\") pod \"nova-metadata-0\" (UID: \"f7e03c08-ba92-441c-a040-fa1e498c9c4d\") " pod="openstack/nova-metadata-0" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.565589 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7e03c08-ba92-441c-a040-fa1e498c9c4d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f7e03c08-ba92-441c-a040-fa1e498c9c4d\") " pod="openstack/nova-metadata-0" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.566403 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7e03c08-ba92-441c-a040-fa1e498c9c4d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f7e03c08-ba92-441c-a040-fa1e498c9c4d\") " pod="openstack/nova-metadata-0" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.593852 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w22t\" (UniqueName: \"kubernetes.io/projected/f7e03c08-ba92-441c-a040-fa1e498c9c4d-kube-api-access-4w22t\") pod 
\"nova-metadata-0\" (UID: \"f7e03c08-ba92-441c-a040-fa1e498c9c4d\") " pod="openstack/nova-metadata-0" Dec 03 14:38:11 crc kubenswrapper[4751]: I1203 14:38:11.718214 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 14:38:12 crc kubenswrapper[4751]: W1203 14:38:12.207596 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7e03c08_ba92_441c_a040_fa1e498c9c4d.slice/crio-52dd9827ccfbdb92d8d701cd2e2224fce755582db6c9817e1a8a21ee434d5638 WatchSource:0}: Error finding container 52dd9827ccfbdb92d8d701cd2e2224fce755582db6c9817e1a8a21ee434d5638: Status 404 returned error can't find the container with id 52dd9827ccfbdb92d8d701cd2e2224fce755582db6c9817e1a8a21ee434d5638 Dec 03 14:38:12 crc kubenswrapper[4751]: I1203 14:38:12.214429 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 14:38:12 crc kubenswrapper[4751]: I1203 14:38:12.249980 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f7e03c08-ba92-441c-a040-fa1e498c9c4d","Type":"ContainerStarted","Data":"52dd9827ccfbdb92d8d701cd2e2224fce755582db6c9817e1a8a21ee434d5638"} Dec 03 14:38:12 crc kubenswrapper[4751]: I1203 14:38:12.412380 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 14:38:12 crc kubenswrapper[4751]: I1203 14:38:12.412525 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 14:38:12 crc kubenswrapper[4751]: I1203 14:38:12.578921 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 14:38:12 crc kubenswrapper[4751]: I1203 14:38:12.579254 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 14:38:12 crc kubenswrapper[4751]: I1203 14:38:12.614847 
4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 03 14:38:12 crc kubenswrapper[4751]: I1203 14:38:12.703733 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:13 crc kubenswrapper[4751]: I1203 14:38:13.266419 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f7e03c08-ba92-441c-a040-fa1e498c9c4d","Type":"ContainerStarted","Data":"322281b17854e365594fc73234be9680fba78e48a9992ba1e632151d1927ed64"} Dec 03 14:38:13 crc kubenswrapper[4751]: I1203 14:38:13.266774 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f7e03c08-ba92-441c-a040-fa1e498c9c4d","Type":"ContainerStarted","Data":"be3721df0ddb932a059080d38f7e97401a229476a16f03d70b4381c496f0a76f"} Dec 03 14:38:13 crc kubenswrapper[4751]: I1203 14:38:13.269696 4751 generic.go:334] "Generic (PLEG): container finished" podID="57bcc78c-1540-47ee-82f6-664aff4f6216" containerID="1f0e9aff22382cad550073805ca2c207dcafb0ce675cc850c7b00be3dd9b6a89" exitCode=0 Dec 03 14:38:13 crc kubenswrapper[4751]: I1203 14:38:13.269784 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ccbpw" event={"ID":"57bcc78c-1540-47ee-82f6-664aff4f6216","Type":"ContainerDied","Data":"1f0e9aff22382cad550073805ca2c207dcafb0ce675cc850c7b00be3dd9b6a89"} Dec 03 14:38:13 crc kubenswrapper[4751]: I1203 14:38:13.329616 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.329598286 podStartE2EDuration="2.329598286s" podCreationTimestamp="2025-12-03 14:38:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:38:13.321856381 +0000 UTC m=+1500.310211618" watchObservedRunningTime="2025-12-03 14:38:13.329598286 +0000 UTC 
m=+1500.317953503" Dec 03 14:38:13 crc kubenswrapper[4751]: I1203 14:38:13.395488 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 14:38:13 crc kubenswrapper[4751]: I1203 14:38:13.496601 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="14a4ea42-438e-40ad-b3fa-71db5808ff98" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.213:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 14:38:13 crc kubenswrapper[4751]: I1203 14:38:13.496993 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="14a4ea42-438e-40ad-b3fa-71db5808ff98" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.213:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 14:38:14 crc kubenswrapper[4751]: I1203 14:38:14.280389 4751 generic.go:334] "Generic (PLEG): container finished" podID="528661c5-7b80-48d3-b8fd-7d20c23932f7" containerID="74931be11a9b42eb7d9b58dba9ad8007cc1df42f646f017355650aac7bedb0ba" exitCode=0 Dec 03 14:38:14 crc kubenswrapper[4751]: I1203 14:38:14.280472 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-lhljr" event={"ID":"528661c5-7b80-48d3-b8fd-7d20c23932f7","Type":"ContainerDied","Data":"74931be11a9b42eb7d9b58dba9ad8007cc1df42f646f017355650aac7bedb0ba"} Dec 03 14:38:14 crc kubenswrapper[4751]: I1203 14:38:14.779091 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ccbpw" Dec 03 14:38:14 crc kubenswrapper[4751]: I1203 14:38:14.847032 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57bcc78c-1540-47ee-82f6-664aff4f6216-combined-ca-bundle\") pod \"57bcc78c-1540-47ee-82f6-664aff4f6216\" (UID: \"57bcc78c-1540-47ee-82f6-664aff4f6216\") " Dec 03 14:38:14 crc kubenswrapper[4751]: I1203 14:38:14.847396 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57bcc78c-1540-47ee-82f6-664aff4f6216-config-data\") pod \"57bcc78c-1540-47ee-82f6-664aff4f6216\" (UID: \"57bcc78c-1540-47ee-82f6-664aff4f6216\") " Dec 03 14:38:14 crc kubenswrapper[4751]: I1203 14:38:14.847570 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57bcc78c-1540-47ee-82f6-664aff4f6216-scripts\") pod \"57bcc78c-1540-47ee-82f6-664aff4f6216\" (UID: \"57bcc78c-1540-47ee-82f6-664aff4f6216\") " Dec 03 14:38:14 crc kubenswrapper[4751]: I1203 14:38:14.847700 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg2c4\" (UniqueName: \"kubernetes.io/projected/57bcc78c-1540-47ee-82f6-664aff4f6216-kube-api-access-xg2c4\") pod \"57bcc78c-1540-47ee-82f6-664aff4f6216\" (UID: \"57bcc78c-1540-47ee-82f6-664aff4f6216\") " Dec 03 14:38:14 crc kubenswrapper[4751]: I1203 14:38:14.859550 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57bcc78c-1540-47ee-82f6-664aff4f6216-kube-api-access-xg2c4" (OuterVolumeSpecName: "kube-api-access-xg2c4") pod "57bcc78c-1540-47ee-82f6-664aff4f6216" (UID: "57bcc78c-1540-47ee-82f6-664aff4f6216"). InnerVolumeSpecName "kube-api-access-xg2c4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:38:14 crc kubenswrapper[4751]: I1203 14:38:14.865500 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57bcc78c-1540-47ee-82f6-664aff4f6216-scripts" (OuterVolumeSpecName: "scripts") pod "57bcc78c-1540-47ee-82f6-664aff4f6216" (UID: "57bcc78c-1540-47ee-82f6-664aff4f6216"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:14 crc kubenswrapper[4751]: I1203 14:38:14.900069 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57bcc78c-1540-47ee-82f6-664aff4f6216-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57bcc78c-1540-47ee-82f6-664aff4f6216" (UID: "57bcc78c-1540-47ee-82f6-664aff4f6216"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:14 crc kubenswrapper[4751]: I1203 14:38:14.921361 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57bcc78c-1540-47ee-82f6-664aff4f6216-config-data" (OuterVolumeSpecName: "config-data") pod "57bcc78c-1540-47ee-82f6-664aff4f6216" (UID: "57bcc78c-1540-47ee-82f6-664aff4f6216"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:14 crc kubenswrapper[4751]: I1203 14:38:14.964066 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57bcc78c-1540-47ee-82f6-664aff4f6216-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:14 crc kubenswrapper[4751]: I1203 14:38:14.964101 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg2c4\" (UniqueName: \"kubernetes.io/projected/57bcc78c-1540-47ee-82f6-664aff4f6216-kube-api-access-xg2c4\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:14 crc kubenswrapper[4751]: I1203 14:38:14.964112 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57bcc78c-1540-47ee-82f6-664aff4f6216-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:14 crc kubenswrapper[4751]: I1203 14:38:14.964124 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57bcc78c-1540-47ee-82f6-664aff4f6216-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:15 crc kubenswrapper[4751]: I1203 14:38:15.296118 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ccbpw" event={"ID":"57bcc78c-1540-47ee-82f6-664aff4f6216","Type":"ContainerDied","Data":"2e815fbb08a821b6c1eb56d0aec40a285817f6be3e6d05d48107740f171b9c94"} Dec 03 14:38:15 crc kubenswrapper[4751]: I1203 14:38:15.296171 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e815fbb08a821b6c1eb56d0aec40a285817f6be3e6d05d48107740f171b9c94" Dec 03 14:38:15 crc kubenswrapper[4751]: I1203 14:38:15.296505 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ccbpw" Dec 03 14:38:15 crc kubenswrapper[4751]: I1203 14:38:15.603784 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 14:38:15 crc kubenswrapper[4751]: I1203 14:38:15.603950 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b6ba00e8-4494-44a7-9042-8286d9a0d6b1" containerName="nova-scheduler-scheduler" containerID="cri-o://f1150d0b4a2de00be8031fc5ac97d6bf2fd2427c1e564a0fcf8018d0f3bcc837" gracePeriod=30 Dec 03 14:38:15 crc kubenswrapper[4751]: I1203 14:38:15.631203 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 14:38:15 crc kubenswrapper[4751]: I1203 14:38:15.631460 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="14a4ea42-438e-40ad-b3fa-71db5808ff98" containerName="nova-api-log" containerID="cri-o://36eb2c88ba5c1064fa1bb53585e7e861e35b0aa036ef84a6105a593a6f481952" gracePeriod=30 Dec 03 14:38:15 crc kubenswrapper[4751]: I1203 14:38:15.631893 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="14a4ea42-438e-40ad-b3fa-71db5808ff98" containerName="nova-api-api" containerID="cri-o://fb6dc7ade032c94c94351904e8badbdf803c28db6fd66fb458b2afb68038c023" gracePeriod=30 Dec 03 14:38:15 crc kubenswrapper[4751]: I1203 14:38:15.653245 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 14:38:15 crc kubenswrapper[4751]: I1203 14:38:15.653516 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f7e03c08-ba92-441c-a040-fa1e498c9c4d" containerName="nova-metadata-log" containerID="cri-o://be3721df0ddb932a059080d38f7e97401a229476a16f03d70b4381c496f0a76f" gracePeriod=30 Dec 03 14:38:15 crc kubenswrapper[4751]: I1203 14:38:15.654131 4751 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f7e03c08-ba92-441c-a040-fa1e498c9c4d" containerName="nova-metadata-metadata" containerID="cri-o://322281b17854e365594fc73234be9680fba78e48a9992ba1e632151d1927ed64" gracePeriod=30 Dec 03 14:38:15 crc kubenswrapper[4751]: I1203 14:38:15.967161 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-lhljr" Dec 03 14:38:15 crc kubenswrapper[4751]: I1203 14:38:15.987649 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwqrq\" (UniqueName: \"kubernetes.io/projected/528661c5-7b80-48d3-b8fd-7d20c23932f7-kube-api-access-nwqrq\") pod \"528661c5-7b80-48d3-b8fd-7d20c23932f7\" (UID: \"528661c5-7b80-48d3-b8fd-7d20c23932f7\") " Dec 03 14:38:15 crc kubenswrapper[4751]: I1203 14:38:15.987721 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/528661c5-7b80-48d3-b8fd-7d20c23932f7-combined-ca-bundle\") pod \"528661c5-7b80-48d3-b8fd-7d20c23932f7\" (UID: \"528661c5-7b80-48d3-b8fd-7d20c23932f7\") " Dec 03 14:38:15 crc kubenswrapper[4751]: I1203 14:38:15.987792 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/528661c5-7b80-48d3-b8fd-7d20c23932f7-scripts\") pod \"528661c5-7b80-48d3-b8fd-7d20c23932f7\" (UID: \"528661c5-7b80-48d3-b8fd-7d20c23932f7\") " Dec 03 14:38:15 crc kubenswrapper[4751]: I1203 14:38:15.987946 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/528661c5-7b80-48d3-b8fd-7d20c23932f7-config-data\") pod \"528661c5-7b80-48d3-b8fd-7d20c23932f7\" (UID: \"528661c5-7b80-48d3-b8fd-7d20c23932f7\") " Dec 03 14:38:15 crc kubenswrapper[4751]: I1203 14:38:15.994241 4751 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/528661c5-7b80-48d3-b8fd-7d20c23932f7-kube-api-access-nwqrq" (OuterVolumeSpecName: "kube-api-access-nwqrq") pod "528661c5-7b80-48d3-b8fd-7d20c23932f7" (UID: "528661c5-7b80-48d3-b8fd-7d20c23932f7"). InnerVolumeSpecName "kube-api-access-nwqrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.033437 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/528661c5-7b80-48d3-b8fd-7d20c23932f7-scripts" (OuterVolumeSpecName: "scripts") pod "528661c5-7b80-48d3-b8fd-7d20c23932f7" (UID: "528661c5-7b80-48d3-b8fd-7d20c23932f7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.056304 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/528661c5-7b80-48d3-b8fd-7d20c23932f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "528661c5-7b80-48d3-b8fd-7d20c23932f7" (UID: "528661c5-7b80-48d3-b8fd-7d20c23932f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.060101 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/528661c5-7b80-48d3-b8fd-7d20c23932f7-config-data" (OuterVolumeSpecName: "config-data") pod "528661c5-7b80-48d3-b8fd-7d20c23932f7" (UID: "528661c5-7b80-48d3-b8fd-7d20c23932f7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.092207 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwqrq\" (UniqueName: \"kubernetes.io/projected/528661c5-7b80-48d3-b8fd-7d20c23932f7-kube-api-access-nwqrq\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.092257 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/528661c5-7b80-48d3-b8fd-7d20c23932f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.092270 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/528661c5-7b80-48d3-b8fd-7d20c23932f7-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.092282 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/528661c5-7b80-48d3-b8fd-7d20c23932f7-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.310019 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-lhljr" event={"ID":"528661c5-7b80-48d3-b8fd-7d20c23932f7","Type":"ContainerDied","Data":"972159480fb5a96a65b66f2d0f274f39ecb1cd5ee2297c706fe71bbcedde13bb"} Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.310080 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="972159480fb5a96a65b66f2d0f274f39ecb1cd5ee2297c706fe71bbcedde13bb" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.310046 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-lhljr" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.313005 4751 generic.go:334] "Generic (PLEG): container finished" podID="14a4ea42-438e-40ad-b3fa-71db5808ff98" containerID="36eb2c88ba5c1064fa1bb53585e7e861e35b0aa036ef84a6105a593a6f481952" exitCode=143 Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.313053 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"14a4ea42-438e-40ad-b3fa-71db5808ff98","Type":"ContainerDied","Data":"36eb2c88ba5c1064fa1bb53585e7e861e35b0aa036ef84a6105a593a6f481952"} Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.315805 4751 generic.go:334] "Generic (PLEG): container finished" podID="f7e03c08-ba92-441c-a040-fa1e498c9c4d" containerID="322281b17854e365594fc73234be9680fba78e48a9992ba1e632151d1927ed64" exitCode=0 Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.315821 4751 generic.go:334] "Generic (PLEG): container finished" podID="f7e03c08-ba92-441c-a040-fa1e498c9c4d" containerID="be3721df0ddb932a059080d38f7e97401a229476a16f03d70b4381c496f0a76f" exitCode=143 Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.315835 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f7e03c08-ba92-441c-a040-fa1e498c9c4d","Type":"ContainerDied","Data":"322281b17854e365594fc73234be9680fba78e48a9992ba1e632151d1927ed64"} Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.315850 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f7e03c08-ba92-441c-a040-fa1e498c9c4d","Type":"ContainerDied","Data":"be3721df0ddb932a059080d38f7e97401a229476a16f03d70b4381c496f0a76f"} Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.424503 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 14:38:16 crc kubenswrapper[4751]: E1203 14:38:16.425230 4751 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="528661c5-7b80-48d3-b8fd-7d20c23932f7" containerName="nova-cell1-conductor-db-sync" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.425246 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="528661c5-7b80-48d3-b8fd-7d20c23932f7" containerName="nova-cell1-conductor-db-sync" Dec 03 14:38:16 crc kubenswrapper[4751]: E1203 14:38:16.425292 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57bcc78c-1540-47ee-82f6-664aff4f6216" containerName="nova-manage" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.425300 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="57bcc78c-1540-47ee-82f6-664aff4f6216" containerName="nova-manage" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.425485 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="57bcc78c-1540-47ee-82f6-664aff4f6216" containerName="nova-manage" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.425513 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="528661c5-7b80-48d3-b8fd-7d20c23932f7" containerName="nova-cell1-conductor-db-sync" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.426288 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.427435 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.428446 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.442763 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.500733 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7e03c08-ba92-441c-a040-fa1e498c9c4d-config-data\") pod \"f7e03c08-ba92-441c-a040-fa1e498c9c4d\" (UID: \"f7e03c08-ba92-441c-a040-fa1e498c9c4d\") " Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.500936 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7e03c08-ba92-441c-a040-fa1e498c9c4d-nova-metadata-tls-certs\") pod \"f7e03c08-ba92-441c-a040-fa1e498c9c4d\" (UID: \"f7e03c08-ba92-441c-a040-fa1e498c9c4d\") " Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.501044 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7e03c08-ba92-441c-a040-fa1e498c9c4d-combined-ca-bundle\") pod \"f7e03c08-ba92-441c-a040-fa1e498c9c4d\" (UID: \"f7e03c08-ba92-441c-a040-fa1e498c9c4d\") " Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.501099 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4w22t\" (UniqueName: \"kubernetes.io/projected/f7e03c08-ba92-441c-a040-fa1e498c9c4d-kube-api-access-4w22t\") pod \"f7e03c08-ba92-441c-a040-fa1e498c9c4d\" (UID: \"f7e03c08-ba92-441c-a040-fa1e498c9c4d\") " Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.501179 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/f7e03c08-ba92-441c-a040-fa1e498c9c4d-logs\") pod \"f7e03c08-ba92-441c-a040-fa1e498c9c4d\" (UID: \"f7e03c08-ba92-441c-a040-fa1e498c9c4d\") " Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.501536 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2chx9\" (UniqueName: \"kubernetes.io/projected/730eedc8-ac64-4f53-80d0-ec824459f08c-kube-api-access-2chx9\") pod \"nova-cell1-conductor-0\" (UID: \"730eedc8-ac64-4f53-80d0-ec824459f08c\") " pod="openstack/nova-cell1-conductor-0" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.501575 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/730eedc8-ac64-4f53-80d0-ec824459f08c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"730eedc8-ac64-4f53-80d0-ec824459f08c\") " pod="openstack/nova-cell1-conductor-0" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.501735 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7e03c08-ba92-441c-a040-fa1e498c9c4d-logs" (OuterVolumeSpecName: "logs") pod "f7e03c08-ba92-441c-a040-fa1e498c9c4d" (UID: "f7e03c08-ba92-441c-a040-fa1e498c9c4d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.501902 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/730eedc8-ac64-4f53-80d0-ec824459f08c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"730eedc8-ac64-4f53-80d0-ec824459f08c\") " pod="openstack/nova-cell1-conductor-0" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.502086 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7e03c08-ba92-441c-a040-fa1e498c9c4d-logs\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.519548 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7e03c08-ba92-441c-a040-fa1e498c9c4d-kube-api-access-4w22t" (OuterVolumeSpecName: "kube-api-access-4w22t") pod "f7e03c08-ba92-441c-a040-fa1e498c9c4d" (UID: "f7e03c08-ba92-441c-a040-fa1e498c9c4d"). InnerVolumeSpecName "kube-api-access-4w22t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.536178 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7e03c08-ba92-441c-a040-fa1e498c9c4d-config-data" (OuterVolumeSpecName: "config-data") pod "f7e03c08-ba92-441c-a040-fa1e498c9c4d" (UID: "f7e03c08-ba92-441c-a040-fa1e498c9c4d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.552342 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7e03c08-ba92-441c-a040-fa1e498c9c4d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7e03c08-ba92-441c-a040-fa1e498c9c4d" (UID: "f7e03c08-ba92-441c-a040-fa1e498c9c4d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.563358 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7e03c08-ba92-441c-a040-fa1e498c9c4d-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "f7e03c08-ba92-441c-a040-fa1e498c9c4d" (UID: "f7e03c08-ba92-441c-a040-fa1e498c9c4d"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.605185 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2chx9\" (UniqueName: \"kubernetes.io/projected/730eedc8-ac64-4f53-80d0-ec824459f08c-kube-api-access-2chx9\") pod \"nova-cell1-conductor-0\" (UID: \"730eedc8-ac64-4f53-80d0-ec824459f08c\") " pod="openstack/nova-cell1-conductor-0" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.605248 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/730eedc8-ac64-4f53-80d0-ec824459f08c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"730eedc8-ac64-4f53-80d0-ec824459f08c\") " pod="openstack/nova-cell1-conductor-0" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.605491 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/730eedc8-ac64-4f53-80d0-ec824459f08c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"730eedc8-ac64-4f53-80d0-ec824459f08c\") " pod="openstack/nova-cell1-conductor-0" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.605560 4751 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7e03c08-ba92-441c-a040-fa1e498c9c4d-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 
14:38:16.605581 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7e03c08-ba92-441c-a040-fa1e498c9c4d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.605591 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4w22t\" (UniqueName: \"kubernetes.io/projected/f7e03c08-ba92-441c-a040-fa1e498c9c4d-kube-api-access-4w22t\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.605601 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7e03c08-ba92-441c-a040-fa1e498c9c4d-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.609887 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/730eedc8-ac64-4f53-80d0-ec824459f08c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"730eedc8-ac64-4f53-80d0-ec824459f08c\") " pod="openstack/nova-cell1-conductor-0" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.618220 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/730eedc8-ac64-4f53-80d0-ec824459f08c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"730eedc8-ac64-4f53-80d0-ec824459f08c\") " pod="openstack/nova-cell1-conductor-0" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.623848 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2chx9\" (UniqueName: \"kubernetes.io/projected/730eedc8-ac64-4f53-80d0-ec824459f08c-kube-api-access-2chx9\") pod \"nova-cell1-conductor-0\" (UID: \"730eedc8-ac64-4f53-80d0-ec824459f08c\") " pod="openstack/nova-cell1-conductor-0" Dec 03 14:38:16 crc kubenswrapper[4751]: I1203 14:38:16.749315 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.277849 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.357849 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"730eedc8-ac64-4f53-80d0-ec824459f08c","Type":"ContainerStarted","Data":"f24f7ee22450a7c8a293c7ead3282ff2880b1a96a08cc9ee02a47e8ba94bab0c"} Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.362431 4751 generic.go:334] "Generic (PLEG): container finished" podID="b6ba00e8-4494-44a7-9042-8286d9a0d6b1" containerID="f1150d0b4a2de00be8031fc5ac97d6bf2fd2427c1e564a0fcf8018d0f3bcc837" exitCode=0 Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.362494 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b6ba00e8-4494-44a7-9042-8286d9a0d6b1","Type":"ContainerDied","Data":"f1150d0b4a2de00be8031fc5ac97d6bf2fd2427c1e564a0fcf8018d0f3bcc837"} Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.364155 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f7e03c08-ba92-441c-a040-fa1e498c9c4d","Type":"ContainerDied","Data":"52dd9827ccfbdb92d8d701cd2e2224fce755582db6c9817e1a8a21ee434d5638"} Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.364187 4751 scope.go:117] "RemoveContainer" containerID="322281b17854e365594fc73234be9680fba78e48a9992ba1e632151d1927ed64" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.364310 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.403725 4751 scope.go:117] "RemoveContainer" containerID="be3721df0ddb932a059080d38f7e97401a229476a16f03d70b4381c496f0a76f" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.414530 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.431919 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.449718 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 14:38:17 crc kubenswrapper[4751]: E1203 14:38:17.450174 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7e03c08-ba92-441c-a040-fa1e498c9c4d" containerName="nova-metadata-metadata" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.450189 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7e03c08-ba92-441c-a040-fa1e498c9c4d" containerName="nova-metadata-metadata" Dec 03 14:38:17 crc kubenswrapper[4751]: E1203 14:38:17.450203 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7e03c08-ba92-441c-a040-fa1e498c9c4d" containerName="nova-metadata-log" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.450211 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7e03c08-ba92-441c-a040-fa1e498c9c4d" containerName="nova-metadata-log" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.451527 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7e03c08-ba92-441c-a040-fa1e498c9c4d" containerName="nova-metadata-metadata" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.451665 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7e03c08-ba92-441c-a040-fa1e498c9c4d" containerName="nova-metadata-log" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.454552 4751 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.455635 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.457292 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.457810 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.465646 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.542159 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6ba00e8-4494-44a7-9042-8286d9a0d6b1-combined-ca-bundle\") pod \"b6ba00e8-4494-44a7-9042-8286d9a0d6b1\" (UID: \"b6ba00e8-4494-44a7-9042-8286d9a0d6b1\") " Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.542223 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6ba00e8-4494-44a7-9042-8286d9a0d6b1-config-data\") pod \"b6ba00e8-4494-44a7-9042-8286d9a0d6b1\" (UID: \"b6ba00e8-4494-44a7-9042-8286d9a0d6b1\") " Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.542438 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ft7xk\" (UniqueName: \"kubernetes.io/projected/b6ba00e8-4494-44a7-9042-8286d9a0d6b1-kube-api-access-ft7xk\") pod \"b6ba00e8-4494-44a7-9042-8286d9a0d6b1\" (UID: \"b6ba00e8-4494-44a7-9042-8286d9a0d6b1\") " Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.542852 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zmdc6\" (UniqueName: \"kubernetes.io/projected/0a0018be-7a12-4006-ae3c-0d5b60837a95-kube-api-access-zmdc6\") pod \"nova-metadata-0\" (UID: \"0a0018be-7a12-4006-ae3c-0d5b60837a95\") " pod="openstack/nova-metadata-0" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.542894 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a0018be-7a12-4006-ae3c-0d5b60837a95-config-data\") pod \"nova-metadata-0\" (UID: \"0a0018be-7a12-4006-ae3c-0d5b60837a95\") " pod="openstack/nova-metadata-0" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.542920 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a0018be-7a12-4006-ae3c-0d5b60837a95-logs\") pod \"nova-metadata-0\" (UID: \"0a0018be-7a12-4006-ae3c-0d5b60837a95\") " pod="openstack/nova-metadata-0" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.542966 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a0018be-7a12-4006-ae3c-0d5b60837a95-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0a0018be-7a12-4006-ae3c-0d5b60837a95\") " pod="openstack/nova-metadata-0" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.543053 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a0018be-7a12-4006-ae3c-0d5b60837a95-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0a0018be-7a12-4006-ae3c-0d5b60837a95\") " pod="openstack/nova-metadata-0" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.572170 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6ba00e8-4494-44a7-9042-8286d9a0d6b1-kube-api-access-ft7xk" (OuterVolumeSpecName: 
"kube-api-access-ft7xk") pod "b6ba00e8-4494-44a7-9042-8286d9a0d6b1" (UID: "b6ba00e8-4494-44a7-9042-8286d9a0d6b1"). InnerVolumeSpecName "kube-api-access-ft7xk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.580292 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6ba00e8-4494-44a7-9042-8286d9a0d6b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6ba00e8-4494-44a7-9042-8286d9a0d6b1" (UID: "b6ba00e8-4494-44a7-9042-8286d9a0d6b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.619769 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6ba00e8-4494-44a7-9042-8286d9a0d6b1-config-data" (OuterVolumeSpecName: "config-data") pod "b6ba00e8-4494-44a7-9042-8286d9a0d6b1" (UID: "b6ba00e8-4494-44a7-9042-8286d9a0d6b1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.644795 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a0018be-7a12-4006-ae3c-0d5b60837a95-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0a0018be-7a12-4006-ae3c-0d5b60837a95\") " pod="openstack/nova-metadata-0" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.644956 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmdc6\" (UniqueName: \"kubernetes.io/projected/0a0018be-7a12-4006-ae3c-0d5b60837a95-kube-api-access-zmdc6\") pod \"nova-metadata-0\" (UID: \"0a0018be-7a12-4006-ae3c-0d5b60837a95\") " pod="openstack/nova-metadata-0" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.644994 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a0018be-7a12-4006-ae3c-0d5b60837a95-config-data\") pod \"nova-metadata-0\" (UID: \"0a0018be-7a12-4006-ae3c-0d5b60837a95\") " pod="openstack/nova-metadata-0" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.645028 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a0018be-7a12-4006-ae3c-0d5b60837a95-logs\") pod \"nova-metadata-0\" (UID: \"0a0018be-7a12-4006-ae3c-0d5b60837a95\") " pod="openstack/nova-metadata-0" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.645082 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a0018be-7a12-4006-ae3c-0d5b60837a95-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0a0018be-7a12-4006-ae3c-0d5b60837a95\") " pod="openstack/nova-metadata-0" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.645179 4751 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-ft7xk\" (UniqueName: \"kubernetes.io/projected/b6ba00e8-4494-44a7-9042-8286d9a0d6b1-kube-api-access-ft7xk\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.645195 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6ba00e8-4494-44a7-9042-8286d9a0d6b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.645205 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6ba00e8-4494-44a7-9042-8286d9a0d6b1-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.650228 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a0018be-7a12-4006-ae3c-0d5b60837a95-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0a0018be-7a12-4006-ae3c-0d5b60837a95\") " pod="openstack/nova-metadata-0" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.650748 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a0018be-7a12-4006-ae3c-0d5b60837a95-logs\") pod \"nova-metadata-0\" (UID: \"0a0018be-7a12-4006-ae3c-0d5b60837a95\") " pod="openstack/nova-metadata-0" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.650893 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a0018be-7a12-4006-ae3c-0d5b60837a95-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0a0018be-7a12-4006-ae3c-0d5b60837a95\") " pod="openstack/nova-metadata-0" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.654814 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a0018be-7a12-4006-ae3c-0d5b60837a95-config-data\") 
pod \"nova-metadata-0\" (UID: \"0a0018be-7a12-4006-ae3c-0d5b60837a95\") " pod="openstack/nova-metadata-0" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.686909 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmdc6\" (UniqueName: \"kubernetes.io/projected/0a0018be-7a12-4006-ae3c-0d5b60837a95-kube-api-access-zmdc6\") pod \"nova-metadata-0\" (UID: \"0a0018be-7a12-4006-ae3c-0d5b60837a95\") " pod="openstack/nova-metadata-0" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.787850 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.856480 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-884c8b8f5-5szns" Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.940967 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58bd69657f-28zsl"] Dec 03 14:38:17 crc kubenswrapper[4751]: I1203 14:38:17.942015 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58bd69657f-28zsl" podUID="f7f79d9f-538b-49fc-858d-169a04c6819e" containerName="dnsmasq-dns" containerID="cri-o://2e4d51d2b8775307fa843835316554a7f0416e9814368080a84097f53368e022" gracePeriod=10 Dec 03 14:38:18 crc kubenswrapper[4751]: I1203 14:38:18.335463 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 14:38:18 crc kubenswrapper[4751]: W1203 14:38:18.358879 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a0018be_7a12_4006_ae3c_0d5b60837a95.slice/crio-8a0d6f9780502bbd5df54061dc7d90e092573634bc4d7e2a0a2dad7f695bf441 WatchSource:0}: Error finding container 8a0d6f9780502bbd5df54061dc7d90e092573634bc4d7e2a0a2dad7f695bf441: Status 404 returned error can't find the container with id 
8a0d6f9780502bbd5df54061dc7d90e092573634bc4d7e2a0a2dad7f695bf441 Dec 03 14:38:18 crc kubenswrapper[4751]: I1203 14:38:18.359006 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-58bd69657f-28zsl" podUID="f7f79d9f-538b-49fc-858d-169a04c6819e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.187:5353: connect: connection refused" Dec 03 14:38:18 crc kubenswrapper[4751]: I1203 14:38:18.382994 4751 generic.go:334] "Generic (PLEG): container finished" podID="f7f79d9f-538b-49fc-858d-169a04c6819e" containerID="2e4d51d2b8775307fa843835316554a7f0416e9814368080a84097f53368e022" exitCode=0 Dec 03 14:38:18 crc kubenswrapper[4751]: I1203 14:38:18.383069 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58bd69657f-28zsl" event={"ID":"f7f79d9f-538b-49fc-858d-169a04c6819e","Type":"ContainerDied","Data":"2e4d51d2b8775307fa843835316554a7f0416e9814368080a84097f53368e022"} Dec 03 14:38:18 crc kubenswrapper[4751]: I1203 14:38:18.393535 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"730eedc8-ac64-4f53-80d0-ec824459f08c","Type":"ContainerStarted","Data":"bb0fa3829734fc9d6f260307df5560efc23a7cffa38162afee5f469a7210564a"} Dec 03 14:38:18 crc kubenswrapper[4751]: I1203 14:38:18.393905 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 03 14:38:18 crc kubenswrapper[4751]: I1203 14:38:18.396292 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0a0018be-7a12-4006-ae3c-0d5b60837a95","Type":"ContainerStarted","Data":"8a0d6f9780502bbd5df54061dc7d90e092573634bc4d7e2a0a2dad7f695bf441"} Dec 03 14:38:18 crc kubenswrapper[4751]: I1203 14:38:18.416036 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"b6ba00e8-4494-44a7-9042-8286d9a0d6b1","Type":"ContainerDied","Data":"635eea8707f0a4ad6419b5c5fd556faa1b3f15de6a26202427370809f230d6c7"} Dec 03 14:38:18 crc kubenswrapper[4751]: I1203 14:38:18.416149 4751 scope.go:117] "RemoveContainer" containerID="f1150d0b4a2de00be8031fc5ac97d6bf2fd2427c1e564a0fcf8018d0f3bcc837" Dec 03 14:38:18 crc kubenswrapper[4751]: I1203 14:38:18.416414 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 14:38:18 crc kubenswrapper[4751]: I1203 14:38:18.432152 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.432129551 podStartE2EDuration="2.432129551s" podCreationTimestamp="2025-12-03 14:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:38:18.413931389 +0000 UTC m=+1505.402286626" watchObservedRunningTime="2025-12-03 14:38:18.432129551 +0000 UTC m=+1505.420484768" Dec 03 14:38:18 crc kubenswrapper[4751]: I1203 14:38:18.467099 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 14:38:18 crc kubenswrapper[4751]: I1203 14:38:18.480233 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 14:38:18 crc kubenswrapper[4751]: I1203 14:38:18.500410 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 14:38:18 crc kubenswrapper[4751]: E1203 14:38:18.500967 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6ba00e8-4494-44a7-9042-8286d9a0d6b1" containerName="nova-scheduler-scheduler" Dec 03 14:38:18 crc kubenswrapper[4751]: I1203 14:38:18.501002 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6ba00e8-4494-44a7-9042-8286d9a0d6b1" containerName="nova-scheduler-scheduler" Dec 03 14:38:18 crc kubenswrapper[4751]: I1203 
14:38:18.501233 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6ba00e8-4494-44a7-9042-8286d9a0d6b1" containerName="nova-scheduler-scheduler" Dec 03 14:38:18 crc kubenswrapper[4751]: I1203 14:38:18.502153 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 14:38:18 crc kubenswrapper[4751]: I1203 14:38:18.505976 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 14:38:18 crc kubenswrapper[4751]: I1203 14:38:18.514289 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 14:38:18 crc kubenswrapper[4751]: I1203 14:38:18.567164 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hrwp\" (UniqueName: \"kubernetes.io/projected/3451b19b-19d8-4a15-8bd7-920525f3335f-kube-api-access-9hrwp\") pod \"nova-scheduler-0\" (UID: \"3451b19b-19d8-4a15-8bd7-920525f3335f\") " pod="openstack/nova-scheduler-0" Dec 03 14:38:18 crc kubenswrapper[4751]: I1203 14:38:18.567302 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3451b19b-19d8-4a15-8bd7-920525f3335f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3451b19b-19d8-4a15-8bd7-920525f3335f\") " pod="openstack/nova-scheduler-0" Dec 03 14:38:18 crc kubenswrapper[4751]: I1203 14:38:18.567361 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3451b19b-19d8-4a15-8bd7-920525f3335f-config-data\") pod \"nova-scheduler-0\" (UID: \"3451b19b-19d8-4a15-8bd7-920525f3335f\") " pod="openstack/nova-scheduler-0" Dec 03 14:38:18 crc kubenswrapper[4751]: I1203 14:38:18.669918 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hrwp\" 
(UniqueName: \"kubernetes.io/projected/3451b19b-19d8-4a15-8bd7-920525f3335f-kube-api-access-9hrwp\") pod \"nova-scheduler-0\" (UID: \"3451b19b-19d8-4a15-8bd7-920525f3335f\") " pod="openstack/nova-scheduler-0" Dec 03 14:38:18 crc kubenswrapper[4751]: I1203 14:38:18.670275 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3451b19b-19d8-4a15-8bd7-920525f3335f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3451b19b-19d8-4a15-8bd7-920525f3335f\") " pod="openstack/nova-scheduler-0" Dec 03 14:38:18 crc kubenswrapper[4751]: I1203 14:38:18.670308 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3451b19b-19d8-4a15-8bd7-920525f3335f-config-data\") pod \"nova-scheduler-0\" (UID: \"3451b19b-19d8-4a15-8bd7-920525f3335f\") " pod="openstack/nova-scheduler-0" Dec 03 14:38:18 crc kubenswrapper[4751]: I1203 14:38:18.675318 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3451b19b-19d8-4a15-8bd7-920525f3335f-config-data\") pod \"nova-scheduler-0\" (UID: \"3451b19b-19d8-4a15-8bd7-920525f3335f\") " pod="openstack/nova-scheduler-0" Dec 03 14:38:18 crc kubenswrapper[4751]: I1203 14:38:18.688973 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hrwp\" (UniqueName: \"kubernetes.io/projected/3451b19b-19d8-4a15-8bd7-920525f3335f-kube-api-access-9hrwp\") pod \"nova-scheduler-0\" (UID: \"3451b19b-19d8-4a15-8bd7-920525f3335f\") " pod="openstack/nova-scheduler-0" Dec 03 14:38:18 crc kubenswrapper[4751]: I1203 14:38:18.696616 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3451b19b-19d8-4a15-8bd7-920525f3335f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3451b19b-19d8-4a15-8bd7-920525f3335f\") " 
pod="openstack/nova-scheduler-0" Dec 03 14:38:18 crc kubenswrapper[4751]: I1203 14:38:18.851858 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 14:38:19 crc kubenswrapper[4751]: I1203 14:38:19.094035 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58bd69657f-28zsl" Dec 03 14:38:19 crc kubenswrapper[4751]: I1203 14:38:19.179136 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brv2z\" (UniqueName: \"kubernetes.io/projected/f7f79d9f-538b-49fc-858d-169a04c6819e-kube-api-access-brv2z\") pod \"f7f79d9f-538b-49fc-858d-169a04c6819e\" (UID: \"f7f79d9f-538b-49fc-858d-169a04c6819e\") " Dec 03 14:38:19 crc kubenswrapper[4751]: I1203 14:38:19.179199 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7f79d9f-538b-49fc-858d-169a04c6819e-ovsdbserver-nb\") pod \"f7f79d9f-538b-49fc-858d-169a04c6819e\" (UID: \"f7f79d9f-538b-49fc-858d-169a04c6819e\") " Dec 03 14:38:19 crc kubenswrapper[4751]: I1203 14:38:19.179219 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7f79d9f-538b-49fc-858d-169a04c6819e-config\") pod \"f7f79d9f-538b-49fc-858d-169a04c6819e\" (UID: \"f7f79d9f-538b-49fc-858d-169a04c6819e\") " Dec 03 14:38:19 crc kubenswrapper[4751]: I1203 14:38:19.179413 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7f79d9f-538b-49fc-858d-169a04c6819e-dns-swift-storage-0\") pod \"f7f79d9f-538b-49fc-858d-169a04c6819e\" (UID: \"f7f79d9f-538b-49fc-858d-169a04c6819e\") " Dec 03 14:38:19 crc kubenswrapper[4751]: I1203 14:38:19.179440 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f7f79d9f-538b-49fc-858d-169a04c6819e-dns-svc\") pod \"f7f79d9f-538b-49fc-858d-169a04c6819e\" (UID: \"f7f79d9f-538b-49fc-858d-169a04c6819e\") " Dec 03 14:38:19 crc kubenswrapper[4751]: I1203 14:38:19.179484 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7f79d9f-538b-49fc-858d-169a04c6819e-ovsdbserver-sb\") pod \"f7f79d9f-538b-49fc-858d-169a04c6819e\" (UID: \"f7f79d9f-538b-49fc-858d-169a04c6819e\") " Dec 03 14:38:19 crc kubenswrapper[4751]: I1203 14:38:19.187875 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7f79d9f-538b-49fc-858d-169a04c6819e-kube-api-access-brv2z" (OuterVolumeSpecName: "kube-api-access-brv2z") pod "f7f79d9f-538b-49fc-858d-169a04c6819e" (UID: "f7f79d9f-538b-49fc-858d-169a04c6819e"). InnerVolumeSpecName "kube-api-access-brv2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:38:19 crc kubenswrapper[4751]: I1203 14:38:19.256058 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7f79d9f-538b-49fc-858d-169a04c6819e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f7f79d9f-538b-49fc-858d-169a04c6819e" (UID: "f7f79d9f-538b-49fc-858d-169a04c6819e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:38:19 crc kubenswrapper[4751]: I1203 14:38:19.256688 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7f79d9f-538b-49fc-858d-169a04c6819e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f7f79d9f-538b-49fc-858d-169a04c6819e" (UID: "f7f79d9f-538b-49fc-858d-169a04c6819e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:38:19 crc kubenswrapper[4751]: I1203 14:38:19.282038 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7f79d9f-538b-49fc-858d-169a04c6819e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f7f79d9f-538b-49fc-858d-169a04c6819e" (UID: "f7f79d9f-538b-49fc-858d-169a04c6819e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:38:19 crc kubenswrapper[4751]: I1203 14:38:19.282537 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brv2z\" (UniqueName: \"kubernetes.io/projected/f7f79d9f-538b-49fc-858d-169a04c6819e-kube-api-access-brv2z\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:19 crc kubenswrapper[4751]: I1203 14:38:19.282569 4751 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7f79d9f-538b-49fc-858d-169a04c6819e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:19 crc kubenswrapper[4751]: I1203 14:38:19.282584 4751 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7f79d9f-538b-49fc-858d-169a04c6819e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:19 crc kubenswrapper[4751]: I1203 14:38:19.282593 4751 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7f79d9f-538b-49fc-858d-169a04c6819e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:19 crc kubenswrapper[4751]: I1203 14:38:19.292204 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7f79d9f-538b-49fc-858d-169a04c6819e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f7f79d9f-538b-49fc-858d-169a04c6819e" (UID: "f7f79d9f-538b-49fc-858d-169a04c6819e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:38:19 crc kubenswrapper[4751]: I1203 14:38:19.306857 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7f79d9f-538b-49fc-858d-169a04c6819e-config" (OuterVolumeSpecName: "config") pod "f7f79d9f-538b-49fc-858d-169a04c6819e" (UID: "f7f79d9f-538b-49fc-858d-169a04c6819e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:38:19 crc kubenswrapper[4751]: I1203 14:38:19.336459 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6ba00e8-4494-44a7-9042-8286d9a0d6b1" path="/var/lib/kubelet/pods/b6ba00e8-4494-44a7-9042-8286d9a0d6b1/volumes" Dec 03 14:38:19 crc kubenswrapper[4751]: I1203 14:38:19.337184 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7e03c08-ba92-441c-a040-fa1e498c9c4d" path="/var/lib/kubelet/pods/f7e03c08-ba92-441c-a040-fa1e498c9c4d/volumes" Dec 03 14:38:19 crc kubenswrapper[4751]: I1203 14:38:19.384944 4751 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7f79d9f-538b-49fc-858d-169a04c6819e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:19 crc kubenswrapper[4751]: I1203 14:38:19.384981 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7f79d9f-538b-49fc-858d-169a04c6819e-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:19 crc kubenswrapper[4751]: I1203 14:38:19.435348 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58bd69657f-28zsl" Dec 03 14:38:19 crc kubenswrapper[4751]: I1203 14:38:19.436215 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58bd69657f-28zsl" event={"ID":"f7f79d9f-538b-49fc-858d-169a04c6819e","Type":"ContainerDied","Data":"d171b742d978f32332d8ef45ac8390aff21d3d44174bd25f46b897ab5b1ee7c7"} Dec 03 14:38:19 crc kubenswrapper[4751]: I1203 14:38:19.436254 4751 scope.go:117] "RemoveContainer" containerID="2e4d51d2b8775307fa843835316554a7f0416e9814368080a84097f53368e022" Dec 03 14:38:19 crc kubenswrapper[4751]: I1203 14:38:19.442175 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 14:38:19 crc kubenswrapper[4751]: I1203 14:38:19.442477 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0a0018be-7a12-4006-ae3c-0d5b60837a95","Type":"ContainerStarted","Data":"cadfa921d66bd20ee697ec890d6a9c90ade6f816a31c6012b9ec7a167cd2fb29"} Dec 03 14:38:19 crc kubenswrapper[4751]: I1203 14:38:19.442501 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0a0018be-7a12-4006-ae3c-0d5b60837a95","Type":"ContainerStarted","Data":"28ceaa88466c0c30df0cce89a61465f5879b57437200204b85f7092668787149"} Dec 03 14:38:19 crc kubenswrapper[4751]: W1203 14:38:19.448043 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3451b19b_19d8_4a15_8bd7_920525f3335f.slice/crio-c2b0c2aca4694db51ab09b6e38b07b9115189630864288bf12fb18916cf53ee8 WatchSource:0}: Error finding container c2b0c2aca4694db51ab09b6e38b07b9115189630864288bf12fb18916cf53ee8: Status 404 returned error can't find the container with id c2b0c2aca4694db51ab09b6e38b07b9115189630864288bf12fb18916cf53ee8 Dec 03 14:38:19 crc kubenswrapper[4751]: I1203 14:38:19.478964 4751 scope.go:117] "RemoveContainer" 
containerID="27841f4fb675967760579ec19a6f06485f884e280b1acff64e5af196383c0698" Dec 03 14:38:19 crc kubenswrapper[4751]: I1203 14:38:19.479159 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58bd69657f-28zsl"] Dec 03 14:38:19 crc kubenswrapper[4751]: I1203 14:38:19.490827 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58bd69657f-28zsl"] Dec 03 14:38:19 crc kubenswrapper[4751]: I1203 14:38:19.499173 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.499150862 podStartE2EDuration="2.499150862s" podCreationTimestamp="2025-12-03 14:38:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:38:19.471642473 +0000 UTC m=+1506.459997720" watchObservedRunningTime="2025-12-03 14:38:19.499150862 +0000 UTC m=+1506.487506079" Dec 03 14:38:20 crc kubenswrapper[4751]: I1203 14:38:20.475284 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3451b19b-19d8-4a15-8bd7-920525f3335f","Type":"ContainerStarted","Data":"3a704fbee886f7b3a6aaddbec352b42e09fb12bb46e20a03e223b223bf68c150"} Dec 03 14:38:20 crc kubenswrapper[4751]: I1203 14:38:20.475710 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3451b19b-19d8-4a15-8bd7-920525f3335f","Type":"ContainerStarted","Data":"c2b0c2aca4694db51ab09b6e38b07b9115189630864288bf12fb18916cf53ee8"} Dec 03 14:38:20 crc kubenswrapper[4751]: I1203 14:38:20.478184 4751 generic.go:334] "Generic (PLEG): container finished" podID="14a4ea42-438e-40ad-b3fa-71db5808ff98" containerID="fb6dc7ade032c94c94351904e8badbdf803c28db6fd66fb458b2afb68038c023" exitCode=0 Dec 03 14:38:20 crc kubenswrapper[4751]: I1203 14:38:20.478247 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"14a4ea42-438e-40ad-b3fa-71db5808ff98","Type":"ContainerDied","Data":"fb6dc7ade032c94c94351904e8badbdf803c28db6fd66fb458b2afb68038c023"} Dec 03 14:38:20 crc kubenswrapper[4751]: I1203 14:38:20.544067 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.544047597 podStartE2EDuration="2.544047597s" podCreationTimestamp="2025-12-03 14:38:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:38:20.518363805 +0000 UTC m=+1507.506719032" watchObservedRunningTime="2025-12-03 14:38:20.544047597 +0000 UTC m=+1507.532402814" Dec 03 14:38:20 crc kubenswrapper[4751]: I1203 14:38:20.728714 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 14:38:20 crc kubenswrapper[4751]: I1203 14:38:20.828581 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a4ea42-438e-40ad-b3fa-71db5808ff98-config-data\") pod \"14a4ea42-438e-40ad-b3fa-71db5808ff98\" (UID: \"14a4ea42-438e-40ad-b3fa-71db5808ff98\") " Dec 03 14:38:20 crc kubenswrapper[4751]: I1203 14:38:20.828680 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14a4ea42-438e-40ad-b3fa-71db5808ff98-logs\") pod \"14a4ea42-438e-40ad-b3fa-71db5808ff98\" (UID: \"14a4ea42-438e-40ad-b3fa-71db5808ff98\") " Dec 03 14:38:20 crc kubenswrapper[4751]: I1203 14:38:20.828715 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2v4lg\" (UniqueName: \"kubernetes.io/projected/14a4ea42-438e-40ad-b3fa-71db5808ff98-kube-api-access-2v4lg\") pod \"14a4ea42-438e-40ad-b3fa-71db5808ff98\" (UID: \"14a4ea42-438e-40ad-b3fa-71db5808ff98\") " Dec 03 14:38:20 crc kubenswrapper[4751]: I1203 14:38:20.828766 4751 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a4ea42-438e-40ad-b3fa-71db5808ff98-combined-ca-bundle\") pod \"14a4ea42-438e-40ad-b3fa-71db5808ff98\" (UID: \"14a4ea42-438e-40ad-b3fa-71db5808ff98\") " Dec 03 14:38:20 crc kubenswrapper[4751]: I1203 14:38:20.829074 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14a4ea42-438e-40ad-b3fa-71db5808ff98-logs" (OuterVolumeSpecName: "logs") pod "14a4ea42-438e-40ad-b3fa-71db5808ff98" (UID: "14a4ea42-438e-40ad-b3fa-71db5808ff98"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:38:20 crc kubenswrapper[4751]: I1203 14:38:20.829343 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14a4ea42-438e-40ad-b3fa-71db5808ff98-logs\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:20 crc kubenswrapper[4751]: I1203 14:38:20.833772 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14a4ea42-438e-40ad-b3fa-71db5808ff98-kube-api-access-2v4lg" (OuterVolumeSpecName: "kube-api-access-2v4lg") pod "14a4ea42-438e-40ad-b3fa-71db5808ff98" (UID: "14a4ea42-438e-40ad-b3fa-71db5808ff98"). InnerVolumeSpecName "kube-api-access-2v4lg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:38:20 crc kubenswrapper[4751]: I1203 14:38:20.858221 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a4ea42-438e-40ad-b3fa-71db5808ff98-config-data" (OuterVolumeSpecName: "config-data") pod "14a4ea42-438e-40ad-b3fa-71db5808ff98" (UID: "14a4ea42-438e-40ad-b3fa-71db5808ff98"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:20 crc kubenswrapper[4751]: I1203 14:38:20.864551 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a4ea42-438e-40ad-b3fa-71db5808ff98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14a4ea42-438e-40ad-b3fa-71db5808ff98" (UID: "14a4ea42-438e-40ad-b3fa-71db5808ff98"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:20 crc kubenswrapper[4751]: I1203 14:38:20.931417 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a4ea42-438e-40ad-b3fa-71db5808ff98-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:20 crc kubenswrapper[4751]: I1203 14:38:20.931450 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2v4lg\" (UniqueName: \"kubernetes.io/projected/14a4ea42-438e-40ad-b3fa-71db5808ff98-kube-api-access-2v4lg\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:20 crc kubenswrapper[4751]: I1203 14:38:20.931472 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a4ea42-438e-40ad-b3fa-71db5808ff98-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:21 crc kubenswrapper[4751]: I1203 14:38:21.204401 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 14:38:21 crc kubenswrapper[4751]: I1203 14:38:21.328676 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7f79d9f-538b-49fc-858d-169a04c6819e" path="/var/lib/kubelet/pods/f7f79d9f-538b-49fc-858d-169a04c6819e/volumes" Dec 03 14:38:21 crc kubenswrapper[4751]: I1203 14:38:21.492188 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 14:38:21 crc kubenswrapper[4751]: I1203 14:38:21.493169 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"14a4ea42-438e-40ad-b3fa-71db5808ff98","Type":"ContainerDied","Data":"ae2a1b3ae3412712ba16a8daf8111d6c65de92b252ea83a58595c2001d621b2d"} Dec 03 14:38:21 crc kubenswrapper[4751]: I1203 14:38:21.493206 4751 scope.go:117] "RemoveContainer" containerID="fb6dc7ade032c94c94351904e8badbdf803c28db6fd66fb458b2afb68038c023" Dec 03 14:38:21 crc kubenswrapper[4751]: I1203 14:38:21.531550 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 14:38:21 crc kubenswrapper[4751]: I1203 14:38:21.550999 4751 scope.go:117] "RemoveContainer" containerID="36eb2c88ba5c1064fa1bb53585e7e861e35b0aa036ef84a6105a593a6f481952" Dec 03 14:38:21 crc kubenswrapper[4751]: I1203 14:38:21.551447 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 14:38:21 crc kubenswrapper[4751]: I1203 14:38:21.577266 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 14:38:21 crc kubenswrapper[4751]: E1203 14:38:21.578381 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7f79d9f-538b-49fc-858d-169a04c6819e" containerName="init" Dec 03 14:38:21 crc kubenswrapper[4751]: I1203 14:38:21.578534 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7f79d9f-538b-49fc-858d-169a04c6819e" containerName="init" Dec 03 14:38:21 crc kubenswrapper[4751]: E1203 14:38:21.578626 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14a4ea42-438e-40ad-b3fa-71db5808ff98" containerName="nova-api-log" Dec 03 14:38:21 crc kubenswrapper[4751]: I1203 14:38:21.578704 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a4ea42-438e-40ad-b3fa-71db5808ff98" containerName="nova-api-log" Dec 03 14:38:21 crc kubenswrapper[4751]: E1203 14:38:21.578800 4751 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="14a4ea42-438e-40ad-b3fa-71db5808ff98" containerName="nova-api-api" Dec 03 14:38:21 crc kubenswrapper[4751]: I1203 14:38:21.578876 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a4ea42-438e-40ad-b3fa-71db5808ff98" containerName="nova-api-api" Dec 03 14:38:21 crc kubenswrapper[4751]: E1203 14:38:21.578963 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7f79d9f-538b-49fc-858d-169a04c6819e" containerName="dnsmasq-dns" Dec 03 14:38:21 crc kubenswrapper[4751]: I1203 14:38:21.579106 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7f79d9f-538b-49fc-858d-169a04c6819e" containerName="dnsmasq-dns" Dec 03 14:38:21 crc kubenswrapper[4751]: I1203 14:38:21.579607 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="14a4ea42-438e-40ad-b3fa-71db5808ff98" containerName="nova-api-log" Dec 03 14:38:21 crc kubenswrapper[4751]: I1203 14:38:21.579735 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="14a4ea42-438e-40ad-b3fa-71db5808ff98" containerName="nova-api-api" Dec 03 14:38:21 crc kubenswrapper[4751]: I1203 14:38:21.579838 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7f79d9f-538b-49fc-858d-169a04c6819e" containerName="dnsmasq-dns" Dec 03 14:38:21 crc kubenswrapper[4751]: I1203 14:38:21.586652 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 14:38:21 crc kubenswrapper[4751]: I1203 14:38:21.591811 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 14:38:21 crc kubenswrapper[4751]: I1203 14:38:21.592021 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 14:38:21 crc kubenswrapper[4751]: I1203 14:38:21.647025 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4c34aee-0ee0-45f5-b2ae-b87b966366ea-logs\") pod \"nova-api-0\" (UID: \"c4c34aee-0ee0-45f5-b2ae-b87b966366ea\") " pod="openstack/nova-api-0" Dec 03 14:38:21 crc kubenswrapper[4751]: I1203 14:38:21.647385 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4c34aee-0ee0-45f5-b2ae-b87b966366ea-config-data\") pod \"nova-api-0\" (UID: \"c4c34aee-0ee0-45f5-b2ae-b87b966366ea\") " pod="openstack/nova-api-0" Dec 03 14:38:21 crc kubenswrapper[4751]: I1203 14:38:21.647500 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58vc5\" (UniqueName: \"kubernetes.io/projected/c4c34aee-0ee0-45f5-b2ae-b87b966366ea-kube-api-access-58vc5\") pod \"nova-api-0\" (UID: \"c4c34aee-0ee0-45f5-b2ae-b87b966366ea\") " pod="openstack/nova-api-0" Dec 03 14:38:21 crc kubenswrapper[4751]: I1203 14:38:21.647624 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4c34aee-0ee0-45f5-b2ae-b87b966366ea-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c4c34aee-0ee0-45f5-b2ae-b87b966366ea\") " pod="openstack/nova-api-0" Dec 03 14:38:21 crc kubenswrapper[4751]: I1203 14:38:21.749750 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/c4c34aee-0ee0-45f5-b2ae-b87b966366ea-config-data\") pod \"nova-api-0\" (UID: \"c4c34aee-0ee0-45f5-b2ae-b87b966366ea\") " pod="openstack/nova-api-0" Dec 03 14:38:21 crc kubenswrapper[4751]: I1203 14:38:21.749800 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58vc5\" (UniqueName: \"kubernetes.io/projected/c4c34aee-0ee0-45f5-b2ae-b87b966366ea-kube-api-access-58vc5\") pod \"nova-api-0\" (UID: \"c4c34aee-0ee0-45f5-b2ae-b87b966366ea\") " pod="openstack/nova-api-0" Dec 03 14:38:21 crc kubenswrapper[4751]: I1203 14:38:21.749863 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4c34aee-0ee0-45f5-b2ae-b87b966366ea-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c4c34aee-0ee0-45f5-b2ae-b87b966366ea\") " pod="openstack/nova-api-0" Dec 03 14:38:21 crc kubenswrapper[4751]: I1203 14:38:21.749982 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4c34aee-0ee0-45f5-b2ae-b87b966366ea-logs\") pod \"nova-api-0\" (UID: \"c4c34aee-0ee0-45f5-b2ae-b87b966366ea\") " pod="openstack/nova-api-0" Dec 03 14:38:21 crc kubenswrapper[4751]: I1203 14:38:21.750400 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4c34aee-0ee0-45f5-b2ae-b87b966366ea-logs\") pod \"nova-api-0\" (UID: \"c4c34aee-0ee0-45f5-b2ae-b87b966366ea\") " pod="openstack/nova-api-0" Dec 03 14:38:21 crc kubenswrapper[4751]: I1203 14:38:21.754987 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4c34aee-0ee0-45f5-b2ae-b87b966366ea-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c4c34aee-0ee0-45f5-b2ae-b87b966366ea\") " pod="openstack/nova-api-0" Dec 03 14:38:21 crc kubenswrapper[4751]: I1203 14:38:21.755093 
4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4c34aee-0ee0-45f5-b2ae-b87b966366ea-config-data\") pod \"nova-api-0\" (UID: \"c4c34aee-0ee0-45f5-b2ae-b87b966366ea\") " pod="openstack/nova-api-0" Dec 03 14:38:21 crc kubenswrapper[4751]: I1203 14:38:21.767796 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58vc5\" (UniqueName: \"kubernetes.io/projected/c4c34aee-0ee0-45f5-b2ae-b87b966366ea-kube-api-access-58vc5\") pod \"nova-api-0\" (UID: \"c4c34aee-0ee0-45f5-b2ae-b87b966366ea\") " pod="openstack/nova-api-0" Dec 03 14:38:21 crc kubenswrapper[4751]: I1203 14:38:21.906204 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 14:38:22 crc kubenswrapper[4751]: I1203 14:38:22.377281 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 14:38:22 crc kubenswrapper[4751]: I1203 14:38:22.528806 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c4c34aee-0ee0-45f5-b2ae-b87b966366ea","Type":"ContainerStarted","Data":"b2fae7995a401c633ed02c22c558125aac8677667e561f3a73b2a350d6665418"} Dec 03 14:38:22 crc kubenswrapper[4751]: I1203 14:38:22.787932 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 14:38:22 crc kubenswrapper[4751]: I1203 14:38:22.787990 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 14:38:23 crc kubenswrapper[4751]: I1203 14:38:23.326377 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14a4ea42-438e-40ad-b3fa-71db5808ff98" path="/var/lib/kubelet/pods/14a4ea42-438e-40ad-b3fa-71db5808ff98/volumes" Dec 03 14:38:23 crc kubenswrapper[4751]: I1203 14:38:23.540170 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"c4c34aee-0ee0-45f5-b2ae-b87b966366ea","Type":"ContainerStarted","Data":"a7ca56ff46023a46ef741ce361b122aaa0fada509af6e5bf61dabcba9cc2c844"} Dec 03 14:38:23 crc kubenswrapper[4751]: I1203 14:38:23.540235 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c4c34aee-0ee0-45f5-b2ae-b87b966366ea","Type":"ContainerStarted","Data":"63ec04aa3f42c0b0a0b61777c74dad59cb60e6f1b7df9eca9b5a55e8f07b4a47"} Dec 03 14:38:23 crc kubenswrapper[4751]: I1203 14:38:23.561629 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.561606382 podStartE2EDuration="2.561606382s" podCreationTimestamp="2025-12-03 14:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:38:23.557962005 +0000 UTC m=+1510.546317232" watchObservedRunningTime="2025-12-03 14:38:23.561606382 +0000 UTC m=+1510.549961599" Dec 03 14:38:23 crc kubenswrapper[4751]: I1203 14:38:23.852768 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 14:38:25 crc kubenswrapper[4751]: I1203 14:38:25.211435 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 14:38:25 crc kubenswrapper[4751]: I1203 14:38:25.211644 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="80f132c3-7b27-4d3d-950e-9c6aa887b6a7" containerName="kube-state-metrics" containerID="cri-o://dd048db6124028ffa9366bf48ace771c675db3313b51ccfaf4839b11002fcf56" gracePeriod=30 Dec 03 14:38:25 crc kubenswrapper[4751]: I1203 14:38:25.576207 4751 generic.go:334] "Generic (PLEG): container finished" podID="80f132c3-7b27-4d3d-950e-9c6aa887b6a7" containerID="dd048db6124028ffa9366bf48ace771c675db3313b51ccfaf4839b11002fcf56" exitCode=2 Dec 03 14:38:25 crc kubenswrapper[4751]: I1203 14:38:25.576592 
4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"80f132c3-7b27-4d3d-950e-9c6aa887b6a7","Type":"ContainerDied","Data":"dd048db6124028ffa9366bf48ace771c675db3313b51ccfaf4839b11002fcf56"} Dec 03 14:38:25 crc kubenswrapper[4751]: I1203 14:38:25.848306 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 14:38:25 crc kubenswrapper[4751]: I1203 14:38:25.953369 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbsbh\" (UniqueName: \"kubernetes.io/projected/80f132c3-7b27-4d3d-950e-9c6aa887b6a7-kube-api-access-pbsbh\") pod \"80f132c3-7b27-4d3d-950e-9c6aa887b6a7\" (UID: \"80f132c3-7b27-4d3d-950e-9c6aa887b6a7\") " Dec 03 14:38:25 crc kubenswrapper[4751]: I1203 14:38:25.962430 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80f132c3-7b27-4d3d-950e-9c6aa887b6a7-kube-api-access-pbsbh" (OuterVolumeSpecName: "kube-api-access-pbsbh") pod "80f132c3-7b27-4d3d-950e-9c6aa887b6a7" (UID: "80f132c3-7b27-4d3d-950e-9c6aa887b6a7"). InnerVolumeSpecName "kube-api-access-pbsbh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:38:26 crc kubenswrapper[4751]: I1203 14:38:26.056289 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbsbh\" (UniqueName: \"kubernetes.io/projected/80f132c3-7b27-4d3d-950e-9c6aa887b6a7-kube-api-access-pbsbh\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:26 crc kubenswrapper[4751]: I1203 14:38:26.588656 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"80f132c3-7b27-4d3d-950e-9c6aa887b6a7","Type":"ContainerDied","Data":"17ecc36cc88f6b75871140183dc5b60c8a66fa6f3f7bbf503e944444b8215309"} Dec 03 14:38:26 crc kubenswrapper[4751]: I1203 14:38:26.588721 4751 scope.go:117] "RemoveContainer" containerID="dd048db6124028ffa9366bf48ace771c675db3313b51ccfaf4839b11002fcf56" Dec 03 14:38:26 crc kubenswrapper[4751]: I1203 14:38:26.588810 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 14:38:26 crc kubenswrapper[4751]: I1203 14:38:26.632894 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 14:38:26 crc kubenswrapper[4751]: I1203 14:38:26.650205 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 14:38:26 crc kubenswrapper[4751]: I1203 14:38:26.665246 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 14:38:26 crc kubenswrapper[4751]: E1203 14:38:26.665823 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80f132c3-7b27-4d3d-950e-9c6aa887b6a7" containerName="kube-state-metrics" Dec 03 14:38:26 crc kubenswrapper[4751]: I1203 14:38:26.665841 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="80f132c3-7b27-4d3d-950e-9c6aa887b6a7" containerName="kube-state-metrics" Dec 03 14:38:26 crc kubenswrapper[4751]: I1203 14:38:26.666034 4751 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="80f132c3-7b27-4d3d-950e-9c6aa887b6a7" containerName="kube-state-metrics" Dec 03 14:38:26 crc kubenswrapper[4751]: I1203 14:38:26.666880 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 14:38:26 crc kubenswrapper[4751]: I1203 14:38:26.671298 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 03 14:38:26 crc kubenswrapper[4751]: I1203 14:38:26.678786 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 03 14:38:26 crc kubenswrapper[4751]: I1203 14:38:26.712228 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 14:38:26 crc kubenswrapper[4751]: I1203 14:38:26.773265 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1ba2fcd-b1a4-42c8-a3a1-f84f3e198de2-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"b1ba2fcd-b1a4-42c8-a3a1-f84f3e198de2\") " pod="openstack/kube-state-metrics-0" Dec 03 14:38:26 crc kubenswrapper[4751]: I1203 14:38:26.773748 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b1ba2fcd-b1a4-42c8-a3a1-f84f3e198de2-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"b1ba2fcd-b1a4-42c8-a3a1-f84f3e198de2\") " pod="openstack/kube-state-metrics-0" Dec 03 14:38:26 crc kubenswrapper[4751]: I1203 14:38:26.773781 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1ba2fcd-b1a4-42c8-a3a1-f84f3e198de2-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"b1ba2fcd-b1a4-42c8-a3a1-f84f3e198de2\") " pod="openstack/kube-state-metrics-0" Dec 03 
14:38:26 crc kubenswrapper[4751]: I1203 14:38:26.773883 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb72m\" (UniqueName: \"kubernetes.io/projected/b1ba2fcd-b1a4-42c8-a3a1-f84f3e198de2-kube-api-access-nb72m\") pod \"kube-state-metrics-0\" (UID: \"b1ba2fcd-b1a4-42c8-a3a1-f84f3e198de2\") " pod="openstack/kube-state-metrics-0" Dec 03 14:38:26 crc kubenswrapper[4751]: I1203 14:38:26.799137 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 03 14:38:26 crc kubenswrapper[4751]: I1203 14:38:26.877599 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1ba2fcd-b1a4-42c8-a3a1-f84f3e198de2-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"b1ba2fcd-b1a4-42c8-a3a1-f84f3e198de2\") " pod="openstack/kube-state-metrics-0" Dec 03 14:38:26 crc kubenswrapper[4751]: I1203 14:38:26.877770 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b1ba2fcd-b1a4-42c8-a3a1-f84f3e198de2-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"b1ba2fcd-b1a4-42c8-a3a1-f84f3e198de2\") " pod="openstack/kube-state-metrics-0" Dec 03 14:38:26 crc kubenswrapper[4751]: I1203 14:38:26.877800 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1ba2fcd-b1a4-42c8-a3a1-f84f3e198de2-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"b1ba2fcd-b1a4-42c8-a3a1-f84f3e198de2\") " pod="openstack/kube-state-metrics-0" Dec 03 14:38:26 crc kubenswrapper[4751]: I1203 14:38:26.877912 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb72m\" (UniqueName: 
\"kubernetes.io/projected/b1ba2fcd-b1a4-42c8-a3a1-f84f3e198de2-kube-api-access-nb72m\") pod \"kube-state-metrics-0\" (UID: \"b1ba2fcd-b1a4-42c8-a3a1-f84f3e198de2\") " pod="openstack/kube-state-metrics-0" Dec 03 14:38:26 crc kubenswrapper[4751]: I1203 14:38:26.888032 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1ba2fcd-b1a4-42c8-a3a1-f84f3e198de2-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"b1ba2fcd-b1a4-42c8-a3a1-f84f3e198de2\") " pod="openstack/kube-state-metrics-0" Dec 03 14:38:26 crc kubenswrapper[4751]: I1203 14:38:26.888671 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1ba2fcd-b1a4-42c8-a3a1-f84f3e198de2-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"b1ba2fcd-b1a4-42c8-a3a1-f84f3e198de2\") " pod="openstack/kube-state-metrics-0" Dec 03 14:38:26 crc kubenswrapper[4751]: I1203 14:38:26.898760 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b1ba2fcd-b1a4-42c8-a3a1-f84f3e198de2-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"b1ba2fcd-b1a4-42c8-a3a1-f84f3e198de2\") " pod="openstack/kube-state-metrics-0" Dec 03 14:38:26 crc kubenswrapper[4751]: I1203 14:38:26.909887 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb72m\" (UniqueName: \"kubernetes.io/projected/b1ba2fcd-b1a4-42c8-a3a1-f84f3e198de2-kube-api-access-nb72m\") pod \"kube-state-metrics-0\" (UID: \"b1ba2fcd-b1a4-42c8-a3a1-f84f3e198de2\") " pod="openstack/kube-state-metrics-0" Dec 03 14:38:26 crc kubenswrapper[4751]: I1203 14:38:26.992740 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 14:38:27 crc kubenswrapper[4751]: I1203 14:38:27.351745 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80f132c3-7b27-4d3d-950e-9c6aa887b6a7" path="/var/lib/kubelet/pods/80f132c3-7b27-4d3d-950e-9c6aa887b6a7/volumes" Dec 03 14:38:27 crc kubenswrapper[4751]: I1203 14:38:27.623002 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 14:38:27 crc kubenswrapper[4751]: W1203 14:38:27.627789 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1ba2fcd_b1a4_42c8_a3a1_f84f3e198de2.slice/crio-5c4d1bbb6cf554a13f89becfc7f8e22863d77c3c20c6f981f0c24cef0cad77dd WatchSource:0}: Error finding container 5c4d1bbb6cf554a13f89becfc7f8e22863d77c3c20c6f981f0c24cef0cad77dd: Status 404 returned error can't find the container with id 5c4d1bbb6cf554a13f89becfc7f8e22863d77c3c20c6f981f0c24cef0cad77dd Dec 03 14:38:27 crc kubenswrapper[4751]: I1203 14:38:27.712023 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:38:27 crc kubenswrapper[4751]: I1203 14:38:27.712374 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a9f00240-53f7-417a-9403-112cb396c30f" containerName="ceilometer-central-agent" containerID="cri-o://59b012f07e2d23c2347b9c40b118fbf7ee292515a7834146b402892452c0f290" gracePeriod=30 Dec 03 14:38:27 crc kubenswrapper[4751]: I1203 14:38:27.712479 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a9f00240-53f7-417a-9403-112cb396c30f" containerName="ceilometer-notification-agent" containerID="cri-o://f82b1ad2e53c84f607abc2d7b971da8c649a376bb7eff688caf503be547a9562" gracePeriod=30 Dec 03 14:38:27 crc kubenswrapper[4751]: I1203 14:38:27.712533 4751 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/ceilometer-0" podUID="a9f00240-53f7-417a-9403-112cb396c30f" containerName="proxy-httpd" containerID="cri-o://e18f4b082bed69fa32bb02d4bb5ba672a38457262e78969ced37986923174f40" gracePeriod=30 Dec 03 14:38:27 crc kubenswrapper[4751]: I1203 14:38:27.712463 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a9f00240-53f7-417a-9403-112cb396c30f" containerName="sg-core" containerID="cri-o://e80516920620e85189f41b0187e811d4afb78d0b1069cc32925dff381ea420e2" gracePeriod=30 Dec 03 14:38:27 crc kubenswrapper[4751]: I1203 14:38:27.788566 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 14:38:27 crc kubenswrapper[4751]: I1203 14:38:27.789358 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 14:38:28 crc kubenswrapper[4751]: I1203 14:38:28.613798 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b1ba2fcd-b1a4-42c8-a3a1-f84f3e198de2","Type":"ContainerStarted","Data":"5c4d1bbb6cf554a13f89becfc7f8e22863d77c3c20c6f981f0c24cef0cad77dd"} Dec 03 14:38:28 crc kubenswrapper[4751]: I1203 14:38:28.616006 4751 generic.go:334] "Generic (PLEG): container finished" podID="a9f00240-53f7-417a-9403-112cb396c30f" containerID="e18f4b082bed69fa32bb02d4bb5ba672a38457262e78969ced37986923174f40" exitCode=0 Dec 03 14:38:28 crc kubenswrapper[4751]: I1203 14:38:28.616030 4751 generic.go:334] "Generic (PLEG): container finished" podID="a9f00240-53f7-417a-9403-112cb396c30f" containerID="e80516920620e85189f41b0187e811d4afb78d0b1069cc32925dff381ea420e2" exitCode=2 Dec 03 14:38:28 crc kubenswrapper[4751]: I1203 14:38:28.617010 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a9f00240-53f7-417a-9403-112cb396c30f","Type":"ContainerDied","Data":"e18f4b082bed69fa32bb02d4bb5ba672a38457262e78969ced37986923174f40"} Dec 03 14:38:28 crc kubenswrapper[4751]: I1203 14:38:28.617036 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9f00240-53f7-417a-9403-112cb396c30f","Type":"ContainerDied","Data":"e80516920620e85189f41b0187e811d4afb78d0b1069cc32925dff381ea420e2"} Dec 03 14:38:28 crc kubenswrapper[4751]: I1203 14:38:28.800535 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0a0018be-7a12-4006-ae3c-0d5b60837a95" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.221:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 14:38:28 crc kubenswrapper[4751]: I1203 14:38:28.800561 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0a0018be-7a12-4006-ae3c-0d5b60837a95" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.221:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 14:38:28 crc kubenswrapper[4751]: I1203 14:38:28.853147 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 14:38:28 crc kubenswrapper[4751]: I1203 14:38:28.900129 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 03 14:38:29 crc kubenswrapper[4751]: I1203 14:38:29.663854 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 14:38:31 crc kubenswrapper[4751]: I1203 14:38:31.906936 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 14:38:31 crc kubenswrapper[4751]: I1203 14:38:31.907307 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-api-0" Dec 03 14:38:32 crc kubenswrapper[4751]: I1203 14:38:32.990534 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c4c34aee-0ee0-45f5-b2ae-b87b966366ea" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.223:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 14:38:32 crc kubenswrapper[4751]: I1203 14:38:32.990639 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c4c34aee-0ee0-45f5-b2ae-b87b966366ea" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.223:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 14:38:33 crc kubenswrapper[4751]: I1203 14:38:33.779039 4751 generic.go:334] "Generic (PLEG): container finished" podID="a9f00240-53f7-417a-9403-112cb396c30f" containerID="59b012f07e2d23c2347b9c40b118fbf7ee292515a7834146b402892452c0f290" exitCode=0 Dec 03 14:38:33 crc kubenswrapper[4751]: I1203 14:38:33.779087 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9f00240-53f7-417a-9403-112cb396c30f","Type":"ContainerDied","Data":"59b012f07e2d23c2347b9c40b118fbf7ee292515a7834146b402892452c0f290"} Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.712283 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.740027 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9f00240-53f7-417a-9403-112cb396c30f-scripts\") pod \"a9f00240-53f7-417a-9403-112cb396c30f\" (UID: \"a9f00240-53f7-417a-9403-112cb396c30f\") " Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.740211 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9f00240-53f7-417a-9403-112cb396c30f-config-data\") pod \"a9f00240-53f7-417a-9403-112cb396c30f\" (UID: \"a9f00240-53f7-417a-9403-112cb396c30f\") " Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.740268 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9f00240-53f7-417a-9403-112cb396c30f-log-httpd\") pod \"a9f00240-53f7-417a-9403-112cb396c30f\" (UID: \"a9f00240-53f7-417a-9403-112cb396c30f\") " Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.740285 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a9f00240-53f7-417a-9403-112cb396c30f-sg-core-conf-yaml\") pod \"a9f00240-53f7-417a-9403-112cb396c30f\" (UID: \"a9f00240-53f7-417a-9403-112cb396c30f\") " Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.740305 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9f00240-53f7-417a-9403-112cb396c30f-run-httpd\") pod \"a9f00240-53f7-417a-9403-112cb396c30f\" (UID: \"a9f00240-53f7-417a-9403-112cb396c30f\") " Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.740382 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lj5cc\" (UniqueName: 
\"kubernetes.io/projected/a9f00240-53f7-417a-9403-112cb396c30f-kube-api-access-lj5cc\") pod \"a9f00240-53f7-417a-9403-112cb396c30f\" (UID: \"a9f00240-53f7-417a-9403-112cb396c30f\") " Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.740449 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f00240-53f7-417a-9403-112cb396c30f-combined-ca-bundle\") pod \"a9f00240-53f7-417a-9403-112cb396c30f\" (UID: \"a9f00240-53f7-417a-9403-112cb396c30f\") " Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.740884 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9f00240-53f7-417a-9403-112cb396c30f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a9f00240-53f7-417a-9403-112cb396c30f" (UID: "a9f00240-53f7-417a-9403-112cb396c30f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.741072 4751 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9f00240-53f7-417a-9403-112cb396c30f-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.741458 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9f00240-53f7-417a-9403-112cb396c30f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a9f00240-53f7-417a-9403-112cb396c30f" (UID: "a9f00240-53f7-417a-9403-112cb396c30f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.746151 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f00240-53f7-417a-9403-112cb396c30f-scripts" (OuterVolumeSpecName: "scripts") pod "a9f00240-53f7-417a-9403-112cb396c30f" (UID: "a9f00240-53f7-417a-9403-112cb396c30f"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.751746 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9f00240-53f7-417a-9403-112cb396c30f-kube-api-access-lj5cc" (OuterVolumeSpecName: "kube-api-access-lj5cc") pod "a9f00240-53f7-417a-9403-112cb396c30f" (UID: "a9f00240-53f7-417a-9403-112cb396c30f"). InnerVolumeSpecName "kube-api-access-lj5cc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.779143 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f00240-53f7-417a-9403-112cb396c30f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a9f00240-53f7-417a-9403-112cb396c30f" (UID: "a9f00240-53f7-417a-9403-112cb396c30f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.797882 4751 generic.go:334] "Generic (PLEG): container finished" podID="a9f00240-53f7-417a-9403-112cb396c30f" containerID="f82b1ad2e53c84f607abc2d7b971da8c649a376bb7eff688caf503be547a9562" exitCode=0 Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.797923 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9f00240-53f7-417a-9403-112cb396c30f","Type":"ContainerDied","Data":"f82b1ad2e53c84f607abc2d7b971da8c649a376bb7eff688caf503be547a9562"} Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.797947 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9f00240-53f7-417a-9403-112cb396c30f","Type":"ContainerDied","Data":"c9e74ff95a2e2aebc531245f00b55d46376011c5dbfc60a40394eb657fc82328"} Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.797963 4751 scope.go:117] "RemoveContainer" 
containerID="e18f4b082bed69fa32bb02d4bb5ba672a38457262e78969ced37986923174f40" Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.798007 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.831188 4751 scope.go:117] "RemoveContainer" containerID="e80516920620e85189f41b0187e811d4afb78d0b1069cc32925dff381ea420e2" Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.843585 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9f00240-53f7-417a-9403-112cb396c30f-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.843616 4751 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a9f00240-53f7-417a-9403-112cb396c30f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.843626 4751 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9f00240-53f7-417a-9403-112cb396c30f-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.843634 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lj5cc\" (UniqueName: \"kubernetes.io/projected/a9f00240-53f7-417a-9403-112cb396c30f-kube-api-access-lj5cc\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.871393 4751 scope.go:117] "RemoveContainer" containerID="f82b1ad2e53c84f607abc2d7b971da8c649a376bb7eff688caf503be547a9562" Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.891543 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f00240-53f7-417a-9403-112cb396c30f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9f00240-53f7-417a-9403-112cb396c30f" (UID: 
"a9f00240-53f7-417a-9403-112cb396c30f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.907587 4751 scope.go:117] "RemoveContainer" containerID="59b012f07e2d23c2347b9c40b118fbf7ee292515a7834146b402892452c0f290" Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.926425 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f00240-53f7-417a-9403-112cb396c30f-config-data" (OuterVolumeSpecName: "config-data") pod "a9f00240-53f7-417a-9403-112cb396c30f" (UID: "a9f00240-53f7-417a-9403-112cb396c30f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.945451 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9f00240-53f7-417a-9403-112cb396c30f-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.945491 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f00240-53f7-417a-9403-112cb396c30f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.957109 4751 scope.go:117] "RemoveContainer" containerID="e18f4b082bed69fa32bb02d4bb5ba672a38457262e78969ced37986923174f40" Dec 03 14:38:34 crc kubenswrapper[4751]: E1203 14:38:34.958134 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e18f4b082bed69fa32bb02d4bb5ba672a38457262e78969ced37986923174f40\": container with ID starting with e18f4b082bed69fa32bb02d4bb5ba672a38457262e78969ced37986923174f40 not found: ID does not exist" containerID="e18f4b082bed69fa32bb02d4bb5ba672a38457262e78969ced37986923174f40" Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.958207 4751 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e18f4b082bed69fa32bb02d4bb5ba672a38457262e78969ced37986923174f40"} err="failed to get container status \"e18f4b082bed69fa32bb02d4bb5ba672a38457262e78969ced37986923174f40\": rpc error: code = NotFound desc = could not find container \"e18f4b082bed69fa32bb02d4bb5ba672a38457262e78969ced37986923174f40\": container with ID starting with e18f4b082bed69fa32bb02d4bb5ba672a38457262e78969ced37986923174f40 not found: ID does not exist" Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.958261 4751 scope.go:117] "RemoveContainer" containerID="e80516920620e85189f41b0187e811d4afb78d0b1069cc32925dff381ea420e2" Dec 03 14:38:34 crc kubenswrapper[4751]: E1203 14:38:34.958926 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e80516920620e85189f41b0187e811d4afb78d0b1069cc32925dff381ea420e2\": container with ID starting with e80516920620e85189f41b0187e811d4afb78d0b1069cc32925dff381ea420e2 not found: ID does not exist" containerID="e80516920620e85189f41b0187e811d4afb78d0b1069cc32925dff381ea420e2" Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.958966 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e80516920620e85189f41b0187e811d4afb78d0b1069cc32925dff381ea420e2"} err="failed to get container status \"e80516920620e85189f41b0187e811d4afb78d0b1069cc32925dff381ea420e2\": rpc error: code = NotFound desc = could not find container \"e80516920620e85189f41b0187e811d4afb78d0b1069cc32925dff381ea420e2\": container with ID starting with e80516920620e85189f41b0187e811d4afb78d0b1069cc32925dff381ea420e2 not found: ID does not exist" Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.958993 4751 scope.go:117] "RemoveContainer" containerID="f82b1ad2e53c84f607abc2d7b971da8c649a376bb7eff688caf503be547a9562" Dec 03 14:38:34 crc kubenswrapper[4751]: E1203 
14:38:34.961718 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f82b1ad2e53c84f607abc2d7b971da8c649a376bb7eff688caf503be547a9562\": container with ID starting with f82b1ad2e53c84f607abc2d7b971da8c649a376bb7eff688caf503be547a9562 not found: ID does not exist" containerID="f82b1ad2e53c84f607abc2d7b971da8c649a376bb7eff688caf503be547a9562" Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.961801 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f82b1ad2e53c84f607abc2d7b971da8c649a376bb7eff688caf503be547a9562"} err="failed to get container status \"f82b1ad2e53c84f607abc2d7b971da8c649a376bb7eff688caf503be547a9562\": rpc error: code = NotFound desc = could not find container \"f82b1ad2e53c84f607abc2d7b971da8c649a376bb7eff688caf503be547a9562\": container with ID starting with f82b1ad2e53c84f607abc2d7b971da8c649a376bb7eff688caf503be547a9562 not found: ID does not exist" Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.961866 4751 scope.go:117] "RemoveContainer" containerID="59b012f07e2d23c2347b9c40b118fbf7ee292515a7834146b402892452c0f290" Dec 03 14:38:34 crc kubenswrapper[4751]: E1203 14:38:34.963047 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59b012f07e2d23c2347b9c40b118fbf7ee292515a7834146b402892452c0f290\": container with ID starting with 59b012f07e2d23c2347b9c40b118fbf7ee292515a7834146b402892452c0f290 not found: ID does not exist" containerID="59b012f07e2d23c2347b9c40b118fbf7ee292515a7834146b402892452c0f290" Dec 03 14:38:34 crc kubenswrapper[4751]: I1203 14:38:34.963090 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59b012f07e2d23c2347b9c40b118fbf7ee292515a7834146b402892452c0f290"} err="failed to get container status \"59b012f07e2d23c2347b9c40b118fbf7ee292515a7834146b402892452c0f290\": rpc 
error: code = NotFound desc = could not find container \"59b012f07e2d23c2347b9c40b118fbf7ee292515a7834146b402892452c0f290\": container with ID starting with 59b012f07e2d23c2347b9c40b118fbf7ee292515a7834146b402892452c0f290 not found: ID does not exist" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.136312 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.146988 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.167195 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:38:35 crc kubenswrapper[4751]: E1203 14:38:35.167796 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f00240-53f7-417a-9403-112cb396c30f" containerName="proxy-httpd" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.167839 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f00240-53f7-417a-9403-112cb396c30f" containerName="proxy-httpd" Dec 03 14:38:35 crc kubenswrapper[4751]: E1203 14:38:35.167862 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f00240-53f7-417a-9403-112cb396c30f" containerName="ceilometer-notification-agent" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.167873 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f00240-53f7-417a-9403-112cb396c30f" containerName="ceilometer-notification-agent" Dec 03 14:38:35 crc kubenswrapper[4751]: E1203 14:38:35.167895 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f00240-53f7-417a-9403-112cb396c30f" containerName="ceilometer-central-agent" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.167905 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f00240-53f7-417a-9403-112cb396c30f" containerName="ceilometer-central-agent" Dec 03 14:38:35 crc kubenswrapper[4751]: E1203 14:38:35.167932 4751 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f00240-53f7-417a-9403-112cb396c30f" containerName="sg-core" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.167939 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f00240-53f7-417a-9403-112cb396c30f" containerName="sg-core" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.168179 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9f00240-53f7-417a-9403-112cb396c30f" containerName="ceilometer-notification-agent" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.168209 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9f00240-53f7-417a-9403-112cb396c30f" containerName="sg-core" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.168228 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9f00240-53f7-417a-9403-112cb396c30f" containerName="ceilometer-central-agent" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.168249 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9f00240-53f7-417a-9403-112cb396c30f" containerName="proxy-httpd" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.170411 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.172715 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.173895 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.174518 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.181520 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.252130 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-scripts\") pod \"ceilometer-0\" (UID: \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\") " pod="openstack/ceilometer-0" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.252174 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\") " pod="openstack/ceilometer-0" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.252230 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-config-data\") pod \"ceilometer-0\" (UID: \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\") " pod="openstack/ceilometer-0" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.252377 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-run-httpd\") pod \"ceilometer-0\" (UID: \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\") " pod="openstack/ceilometer-0" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.252543 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-log-httpd\") pod \"ceilometer-0\" (UID: \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\") " pod="openstack/ceilometer-0" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.252604 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb6wt\" (UniqueName: \"kubernetes.io/projected/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-kube-api-access-zb6wt\") pod \"ceilometer-0\" (UID: \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\") " pod="openstack/ceilometer-0" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.252629 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\") " pod="openstack/ceilometer-0" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.252723 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\") " pod="openstack/ceilometer-0" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.326026 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9f00240-53f7-417a-9403-112cb396c30f" path="/var/lib/kubelet/pods/a9f00240-53f7-417a-9403-112cb396c30f/volumes" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 
14:38:35.354831 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-scripts\") pod \"ceilometer-0\" (UID: \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\") " pod="openstack/ceilometer-0" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.354902 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\") " pod="openstack/ceilometer-0" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.354993 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-config-data\") pod \"ceilometer-0\" (UID: \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\") " pod="openstack/ceilometer-0" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.355062 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-run-httpd\") pod \"ceilometer-0\" (UID: \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\") " pod="openstack/ceilometer-0" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.355146 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-log-httpd\") pod \"ceilometer-0\" (UID: \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\") " pod="openstack/ceilometer-0" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.355175 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb6wt\" (UniqueName: \"kubernetes.io/projected/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-kube-api-access-zb6wt\") pod \"ceilometer-0\" (UID: 
\"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\") " pod="openstack/ceilometer-0" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.355196 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\") " pod="openstack/ceilometer-0" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.355240 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\") " pod="openstack/ceilometer-0" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.356570 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-log-httpd\") pod \"ceilometer-0\" (UID: \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\") " pod="openstack/ceilometer-0" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.357088 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-run-httpd\") pod \"ceilometer-0\" (UID: \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\") " pod="openstack/ceilometer-0" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.360405 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\") " pod="openstack/ceilometer-0" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.360995 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-config-data\") pod \"ceilometer-0\" (UID: \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\") " pod="openstack/ceilometer-0" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.361306 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\") " pod="openstack/ceilometer-0" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.361474 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-scripts\") pod \"ceilometer-0\" (UID: \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\") " pod="openstack/ceilometer-0" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.362869 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\") " pod="openstack/ceilometer-0" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.373754 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb6wt\" (UniqueName: \"kubernetes.io/projected/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-kube-api-access-zb6wt\") pod \"ceilometer-0\" (UID: \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\") " pod="openstack/ceilometer-0" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.497173 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.837170 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b1ba2fcd-b1a4-42c8-a3a1-f84f3e198de2","Type":"ContainerStarted","Data":"c2db9a5aea35bc01ffcfb148ca1630e91b52e15343f75baef0d527607442d514"} Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.837474 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 03 14:38:35 crc kubenswrapper[4751]: I1203 14:38:35.869651 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.190346814 podStartE2EDuration="9.869628403s" podCreationTimestamp="2025-12-03 14:38:26 +0000 UTC" firstStartedPulling="2025-12-03 14:38:27.631152759 +0000 UTC m=+1514.619507976" lastFinishedPulling="2025-12-03 14:38:34.310434348 +0000 UTC m=+1521.298789565" observedRunningTime="2025-12-03 14:38:35.857321077 +0000 UTC m=+1522.845676294" watchObservedRunningTime="2025-12-03 14:38:35.869628403 +0000 UTC m=+1522.857983620" Dec 03 14:38:36 crc kubenswrapper[4751]: I1203 14:38:36.245605 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:38:36 crc kubenswrapper[4751]: W1203 14:38:36.254608 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a05dd80_a31a_4a74_8deb_5b4c9b83a753.slice/crio-2e4275c0fbb9e5162896eca47631a046b03ad2651f6e79eaa938940c0a5b0ea5 WatchSource:0}: Error finding container 2e4275c0fbb9e5162896eca47631a046b03ad2651f6e79eaa938940c0a5b0ea5: Status 404 returned error can't find the container with id 2e4275c0fbb9e5162896eca47631a046b03ad2651f6e79eaa938940c0a5b0ea5 Dec 03 14:38:36 crc kubenswrapper[4751]: I1203 14:38:36.848063 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"2a05dd80-a31a-4a74-8deb-5b4c9b83a753","Type":"ContainerStarted","Data":"2e4275c0fbb9e5162896eca47631a046b03ad2651f6e79eaa938940c0a5b0ea5"} Dec 03 14:38:37 crc kubenswrapper[4751]: I1203 14:38:37.804855 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 14:38:37 crc kubenswrapper[4751]: I1203 14:38:37.814664 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 14:38:37 crc kubenswrapper[4751]: I1203 14:38:37.815035 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 14:38:37 crc kubenswrapper[4751]: I1203 14:38:37.859984 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a05dd80-a31a-4a74-8deb-5b4c9b83a753","Type":"ContainerStarted","Data":"cb7f0c4f04b98f1d94e35bcc9baa565f7f1bfaf8c7319e2f821a1ce622a4c550"} Dec 03 14:38:37 crc kubenswrapper[4751]: I1203 14:38:37.860044 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a05dd80-a31a-4a74-8deb-5b4c9b83a753","Type":"ContainerStarted","Data":"d4f1e3d33d753a3a77a8c3b2a3e6af80d3815294b13d09bb9269c415804a0819"} Dec 03 14:38:37 crc kubenswrapper[4751]: I1203 14:38:37.869576 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 14:38:38 crc kubenswrapper[4751]: I1203 14:38:38.887781 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a05dd80-a31a-4a74-8deb-5b4c9b83a753","Type":"ContainerStarted","Data":"3b16df004703b8d288b5b0ef0fcb4fab7dfa7538917de242409744be5acd3229"} Dec 03 14:38:39 crc kubenswrapper[4751]: I1203 14:38:39.908828 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:39 crc kubenswrapper[4751]: I1203 14:38:39.914181 4751 generic.go:334] "Generic (PLEG): container finished" podID="2cff73d8-de68-4c3c-9784-15f5ac91acae" containerID="e88e7315f3ff98ccb6ed19c00cefc421079d511b3d2d93213c0529db0ab18883" exitCode=137 Dec 03 14:38:39 crc kubenswrapper[4751]: I1203 14:38:39.914246 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:39 crc kubenswrapper[4751]: I1203 14:38:39.914336 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2cff73d8-de68-4c3c-9784-15f5ac91acae","Type":"ContainerDied","Data":"e88e7315f3ff98ccb6ed19c00cefc421079d511b3d2d93213c0529db0ab18883"} Dec 03 14:38:39 crc kubenswrapper[4751]: I1203 14:38:39.914371 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2cff73d8-de68-4c3c-9784-15f5ac91acae","Type":"ContainerDied","Data":"cec2e953087a184ce1c1099a530e60847655fb52a2cae5f8caebd10a4b253728"} Dec 03 14:38:39 crc kubenswrapper[4751]: I1203 14:38:39.914532 4751 scope.go:117] "RemoveContainer" containerID="e88e7315f3ff98ccb6ed19c00cefc421079d511b3d2d93213c0529db0ab18883" Dec 03 14:38:39 crc kubenswrapper[4751]: I1203 14:38:39.974806 4751 scope.go:117] "RemoveContainer" containerID="e88e7315f3ff98ccb6ed19c00cefc421079d511b3d2d93213c0529db0ab18883" Dec 03 14:38:39 crc kubenswrapper[4751]: E1203 14:38:39.975869 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e88e7315f3ff98ccb6ed19c00cefc421079d511b3d2d93213c0529db0ab18883\": container with ID starting with e88e7315f3ff98ccb6ed19c00cefc421079d511b3d2d93213c0529db0ab18883 not found: ID does not exist" containerID="e88e7315f3ff98ccb6ed19c00cefc421079d511b3d2d93213c0529db0ab18883" Dec 03 14:38:39 crc kubenswrapper[4751]: I1203 
14:38:39.975903 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e88e7315f3ff98ccb6ed19c00cefc421079d511b3d2d93213c0529db0ab18883"} err="failed to get container status \"e88e7315f3ff98ccb6ed19c00cefc421079d511b3d2d93213c0529db0ab18883\": rpc error: code = NotFound desc = could not find container \"e88e7315f3ff98ccb6ed19c00cefc421079d511b3d2d93213c0529db0ab18883\": container with ID starting with e88e7315f3ff98ccb6ed19c00cefc421079d511b3d2d93213c0529db0ab18883 not found: ID does not exist" Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.085989 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cff73d8-de68-4c3c-9784-15f5ac91acae-config-data\") pod \"2cff73d8-de68-4c3c-9784-15f5ac91acae\" (UID: \"2cff73d8-de68-4c3c-9784-15f5ac91acae\") " Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.086609 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cff73d8-de68-4c3c-9784-15f5ac91acae-combined-ca-bundle\") pod \"2cff73d8-de68-4c3c-9784-15f5ac91acae\" (UID: \"2cff73d8-de68-4c3c-9784-15f5ac91acae\") " Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.086747 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwlqx\" (UniqueName: \"kubernetes.io/projected/2cff73d8-de68-4c3c-9784-15f5ac91acae-kube-api-access-hwlqx\") pod \"2cff73d8-de68-4c3c-9784-15f5ac91acae\" (UID: \"2cff73d8-de68-4c3c-9784-15f5ac91acae\") " Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.091269 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cff73d8-de68-4c3c-9784-15f5ac91acae-kube-api-access-hwlqx" (OuterVolumeSpecName: "kube-api-access-hwlqx") pod "2cff73d8-de68-4c3c-9784-15f5ac91acae" (UID: "2cff73d8-de68-4c3c-9784-15f5ac91acae"). 
InnerVolumeSpecName "kube-api-access-hwlqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.118954 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cff73d8-de68-4c3c-9784-15f5ac91acae-config-data" (OuterVolumeSpecName: "config-data") pod "2cff73d8-de68-4c3c-9784-15f5ac91acae" (UID: "2cff73d8-de68-4c3c-9784-15f5ac91acae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.121548 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cff73d8-de68-4c3c-9784-15f5ac91acae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2cff73d8-de68-4c3c-9784-15f5ac91acae" (UID: "2cff73d8-de68-4c3c-9784-15f5ac91acae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.189824 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwlqx\" (UniqueName: \"kubernetes.io/projected/2cff73d8-de68-4c3c-9784-15f5ac91acae-kube-api-access-hwlqx\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.189861 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cff73d8-de68-4c3c-9784-15f5ac91acae-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.189870 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cff73d8-de68-4c3c-9784-15f5ac91acae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.256485 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 
14:38:40.268162 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.279159 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 14:38:40 crc kubenswrapper[4751]: E1203 14:38:40.279599 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cff73d8-de68-4c3c-9784-15f5ac91acae" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.279618 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cff73d8-de68-4c3c-9784-15f5ac91acae" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.279819 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cff73d8-de68-4c3c-9784-15f5ac91acae" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.280543 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.282478 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.282717 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.282862 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.290007 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.392768 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa35ea33-1dc0-4569-9052-36e722f491c1-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fa35ea33-1dc0-4569-9052-36e722f491c1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.392809 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl5ds\" (UniqueName: \"kubernetes.io/projected/fa35ea33-1dc0-4569-9052-36e722f491c1-kube-api-access-dl5ds\") pod \"nova-cell1-novncproxy-0\" (UID: \"fa35ea33-1dc0-4569-9052-36e722f491c1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.393097 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa35ea33-1dc0-4569-9052-36e722f491c1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fa35ea33-1dc0-4569-9052-36e722f491c1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:40 
crc kubenswrapper[4751]: I1203 14:38:40.393174 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa35ea33-1dc0-4569-9052-36e722f491c1-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fa35ea33-1dc0-4569-9052-36e722f491c1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.393264 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa35ea33-1dc0-4569-9052-36e722f491c1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fa35ea33-1dc0-4569-9052-36e722f491c1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.495387 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa35ea33-1dc0-4569-9052-36e722f491c1-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fa35ea33-1dc0-4569-9052-36e722f491c1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.495431 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl5ds\" (UniqueName: \"kubernetes.io/projected/fa35ea33-1dc0-4569-9052-36e722f491c1-kube-api-access-dl5ds\") pod \"nova-cell1-novncproxy-0\" (UID: \"fa35ea33-1dc0-4569-9052-36e722f491c1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.495567 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa35ea33-1dc0-4569-9052-36e722f491c1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fa35ea33-1dc0-4569-9052-36e722f491c1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 
14:38:40.495594 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa35ea33-1dc0-4569-9052-36e722f491c1-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fa35ea33-1dc0-4569-9052-36e722f491c1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.495640 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa35ea33-1dc0-4569-9052-36e722f491c1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fa35ea33-1dc0-4569-9052-36e722f491c1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.503496 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa35ea33-1dc0-4569-9052-36e722f491c1-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fa35ea33-1dc0-4569-9052-36e722f491c1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.503608 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa35ea33-1dc0-4569-9052-36e722f491c1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fa35ea33-1dc0-4569-9052-36e722f491c1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.503892 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa35ea33-1dc0-4569-9052-36e722f491c1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fa35ea33-1dc0-4569-9052-36e722f491c1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.515759 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa35ea33-1dc0-4569-9052-36e722f491c1-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fa35ea33-1dc0-4569-9052-36e722f491c1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.517785 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl5ds\" (UniqueName: \"kubernetes.io/projected/fa35ea33-1dc0-4569-9052-36e722f491c1-kube-api-access-dl5ds\") pod \"nova-cell1-novncproxy-0\" (UID: \"fa35ea33-1dc0-4569-9052-36e722f491c1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.650878 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.944794 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a05dd80-a31a-4a74-8deb-5b4c9b83a753","Type":"ContainerStarted","Data":"1dcb069208003e9fef92144f90dead1f500e574d6c49e8a9f0fcdd197a035ec7"} Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.945230 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 14:38:40 crc kubenswrapper[4751]: I1203 14:38:40.983550 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.545082951 podStartE2EDuration="5.9835266s" podCreationTimestamp="2025-12-03 14:38:35 +0000 UTC" firstStartedPulling="2025-12-03 14:38:36.256983977 +0000 UTC m=+1523.245339194" lastFinishedPulling="2025-12-03 14:38:39.695427636 +0000 UTC m=+1526.683782843" observedRunningTime="2025-12-03 14:38:40.967746881 +0000 UTC m=+1527.956102158" watchObservedRunningTime="2025-12-03 14:38:40.9835266 +0000 UTC m=+1527.971881817" Dec 03 14:38:41 crc kubenswrapper[4751]: I1203 14:38:41.132397 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 14:38:41 crc kubenswrapper[4751]: W1203 14:38:41.143987 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa35ea33_1dc0_4569_9052_36e722f491c1.slice/crio-3adcbb3ef3505d6c9ac45982a66a05755ed6da22ec483c4cf93b72c6f4072170 WatchSource:0}: Error finding container 3adcbb3ef3505d6c9ac45982a66a05755ed6da22ec483c4cf93b72c6f4072170: Status 404 returned error can't find the container with id 3adcbb3ef3505d6c9ac45982a66a05755ed6da22ec483c4cf93b72c6f4072170 Dec 03 14:38:41 crc kubenswrapper[4751]: I1203 14:38:41.326825 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cff73d8-de68-4c3c-9784-15f5ac91acae" path="/var/lib/kubelet/pods/2cff73d8-de68-4c3c-9784-15f5ac91acae/volumes" Dec 03 14:38:41 crc kubenswrapper[4751]: I1203 14:38:41.910931 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 14:38:41 crc kubenswrapper[4751]: I1203 14:38:41.911608 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 14:38:41 crc kubenswrapper[4751]: I1203 14:38:41.913695 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 14:38:41 crc kubenswrapper[4751]: I1203 14:38:41.914102 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 14:38:41 crc kubenswrapper[4751]: I1203 14:38:41.959086 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fa35ea33-1dc0-4569-9052-36e722f491c1","Type":"ContainerStarted","Data":"b803f429d3946ab068900d497e2979f87ec04f0796be774e9692299640e0b84b"} Dec 03 14:38:41 crc kubenswrapper[4751]: I1203 14:38:41.959124 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 14:38:41 crc kubenswrapper[4751]: 
I1203 14:38:41.959136 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fa35ea33-1dc0-4569-9052-36e722f491c1","Type":"ContainerStarted","Data":"3adcbb3ef3505d6c9ac45982a66a05755ed6da22ec483c4cf93b72c6f4072170"} Dec 03 14:38:41 crc kubenswrapper[4751]: I1203 14:38:41.968669 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 14:38:41 crc kubenswrapper[4751]: I1203 14:38:41.985185 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.9851598560000001 podStartE2EDuration="1.985159856s" podCreationTimestamp="2025-12-03 14:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:38:41.981585172 +0000 UTC m=+1528.969940429" watchObservedRunningTime="2025-12-03 14:38:41.985159856 +0000 UTC m=+1528.973515073" Dec 03 14:38:42 crc kubenswrapper[4751]: I1203 14:38:42.244172 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54dd998c-glbss"] Dec 03 14:38:42 crc kubenswrapper[4751]: I1203 14:38:42.246977 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54dd998c-glbss" Dec 03 14:38:42 crc kubenswrapper[4751]: I1203 14:38:42.299905 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54dd998c-glbss"] Dec 03 14:38:42 crc kubenswrapper[4751]: I1203 14:38:42.402607 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8481969b-2092-4b8c-9a57-9f83972d0997-ovsdbserver-nb\") pod \"dnsmasq-dns-54dd998c-glbss\" (UID: \"8481969b-2092-4b8c-9a57-9f83972d0997\") " pod="openstack/dnsmasq-dns-54dd998c-glbss" Dec 03 14:38:42 crc kubenswrapper[4751]: I1203 14:38:42.402698 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzfz6\" (UniqueName: \"kubernetes.io/projected/8481969b-2092-4b8c-9a57-9f83972d0997-kube-api-access-mzfz6\") pod \"dnsmasq-dns-54dd998c-glbss\" (UID: \"8481969b-2092-4b8c-9a57-9f83972d0997\") " pod="openstack/dnsmasq-dns-54dd998c-glbss" Dec 03 14:38:42 crc kubenswrapper[4751]: I1203 14:38:42.402744 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8481969b-2092-4b8c-9a57-9f83972d0997-dns-swift-storage-0\") pod \"dnsmasq-dns-54dd998c-glbss\" (UID: \"8481969b-2092-4b8c-9a57-9f83972d0997\") " pod="openstack/dnsmasq-dns-54dd998c-glbss" Dec 03 14:38:42 crc kubenswrapper[4751]: I1203 14:38:42.402770 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8481969b-2092-4b8c-9a57-9f83972d0997-ovsdbserver-sb\") pod \"dnsmasq-dns-54dd998c-glbss\" (UID: \"8481969b-2092-4b8c-9a57-9f83972d0997\") " pod="openstack/dnsmasq-dns-54dd998c-glbss" Dec 03 14:38:42 crc kubenswrapper[4751]: I1203 14:38:42.402805 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8481969b-2092-4b8c-9a57-9f83972d0997-dns-svc\") pod \"dnsmasq-dns-54dd998c-glbss\" (UID: \"8481969b-2092-4b8c-9a57-9f83972d0997\") " pod="openstack/dnsmasq-dns-54dd998c-glbss" Dec 03 14:38:42 crc kubenswrapper[4751]: I1203 14:38:42.402844 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8481969b-2092-4b8c-9a57-9f83972d0997-config\") pod \"dnsmasq-dns-54dd998c-glbss\" (UID: \"8481969b-2092-4b8c-9a57-9f83972d0997\") " pod="openstack/dnsmasq-dns-54dd998c-glbss" Dec 03 14:38:42 crc kubenswrapper[4751]: I1203 14:38:42.505068 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8481969b-2092-4b8c-9a57-9f83972d0997-config\") pod \"dnsmasq-dns-54dd998c-glbss\" (UID: \"8481969b-2092-4b8c-9a57-9f83972d0997\") " pod="openstack/dnsmasq-dns-54dd998c-glbss" Dec 03 14:38:42 crc kubenswrapper[4751]: I1203 14:38:42.505171 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8481969b-2092-4b8c-9a57-9f83972d0997-ovsdbserver-nb\") pod \"dnsmasq-dns-54dd998c-glbss\" (UID: \"8481969b-2092-4b8c-9a57-9f83972d0997\") " pod="openstack/dnsmasq-dns-54dd998c-glbss" Dec 03 14:38:42 crc kubenswrapper[4751]: I1203 14:38:42.505303 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzfz6\" (UniqueName: \"kubernetes.io/projected/8481969b-2092-4b8c-9a57-9f83972d0997-kube-api-access-mzfz6\") pod \"dnsmasq-dns-54dd998c-glbss\" (UID: \"8481969b-2092-4b8c-9a57-9f83972d0997\") " pod="openstack/dnsmasq-dns-54dd998c-glbss" Dec 03 14:38:42 crc kubenswrapper[4751]: I1203 14:38:42.505409 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8481969b-2092-4b8c-9a57-9f83972d0997-dns-swift-storage-0\") pod \"dnsmasq-dns-54dd998c-glbss\" (UID: \"8481969b-2092-4b8c-9a57-9f83972d0997\") " pod="openstack/dnsmasq-dns-54dd998c-glbss" Dec 03 14:38:42 crc kubenswrapper[4751]: I1203 14:38:42.505442 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8481969b-2092-4b8c-9a57-9f83972d0997-ovsdbserver-sb\") pod \"dnsmasq-dns-54dd998c-glbss\" (UID: \"8481969b-2092-4b8c-9a57-9f83972d0997\") " pod="openstack/dnsmasq-dns-54dd998c-glbss" Dec 03 14:38:42 crc kubenswrapper[4751]: I1203 14:38:42.505486 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8481969b-2092-4b8c-9a57-9f83972d0997-dns-svc\") pod \"dnsmasq-dns-54dd998c-glbss\" (UID: \"8481969b-2092-4b8c-9a57-9f83972d0997\") " pod="openstack/dnsmasq-dns-54dd998c-glbss" Dec 03 14:38:42 crc kubenswrapper[4751]: I1203 14:38:42.506802 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8481969b-2092-4b8c-9a57-9f83972d0997-config\") pod \"dnsmasq-dns-54dd998c-glbss\" (UID: \"8481969b-2092-4b8c-9a57-9f83972d0997\") " pod="openstack/dnsmasq-dns-54dd998c-glbss" Dec 03 14:38:42 crc kubenswrapper[4751]: I1203 14:38:42.507687 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8481969b-2092-4b8c-9a57-9f83972d0997-ovsdbserver-nb\") pod \"dnsmasq-dns-54dd998c-glbss\" (UID: \"8481969b-2092-4b8c-9a57-9f83972d0997\") " pod="openstack/dnsmasq-dns-54dd998c-glbss" Dec 03 14:38:42 crc kubenswrapper[4751]: I1203 14:38:42.510640 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8481969b-2092-4b8c-9a57-9f83972d0997-ovsdbserver-sb\") pod 
\"dnsmasq-dns-54dd998c-glbss\" (UID: \"8481969b-2092-4b8c-9a57-9f83972d0997\") " pod="openstack/dnsmasq-dns-54dd998c-glbss" Dec 03 14:38:42 crc kubenswrapper[4751]: I1203 14:38:42.511136 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8481969b-2092-4b8c-9a57-9f83972d0997-dns-swift-storage-0\") pod \"dnsmasq-dns-54dd998c-glbss\" (UID: \"8481969b-2092-4b8c-9a57-9f83972d0997\") " pod="openstack/dnsmasq-dns-54dd998c-glbss" Dec 03 14:38:42 crc kubenswrapper[4751]: I1203 14:38:42.511277 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8481969b-2092-4b8c-9a57-9f83972d0997-dns-svc\") pod \"dnsmasq-dns-54dd998c-glbss\" (UID: \"8481969b-2092-4b8c-9a57-9f83972d0997\") " pod="openstack/dnsmasq-dns-54dd998c-glbss" Dec 03 14:38:42 crc kubenswrapper[4751]: I1203 14:38:42.527684 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzfz6\" (UniqueName: \"kubernetes.io/projected/8481969b-2092-4b8c-9a57-9f83972d0997-kube-api-access-mzfz6\") pod \"dnsmasq-dns-54dd998c-glbss\" (UID: \"8481969b-2092-4b8c-9a57-9f83972d0997\") " pod="openstack/dnsmasq-dns-54dd998c-glbss" Dec 03 14:38:42 crc kubenswrapper[4751]: I1203 14:38:42.659348 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54dd998c-glbss" Dec 03 14:38:43 crc kubenswrapper[4751]: W1203 14:38:43.215738 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8481969b_2092_4b8c_9a57_9f83972d0997.slice/crio-fd343a6666b1fb43ca535ad571a21407d8ce81bda3f667bd00a7b665045cd773 WatchSource:0}: Error finding container fd343a6666b1fb43ca535ad571a21407d8ce81bda3f667bd00a7b665045cd773: Status 404 returned error can't find the container with id fd343a6666b1fb43ca535ad571a21407d8ce81bda3f667bd00a7b665045cd773 Dec 03 14:38:43 crc kubenswrapper[4751]: I1203 14:38:43.219497 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54dd998c-glbss"] Dec 03 14:38:43 crc kubenswrapper[4751]: I1203 14:38:43.977517 4751 generic.go:334] "Generic (PLEG): container finished" podID="8481969b-2092-4b8c-9a57-9f83972d0997" containerID="6607fad5f11063f6ca36fb380ba79923ed4a863415c9076733af94fec9a507ca" exitCode=0 Dec 03 14:38:43 crc kubenswrapper[4751]: I1203 14:38:43.977723 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54dd998c-glbss" event={"ID":"8481969b-2092-4b8c-9a57-9f83972d0997","Type":"ContainerDied","Data":"6607fad5f11063f6ca36fb380ba79923ed4a863415c9076733af94fec9a507ca"} Dec 03 14:38:43 crc kubenswrapper[4751]: I1203 14:38:43.978041 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54dd998c-glbss" event={"ID":"8481969b-2092-4b8c-9a57-9f83972d0997","Type":"ContainerStarted","Data":"fd343a6666b1fb43ca535ad571a21407d8ce81bda3f667bd00a7b665045cd773"} Dec 03 14:38:44 crc kubenswrapper[4751]: I1203 14:38:44.992189 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54dd998c-glbss" event={"ID":"8481969b-2092-4b8c-9a57-9f83972d0997","Type":"ContainerStarted","Data":"6de6c6aa98bde00ac9c3b5900850b5e4ad63b65ad7ddd09237c20a155560191c"} Dec 03 14:38:44 crc 
kubenswrapper[4751]: I1203 14:38:44.992886 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54dd998c-glbss" Dec 03 14:38:45 crc kubenswrapper[4751]: I1203 14:38:45.012731 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54dd998c-glbss" podStartSLOduration=3.012707057 podStartE2EDuration="3.012707057s" podCreationTimestamp="2025-12-03 14:38:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:38:45.008105635 +0000 UTC m=+1531.996460862" watchObservedRunningTime="2025-12-03 14:38:45.012707057 +0000 UTC m=+1532.001062274" Dec 03 14:38:45 crc kubenswrapper[4751]: I1203 14:38:45.152781 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:38:45 crc kubenswrapper[4751]: I1203 14:38:45.153151 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2a05dd80-a31a-4a74-8deb-5b4c9b83a753" containerName="sg-core" containerID="cri-o://3b16df004703b8d288b5b0ef0fcb4fab7dfa7538917de242409744be5acd3229" gracePeriod=30 Dec 03 14:38:45 crc kubenswrapper[4751]: I1203 14:38:45.153181 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2a05dd80-a31a-4a74-8deb-5b4c9b83a753" containerName="ceilometer-central-agent" containerID="cri-o://d4f1e3d33d753a3a77a8c3b2a3e6af80d3815294b13d09bb9269c415804a0819" gracePeriod=30 Dec 03 14:38:45 crc kubenswrapper[4751]: I1203 14:38:45.153149 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2a05dd80-a31a-4a74-8deb-5b4c9b83a753" containerName="ceilometer-notification-agent" containerID="cri-o://cb7f0c4f04b98f1d94e35bcc9baa565f7f1bfaf8c7319e2f821a1ce622a4c550" gracePeriod=30 Dec 03 14:38:45 crc kubenswrapper[4751]: I1203 14:38:45.153220 4751 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2a05dd80-a31a-4a74-8deb-5b4c9b83a753" containerName="proxy-httpd" containerID="cri-o://1dcb069208003e9fef92144f90dead1f500e574d6c49e8a9f0fcdd197a035ec7" gracePeriod=30 Dec 03 14:38:45 crc kubenswrapper[4751]: I1203 14:38:45.533738 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 14:38:45 crc kubenswrapper[4751]: I1203 14:38:45.534798 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c4c34aee-0ee0-45f5-b2ae-b87b966366ea" containerName="nova-api-log" containerID="cri-o://63ec04aa3f42c0b0a0b61777c74dad59cb60e6f1b7df9eca9b5a55e8f07b4a47" gracePeriod=30 Dec 03 14:38:45 crc kubenswrapper[4751]: I1203 14:38:45.534955 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c4c34aee-0ee0-45f5-b2ae-b87b966366ea" containerName="nova-api-api" containerID="cri-o://a7ca56ff46023a46ef741ce361b122aaa0fada509af6e5bf61dabcba9cc2c844" gracePeriod=30 Dec 03 14:38:45 crc kubenswrapper[4751]: I1203 14:38:45.651777 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.011334 4751 generic.go:334] "Generic (PLEG): container finished" podID="2a05dd80-a31a-4a74-8deb-5b4c9b83a753" containerID="1dcb069208003e9fef92144f90dead1f500e574d6c49e8a9f0fcdd197a035ec7" exitCode=0 Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.011640 4751 generic.go:334] "Generic (PLEG): container finished" podID="2a05dd80-a31a-4a74-8deb-5b4c9b83a753" containerID="3b16df004703b8d288b5b0ef0fcb4fab7dfa7538917de242409744be5acd3229" exitCode=2 Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.011366 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"2a05dd80-a31a-4a74-8deb-5b4c9b83a753","Type":"ContainerDied","Data":"1dcb069208003e9fef92144f90dead1f500e574d6c49e8a9f0fcdd197a035ec7"} Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.011649 4751 generic.go:334] "Generic (PLEG): container finished" podID="2a05dd80-a31a-4a74-8deb-5b4c9b83a753" containerID="cb7f0c4f04b98f1d94e35bcc9baa565f7f1bfaf8c7319e2f821a1ce622a4c550" exitCode=0 Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.011670 4751 generic.go:334] "Generic (PLEG): container finished" podID="2a05dd80-a31a-4a74-8deb-5b4c9b83a753" containerID="d4f1e3d33d753a3a77a8c3b2a3e6af80d3815294b13d09bb9269c415804a0819" exitCode=0 Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.011686 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a05dd80-a31a-4a74-8deb-5b4c9b83a753","Type":"ContainerDied","Data":"3b16df004703b8d288b5b0ef0fcb4fab7dfa7538917de242409744be5acd3229"} Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.011703 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a05dd80-a31a-4a74-8deb-5b4c9b83a753","Type":"ContainerDied","Data":"cb7f0c4f04b98f1d94e35bcc9baa565f7f1bfaf8c7319e2f821a1ce622a4c550"} Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.011713 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a05dd80-a31a-4a74-8deb-5b4c9b83a753","Type":"ContainerDied","Data":"d4f1e3d33d753a3a77a8c3b2a3e6af80d3815294b13d09bb9269c415804a0819"} Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.014901 4751 generic.go:334] "Generic (PLEG): container finished" podID="c4c34aee-0ee0-45f5-b2ae-b87b966366ea" containerID="63ec04aa3f42c0b0a0b61777c74dad59cb60e6f1b7df9eca9b5a55e8f07b4a47" exitCode=143 Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.015519 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"c4c34aee-0ee0-45f5-b2ae-b87b966366ea","Type":"ContainerDied","Data":"63ec04aa3f42c0b0a0b61777c74dad59cb60e6f1b7df9eca9b5a55e8f07b4a47"} Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.213731 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.291794 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-combined-ca-bundle\") pod \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\" (UID: \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\") " Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.292281 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-scripts\") pod \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\" (UID: \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\") " Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.292388 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-run-httpd\") pod \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\" (UID: \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\") " Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.292434 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-config-data\") pod \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\" (UID: \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\") " Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.292508 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zb6wt\" (UniqueName: \"kubernetes.io/projected/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-kube-api-access-zb6wt\") pod 
\"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\" (UID: \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\") " Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.292581 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-ceilometer-tls-certs\") pod \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\" (UID: \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\") " Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.292973 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2a05dd80-a31a-4a74-8deb-5b4c9b83a753" (UID: "2a05dd80-a31a-4a74-8deb-5b4c9b83a753"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.293442 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-log-httpd\") pod \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\" (UID: \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\") " Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.293505 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-sg-core-conf-yaml\") pod \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\" (UID: \"2a05dd80-a31a-4a74-8deb-5b4c9b83a753\") " Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.293869 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2a05dd80-a31a-4a74-8deb-5b4c9b83a753" (UID: "2a05dd80-a31a-4a74-8deb-5b4c9b83a753"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.294400 4751 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.294419 4751 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.312673 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-kube-api-access-zb6wt" (OuterVolumeSpecName: "kube-api-access-zb6wt") pod "2a05dd80-a31a-4a74-8deb-5b4c9b83a753" (UID: "2a05dd80-a31a-4a74-8deb-5b4c9b83a753"). InnerVolumeSpecName "kube-api-access-zb6wt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.325804 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-scripts" (OuterVolumeSpecName: "scripts") pod "2a05dd80-a31a-4a74-8deb-5b4c9b83a753" (UID: "2a05dd80-a31a-4a74-8deb-5b4c9b83a753"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.382900 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "2a05dd80-a31a-4a74-8deb-5b4c9b83a753" (UID: "2a05dd80-a31a-4a74-8deb-5b4c9b83a753"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.389105 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2a05dd80-a31a-4a74-8deb-5b4c9b83a753" (UID: "2a05dd80-a31a-4a74-8deb-5b4c9b83a753"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.396735 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.396767 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zb6wt\" (UniqueName: \"kubernetes.io/projected/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-kube-api-access-zb6wt\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.396780 4751 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.396790 4751 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.434634 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a05dd80-a31a-4a74-8deb-5b4c9b83a753" (UID: "2a05dd80-a31a-4a74-8deb-5b4c9b83a753"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.439397 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-config-data" (OuterVolumeSpecName: "config-data") pod "2a05dd80-a31a-4a74-8deb-5b4c9b83a753" (UID: "2a05dd80-a31a-4a74-8deb-5b4c9b83a753"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.498644 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:46 crc kubenswrapper[4751]: I1203 14:38:46.498690 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a05dd80-a31a-4a74-8deb-5b4c9b83a753-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.001955 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.032800 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a05dd80-a31a-4a74-8deb-5b4c9b83a753","Type":"ContainerDied","Data":"2e4275c0fbb9e5162896eca47631a046b03ad2651f6e79eaa938940c0a5b0ea5"} Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.032850 4751 scope.go:117] "RemoveContainer" containerID="1dcb069208003e9fef92144f90dead1f500e574d6c49e8a9f0fcdd197a035ec7" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.032891 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.071929 4751 scope.go:117] "RemoveContainer" containerID="3b16df004703b8d288b5b0ef0fcb4fab7dfa7538917de242409744be5acd3229" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.090028 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.106964 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.122722 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:38:47 crc kubenswrapper[4751]: E1203 14:38:47.123290 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a05dd80-a31a-4a74-8deb-5b4c9b83a753" containerName="ceilometer-notification-agent" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.123311 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a05dd80-a31a-4a74-8deb-5b4c9b83a753" containerName="ceilometer-notification-agent" Dec 03 14:38:47 crc kubenswrapper[4751]: E1203 14:38:47.123421 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a05dd80-a31a-4a74-8deb-5b4c9b83a753" containerName="sg-core" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.123437 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a05dd80-a31a-4a74-8deb-5b4c9b83a753" containerName="sg-core" Dec 03 14:38:47 crc kubenswrapper[4751]: E1203 14:38:47.123464 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a05dd80-a31a-4a74-8deb-5b4c9b83a753" containerName="proxy-httpd" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.123472 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a05dd80-a31a-4a74-8deb-5b4c9b83a753" containerName="proxy-httpd" Dec 03 14:38:47 crc kubenswrapper[4751]: E1203 14:38:47.123490 4751 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2a05dd80-a31a-4a74-8deb-5b4c9b83a753" containerName="ceilometer-central-agent" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.123498 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a05dd80-a31a-4a74-8deb-5b4c9b83a753" containerName="ceilometer-central-agent" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.123769 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a05dd80-a31a-4a74-8deb-5b4c9b83a753" containerName="ceilometer-central-agent" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.123791 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a05dd80-a31a-4a74-8deb-5b4c9b83a753" containerName="ceilometer-notification-agent" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.123804 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a05dd80-a31a-4a74-8deb-5b4c9b83a753" containerName="proxy-httpd" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.123818 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a05dd80-a31a-4a74-8deb-5b4c9b83a753" containerName="sg-core" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.124913 4751 scope.go:117] "RemoveContainer" containerID="cb7f0c4f04b98f1d94e35bcc9baa565f7f1bfaf8c7319e2f821a1ce622a4c550" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.126686 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.130629 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.130793 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.131007 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.141973 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.168528 4751 scope.go:117] "RemoveContainer" containerID="d4f1e3d33d753a3a77a8c3b2a3e6af80d3815294b13d09bb9269c415804a0819" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.215276 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cklw6\" (UniqueName: \"kubernetes.io/projected/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-kube-api-access-cklw6\") pod \"ceilometer-0\" (UID: \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\") " pod="openstack/ceilometer-0" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.215358 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\") " pod="openstack/ceilometer-0" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.215412 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-run-httpd\") pod \"ceilometer-0\" (UID: \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\") " 
pod="openstack/ceilometer-0" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.215447 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\") " pod="openstack/ceilometer-0" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.215560 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\") " pod="openstack/ceilometer-0" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.215609 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-config-data\") pod \"ceilometer-0\" (UID: \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\") " pod="openstack/ceilometer-0" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.215700 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-scripts\") pod \"ceilometer-0\" (UID: \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\") " pod="openstack/ceilometer-0" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.215721 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-log-httpd\") pod \"ceilometer-0\" (UID: \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\") " pod="openstack/ceilometer-0" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.317944 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\") " pod="openstack/ceilometer-0" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.318004 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-config-data\") pod \"ceilometer-0\" (UID: \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\") " pod="openstack/ceilometer-0" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.318061 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-scripts\") pod \"ceilometer-0\" (UID: \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\") " pod="openstack/ceilometer-0" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.318081 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-log-httpd\") pod \"ceilometer-0\" (UID: \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\") " pod="openstack/ceilometer-0" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.318155 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cklw6\" (UniqueName: \"kubernetes.io/projected/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-kube-api-access-cklw6\") pod \"ceilometer-0\" (UID: \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\") " pod="openstack/ceilometer-0" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.318194 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\") " pod="openstack/ceilometer-0" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.318234 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-run-httpd\") pod \"ceilometer-0\" (UID: \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\") " pod="openstack/ceilometer-0" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.318271 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\") " pod="openstack/ceilometer-0" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.318944 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-log-httpd\") pod \"ceilometer-0\" (UID: \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\") " pod="openstack/ceilometer-0" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.319148 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-run-httpd\") pod \"ceilometer-0\" (UID: \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\") " pod="openstack/ceilometer-0" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.324819 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\") " pod="openstack/ceilometer-0" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.325132 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\") " pod="openstack/ceilometer-0" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.325345 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-config-data\") pod \"ceilometer-0\" (UID: \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\") " pod="openstack/ceilometer-0" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.326800 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\") " pod="openstack/ceilometer-0" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.329721 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-scripts\") pod \"ceilometer-0\" (UID: \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\") " pod="openstack/ceilometer-0" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.332134 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a05dd80-a31a-4a74-8deb-5b4c9b83a753" path="/var/lib/kubelet/pods/2a05dd80-a31a-4a74-8deb-5b4c9b83a753/volumes" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.350557 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cklw6\" (UniqueName: \"kubernetes.io/projected/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-kube-api-access-cklw6\") pod \"ceilometer-0\" (UID: \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\") " pod="openstack/ceilometer-0" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.459055 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.810186 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:38:47 crc kubenswrapper[4751]: I1203 14:38:47.973388 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:38:48 crc kubenswrapper[4751]: I1203 14:38:48.051264 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e24e1b79-1ed2-423b-a93b-dee54c1d6b80","Type":"ContainerStarted","Data":"99e4d3913c037deea5003d280154aa9f706b6d1f8b388efccab9a2a350631a81"} Dec 03 14:38:49 crc kubenswrapper[4751]: I1203 14:38:49.085704 4751 generic.go:334] "Generic (PLEG): container finished" podID="c4c34aee-0ee0-45f5-b2ae-b87b966366ea" containerID="a7ca56ff46023a46ef741ce361b122aaa0fada509af6e5bf61dabcba9cc2c844" exitCode=0 Dec 03 14:38:49 crc kubenswrapper[4751]: I1203 14:38:49.086078 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c4c34aee-0ee0-45f5-b2ae-b87b966366ea","Type":"ContainerDied","Data":"a7ca56ff46023a46ef741ce361b122aaa0fada509af6e5bf61dabcba9cc2c844"} Dec 03 14:38:49 crc kubenswrapper[4751]: I1203 14:38:49.088370 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e24e1b79-1ed2-423b-a93b-dee54c1d6b80","Type":"ContainerStarted","Data":"426def791113e7f82f15c9cbf047c7732046965a4b51fe192603a2521bfc9503"} Dec 03 14:38:49 crc kubenswrapper[4751]: I1203 14:38:49.218801 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 14:38:49 crc kubenswrapper[4751]: I1203 14:38:49.268431 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4c34aee-0ee0-45f5-b2ae-b87b966366ea-combined-ca-bundle\") pod \"c4c34aee-0ee0-45f5-b2ae-b87b966366ea\" (UID: \"c4c34aee-0ee0-45f5-b2ae-b87b966366ea\") " Dec 03 14:38:49 crc kubenswrapper[4751]: I1203 14:38:49.268523 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4c34aee-0ee0-45f5-b2ae-b87b966366ea-config-data\") pod \"c4c34aee-0ee0-45f5-b2ae-b87b966366ea\" (UID: \"c4c34aee-0ee0-45f5-b2ae-b87b966366ea\") " Dec 03 14:38:49 crc kubenswrapper[4751]: I1203 14:38:49.268580 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4c34aee-0ee0-45f5-b2ae-b87b966366ea-logs\") pod \"c4c34aee-0ee0-45f5-b2ae-b87b966366ea\" (UID: \"c4c34aee-0ee0-45f5-b2ae-b87b966366ea\") " Dec 03 14:38:49 crc kubenswrapper[4751]: I1203 14:38:49.268622 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58vc5\" (UniqueName: \"kubernetes.io/projected/c4c34aee-0ee0-45f5-b2ae-b87b966366ea-kube-api-access-58vc5\") pod \"c4c34aee-0ee0-45f5-b2ae-b87b966366ea\" (UID: \"c4c34aee-0ee0-45f5-b2ae-b87b966366ea\") " Dec 03 14:38:49 crc kubenswrapper[4751]: I1203 14:38:49.270465 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4c34aee-0ee0-45f5-b2ae-b87b966366ea-logs" (OuterVolumeSpecName: "logs") pod "c4c34aee-0ee0-45f5-b2ae-b87b966366ea" (UID: "c4c34aee-0ee0-45f5-b2ae-b87b966366ea"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:38:49 crc kubenswrapper[4751]: I1203 14:38:49.275622 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4c34aee-0ee0-45f5-b2ae-b87b966366ea-kube-api-access-58vc5" (OuterVolumeSpecName: "kube-api-access-58vc5") pod "c4c34aee-0ee0-45f5-b2ae-b87b966366ea" (UID: "c4c34aee-0ee0-45f5-b2ae-b87b966366ea"). InnerVolumeSpecName "kube-api-access-58vc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:38:49 crc kubenswrapper[4751]: I1203 14:38:49.326581 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4c34aee-0ee0-45f5-b2ae-b87b966366ea-config-data" (OuterVolumeSpecName: "config-data") pod "c4c34aee-0ee0-45f5-b2ae-b87b966366ea" (UID: "c4c34aee-0ee0-45f5-b2ae-b87b966366ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:49 crc kubenswrapper[4751]: I1203 14:38:49.380115 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4c34aee-0ee0-45f5-b2ae-b87b966366ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c4c34aee-0ee0-45f5-b2ae-b87b966366ea" (UID: "c4c34aee-0ee0-45f5-b2ae-b87b966366ea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:49 crc kubenswrapper[4751]: I1203 14:38:49.380277 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4c34aee-0ee0-45f5-b2ae-b87b966366ea-combined-ca-bundle\") pod \"c4c34aee-0ee0-45f5-b2ae-b87b966366ea\" (UID: \"c4c34aee-0ee0-45f5-b2ae-b87b966366ea\") " Dec 03 14:38:49 crc kubenswrapper[4751]: I1203 14:38:49.381001 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4c34aee-0ee0-45f5-b2ae-b87b966366ea-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:49 crc kubenswrapper[4751]: I1203 14:38:49.381026 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4c34aee-0ee0-45f5-b2ae-b87b966366ea-logs\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:49 crc kubenswrapper[4751]: I1203 14:38:49.381042 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58vc5\" (UniqueName: \"kubernetes.io/projected/c4c34aee-0ee0-45f5-b2ae-b87b966366ea-kube-api-access-58vc5\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:49 crc kubenswrapper[4751]: W1203 14:38:49.381130 4751 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/c4c34aee-0ee0-45f5-b2ae-b87b966366ea/volumes/kubernetes.io~secret/combined-ca-bundle Dec 03 14:38:49 crc kubenswrapper[4751]: I1203 14:38:49.381143 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4c34aee-0ee0-45f5-b2ae-b87b966366ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c4c34aee-0ee0-45f5-b2ae-b87b966366ea" (UID: "c4c34aee-0ee0-45f5-b2ae-b87b966366ea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:49 crc kubenswrapper[4751]: I1203 14:38:49.483682 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4c34aee-0ee0-45f5-b2ae-b87b966366ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.104086 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c4c34aee-0ee0-45f5-b2ae-b87b966366ea","Type":"ContainerDied","Data":"b2fae7995a401c633ed02c22c558125aac8677667e561f3a73b2a350d6665418"} Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.104156 4751 scope.go:117] "RemoveContainer" containerID="a7ca56ff46023a46ef741ce361b122aaa0fada509af6e5bf61dabcba9cc2c844" Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.104099 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.108109 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e24e1b79-1ed2-423b-a93b-dee54c1d6b80","Type":"ContainerStarted","Data":"9b767235631dce9cdc3c316440d216c56ec79940e4fcf58d759f732d0d5f5ddc"} Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.133135 4751 scope.go:117] "RemoveContainer" containerID="63ec04aa3f42c0b0a0b61777c74dad59cb60e6f1b7df9eca9b5a55e8f07b4a47" Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.160708 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.190779 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.218921 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 14:38:50 crc kubenswrapper[4751]: E1203 14:38:50.219419 4751 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="c4c34aee-0ee0-45f5-b2ae-b87b966366ea" containerName="nova-api-api" Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.219438 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4c34aee-0ee0-45f5-b2ae-b87b966366ea" containerName="nova-api-api" Dec 03 14:38:50 crc kubenswrapper[4751]: E1203 14:38:50.219483 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4c34aee-0ee0-45f5-b2ae-b87b966366ea" containerName="nova-api-log" Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.219490 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4c34aee-0ee0-45f5-b2ae-b87b966366ea" containerName="nova-api-log" Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.219666 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4c34aee-0ee0-45f5-b2ae-b87b966366ea" containerName="nova-api-api" Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.219691 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4c34aee-0ee0-45f5-b2ae-b87b966366ea" containerName="nova-api-log" Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.221018 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.223242 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.223865 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.227446 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.236757 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.300139 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aba7b141-640d-4504-bf82-90d304479537-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aba7b141-640d-4504-bf82-90d304479537\") " pod="openstack/nova-api-0" Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.300259 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw4jw\" (UniqueName: \"kubernetes.io/projected/aba7b141-640d-4504-bf82-90d304479537-kube-api-access-bw4jw\") pod \"nova-api-0\" (UID: \"aba7b141-640d-4504-bf82-90d304479537\") " pod="openstack/nova-api-0" Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.300368 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aba7b141-640d-4504-bf82-90d304479537-public-tls-certs\") pod \"nova-api-0\" (UID: \"aba7b141-640d-4504-bf82-90d304479537\") " pod="openstack/nova-api-0" Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.300443 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aba7b141-640d-4504-bf82-90d304479537-internal-tls-certs\") pod \"nova-api-0\" (UID: \"aba7b141-640d-4504-bf82-90d304479537\") " pod="openstack/nova-api-0" Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.300474 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aba7b141-640d-4504-bf82-90d304479537-config-data\") pod \"nova-api-0\" (UID: \"aba7b141-640d-4504-bf82-90d304479537\") " pod="openstack/nova-api-0" Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.300605 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aba7b141-640d-4504-bf82-90d304479537-logs\") pod \"nova-api-0\" (UID: \"aba7b141-640d-4504-bf82-90d304479537\") " pod="openstack/nova-api-0" Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.402388 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aba7b141-640d-4504-bf82-90d304479537-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aba7b141-640d-4504-bf82-90d304479537\") " pod="openstack/nova-api-0" Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.402499 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw4jw\" (UniqueName: \"kubernetes.io/projected/aba7b141-640d-4504-bf82-90d304479537-kube-api-access-bw4jw\") pod \"nova-api-0\" (UID: \"aba7b141-640d-4504-bf82-90d304479537\") " pod="openstack/nova-api-0" Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.402598 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aba7b141-640d-4504-bf82-90d304479537-public-tls-certs\") pod \"nova-api-0\" (UID: \"aba7b141-640d-4504-bf82-90d304479537\") " 
pod="openstack/nova-api-0" Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.402715 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aba7b141-640d-4504-bf82-90d304479537-internal-tls-certs\") pod \"nova-api-0\" (UID: \"aba7b141-640d-4504-bf82-90d304479537\") " pod="openstack/nova-api-0" Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.402750 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aba7b141-640d-4504-bf82-90d304479537-config-data\") pod \"nova-api-0\" (UID: \"aba7b141-640d-4504-bf82-90d304479537\") " pod="openstack/nova-api-0" Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.402771 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aba7b141-640d-4504-bf82-90d304479537-logs\") pod \"nova-api-0\" (UID: \"aba7b141-640d-4504-bf82-90d304479537\") " pod="openstack/nova-api-0" Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.403533 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aba7b141-640d-4504-bf82-90d304479537-logs\") pod \"nova-api-0\" (UID: \"aba7b141-640d-4504-bf82-90d304479537\") " pod="openstack/nova-api-0" Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.407623 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aba7b141-640d-4504-bf82-90d304479537-public-tls-certs\") pod \"nova-api-0\" (UID: \"aba7b141-640d-4504-bf82-90d304479537\") " pod="openstack/nova-api-0" Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.415161 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aba7b141-640d-4504-bf82-90d304479537-internal-tls-certs\") pod 
\"nova-api-0\" (UID: \"aba7b141-640d-4504-bf82-90d304479537\") " pod="openstack/nova-api-0" Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.422153 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aba7b141-640d-4504-bf82-90d304479537-config-data\") pod \"nova-api-0\" (UID: \"aba7b141-640d-4504-bf82-90d304479537\") " pod="openstack/nova-api-0" Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.426981 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw4jw\" (UniqueName: \"kubernetes.io/projected/aba7b141-640d-4504-bf82-90d304479537-kube-api-access-bw4jw\") pod \"nova-api-0\" (UID: \"aba7b141-640d-4504-bf82-90d304479537\") " pod="openstack/nova-api-0" Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.426978 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aba7b141-640d-4504-bf82-90d304479537-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aba7b141-640d-4504-bf82-90d304479537\") " pod="openstack/nova-api-0" Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.545756 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.652061 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:50 crc kubenswrapper[4751]: I1203 14:38:50.679268 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:51 crc kubenswrapper[4751]: I1203 14:38:51.061687 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 14:38:51 crc kubenswrapper[4751]: I1203 14:38:51.133187 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aba7b141-640d-4504-bf82-90d304479537","Type":"ContainerStarted","Data":"e1ee653e4568461cd4d08901161a835a2733d90d5d427bdb37c3b601200e2253"} Dec 03 14:38:51 crc kubenswrapper[4751]: I1203 14:38:51.138735 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e24e1b79-1ed2-423b-a93b-dee54c1d6b80","Type":"ContainerStarted","Data":"73ce6034792354e94f3dcd543255d0517330e1e13b4632ba3ce2b3f57fd1ca07"} Dec 03 14:38:51 crc kubenswrapper[4751]: I1203 14:38:51.159684 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:38:51 crc kubenswrapper[4751]: I1203 14:38:51.331062 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4c34aee-0ee0-45f5-b2ae-b87b966366ea" path="/var/lib/kubelet/pods/c4c34aee-0ee0-45f5-b2ae-b87b966366ea/volumes" Dec 03 14:38:51 crc kubenswrapper[4751]: I1203 14:38:51.418242 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-fpjws"] Dec 03 14:38:51 crc kubenswrapper[4751]: I1203 14:38:51.428119 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fpjws" Dec 03 14:38:51 crc kubenswrapper[4751]: I1203 14:38:51.437359 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 03 14:38:51 crc kubenswrapper[4751]: I1203 14:38:51.437614 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 03 14:38:51 crc kubenswrapper[4751]: I1203 14:38:51.440018 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12c1f3f2-32ce-4652-8962-99c0d111d953-scripts\") pod \"nova-cell1-cell-mapping-fpjws\" (UID: \"12c1f3f2-32ce-4652-8962-99c0d111d953\") " pod="openstack/nova-cell1-cell-mapping-fpjws" Dec 03 14:38:51 crc kubenswrapper[4751]: I1203 14:38:51.440108 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfj2t\" (UniqueName: \"kubernetes.io/projected/12c1f3f2-32ce-4652-8962-99c0d111d953-kube-api-access-nfj2t\") pod \"nova-cell1-cell-mapping-fpjws\" (UID: \"12c1f3f2-32ce-4652-8962-99c0d111d953\") " pod="openstack/nova-cell1-cell-mapping-fpjws" Dec 03 14:38:51 crc kubenswrapper[4751]: I1203 14:38:51.440206 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12c1f3f2-32ce-4652-8962-99c0d111d953-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fpjws\" (UID: \"12c1f3f2-32ce-4652-8962-99c0d111d953\") " pod="openstack/nova-cell1-cell-mapping-fpjws" Dec 03 14:38:51 crc kubenswrapper[4751]: I1203 14:38:51.440235 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12c1f3f2-32ce-4652-8962-99c0d111d953-config-data\") pod \"nova-cell1-cell-mapping-fpjws\" (UID: \"12c1f3f2-32ce-4652-8962-99c0d111d953\") 
" pod="openstack/nova-cell1-cell-mapping-fpjws" Dec 03 14:38:51 crc kubenswrapper[4751]: I1203 14:38:51.446933 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-fpjws"] Dec 03 14:38:51 crc kubenswrapper[4751]: I1203 14:38:51.541843 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12c1f3f2-32ce-4652-8962-99c0d111d953-scripts\") pod \"nova-cell1-cell-mapping-fpjws\" (UID: \"12c1f3f2-32ce-4652-8962-99c0d111d953\") " pod="openstack/nova-cell1-cell-mapping-fpjws" Dec 03 14:38:51 crc kubenswrapper[4751]: I1203 14:38:51.541950 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfj2t\" (UniqueName: \"kubernetes.io/projected/12c1f3f2-32ce-4652-8962-99c0d111d953-kube-api-access-nfj2t\") pod \"nova-cell1-cell-mapping-fpjws\" (UID: \"12c1f3f2-32ce-4652-8962-99c0d111d953\") " pod="openstack/nova-cell1-cell-mapping-fpjws" Dec 03 14:38:51 crc kubenswrapper[4751]: I1203 14:38:51.542059 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12c1f3f2-32ce-4652-8962-99c0d111d953-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fpjws\" (UID: \"12c1f3f2-32ce-4652-8962-99c0d111d953\") " pod="openstack/nova-cell1-cell-mapping-fpjws" Dec 03 14:38:51 crc kubenswrapper[4751]: I1203 14:38:51.542114 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12c1f3f2-32ce-4652-8962-99c0d111d953-config-data\") pod \"nova-cell1-cell-mapping-fpjws\" (UID: \"12c1f3f2-32ce-4652-8962-99c0d111d953\") " pod="openstack/nova-cell1-cell-mapping-fpjws" Dec 03 14:38:51 crc kubenswrapper[4751]: I1203 14:38:51.549050 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/12c1f3f2-32ce-4652-8962-99c0d111d953-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fpjws\" (UID: \"12c1f3f2-32ce-4652-8962-99c0d111d953\") " pod="openstack/nova-cell1-cell-mapping-fpjws" Dec 03 14:38:51 crc kubenswrapper[4751]: I1203 14:38:51.549155 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12c1f3f2-32ce-4652-8962-99c0d111d953-config-data\") pod \"nova-cell1-cell-mapping-fpjws\" (UID: \"12c1f3f2-32ce-4652-8962-99c0d111d953\") " pod="openstack/nova-cell1-cell-mapping-fpjws" Dec 03 14:38:51 crc kubenswrapper[4751]: I1203 14:38:51.549768 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12c1f3f2-32ce-4652-8962-99c0d111d953-scripts\") pod \"nova-cell1-cell-mapping-fpjws\" (UID: \"12c1f3f2-32ce-4652-8962-99c0d111d953\") " pod="openstack/nova-cell1-cell-mapping-fpjws" Dec 03 14:38:51 crc kubenswrapper[4751]: I1203 14:38:51.561184 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfj2t\" (UniqueName: \"kubernetes.io/projected/12c1f3f2-32ce-4652-8962-99c0d111d953-kube-api-access-nfj2t\") pod \"nova-cell1-cell-mapping-fpjws\" (UID: \"12c1f3f2-32ce-4652-8962-99c0d111d953\") " pod="openstack/nova-cell1-cell-mapping-fpjws" Dec 03 14:38:51 crc kubenswrapper[4751]: I1203 14:38:51.701182 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fpjws" Dec 03 14:38:52 crc kubenswrapper[4751]: I1203 14:38:52.154402 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aba7b141-640d-4504-bf82-90d304479537","Type":"ContainerStarted","Data":"741bbf19c3092902ec5da2466342d3ee77cff6ec99b9d91da3be06bb213f6436"} Dec 03 14:38:52 crc kubenswrapper[4751]: I1203 14:38:52.154788 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aba7b141-640d-4504-bf82-90d304479537","Type":"ContainerStarted","Data":"1cbc5a93eb80f7a9da58de63b22264b91f7544620affc2f92d7fd3c154f9b5f2"} Dec 03 14:38:52 crc kubenswrapper[4751]: I1203 14:38:52.158147 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e24e1b79-1ed2-423b-a93b-dee54c1d6b80" containerName="ceilometer-central-agent" containerID="cri-o://426def791113e7f82f15c9cbf047c7732046965a4b51fe192603a2521bfc9503" gracePeriod=30 Dec 03 14:38:52 crc kubenswrapper[4751]: I1203 14:38:52.158242 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e24e1b79-1ed2-423b-a93b-dee54c1d6b80","Type":"ContainerStarted","Data":"0a516c77bbdaaccc84431fc46ea6dc19618d923a60179baa3614c3f3c8a2d408"} Dec 03 14:38:52 crc kubenswrapper[4751]: I1203 14:38:52.158274 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 14:38:52 crc kubenswrapper[4751]: I1203 14:38:52.158308 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e24e1b79-1ed2-423b-a93b-dee54c1d6b80" containerName="proxy-httpd" containerID="cri-o://0a516c77bbdaaccc84431fc46ea6dc19618d923a60179baa3614c3f3c8a2d408" gracePeriod=30 Dec 03 14:38:52 crc kubenswrapper[4751]: I1203 14:38:52.158369 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="e24e1b79-1ed2-423b-a93b-dee54c1d6b80" containerName="sg-core" containerID="cri-o://73ce6034792354e94f3dcd543255d0517330e1e13b4632ba3ce2b3f57fd1ca07" gracePeriod=30 Dec 03 14:38:52 crc kubenswrapper[4751]: I1203 14:38:52.158408 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e24e1b79-1ed2-423b-a93b-dee54c1d6b80" containerName="ceilometer-notification-agent" containerID="cri-o://9b767235631dce9cdc3c316440d216c56ec79940e4fcf58d759f732d0d5f5ddc" gracePeriod=30 Dec 03 14:38:52 crc kubenswrapper[4751]: I1203 14:38:52.194393 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.194369827 podStartE2EDuration="2.194369827s" podCreationTimestamp="2025-12-03 14:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:38:52.189104008 +0000 UTC m=+1539.177459235" watchObservedRunningTime="2025-12-03 14:38:52.194369827 +0000 UTC m=+1539.182725054" Dec 03 14:38:52 crc kubenswrapper[4751]: I1203 14:38:52.234160 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.864356975 podStartE2EDuration="5.234142992s" podCreationTimestamp="2025-12-03 14:38:47 +0000 UTC" firstStartedPulling="2025-12-03 14:38:47.969951122 +0000 UTC m=+1534.958306339" lastFinishedPulling="2025-12-03 14:38:51.339737139 +0000 UTC m=+1538.328092356" observedRunningTime="2025-12-03 14:38:52.224153757 +0000 UTC m=+1539.212508994" watchObservedRunningTime="2025-12-03 14:38:52.234142992 +0000 UTC m=+1539.222498209" Dec 03 14:38:52 crc kubenswrapper[4751]: I1203 14:38:52.242536 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-fpjws"] Dec 03 14:38:52 crc kubenswrapper[4751]: W1203 14:38:52.244650 4751 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12c1f3f2_32ce_4652_8962_99c0d111d953.slice/crio-9ad73af35d64915ac0de5cffbb38be2e7d6a01f7bceb11ebff5afc556da6c427 WatchSource:0}: Error finding container 9ad73af35d64915ac0de5cffbb38be2e7d6a01f7bceb11ebff5afc556da6c427: Status 404 returned error can't find the container with id 9ad73af35d64915ac0de5cffbb38be2e7d6a01f7bceb11ebff5afc556da6c427 Dec 03 14:38:52 crc kubenswrapper[4751]: I1203 14:38:52.660509 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54dd998c-glbss" Dec 03 14:38:52 crc kubenswrapper[4751]: I1203 14:38:52.740892 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-884c8b8f5-5szns"] Dec 03 14:38:52 crc kubenswrapper[4751]: I1203 14:38:52.741129 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-884c8b8f5-5szns" podUID="267c2a24-af2e-48c4-9101-fbf9bba26e67" containerName="dnsmasq-dns" containerID="cri-o://a73accb674ecb494bc063d0cd0b4ea9e5abbb9f44b3344cc9a13110378ba6e51" gracePeriod=10 Dec 03 14:38:52 crc kubenswrapper[4751]: I1203 14:38:52.856918 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-884c8b8f5-5szns" podUID="267c2a24-af2e-48c4-9101-fbf9bba26e67" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.217:5353: connect: connection refused" Dec 03 14:38:53 crc kubenswrapper[4751]: I1203 14:38:53.185590 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fpjws" event={"ID":"12c1f3f2-32ce-4652-8962-99c0d111d953","Type":"ContainerStarted","Data":"450cb5444d1d441152daf714ab2046fad875605c51ef348e83b3599b8c6f3dc4"} Dec 03 14:38:53 crc kubenswrapper[4751]: I1203 14:38:53.185900 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fpjws" 
event={"ID":"12c1f3f2-32ce-4652-8962-99c0d111d953","Type":"ContainerStarted","Data":"9ad73af35d64915ac0de5cffbb38be2e7d6a01f7bceb11ebff5afc556da6c427"} Dec 03 14:38:53 crc kubenswrapper[4751]: I1203 14:38:53.204622 4751 generic.go:334] "Generic (PLEG): container finished" podID="e24e1b79-1ed2-423b-a93b-dee54c1d6b80" containerID="0a516c77bbdaaccc84431fc46ea6dc19618d923a60179baa3614c3f3c8a2d408" exitCode=0 Dec 03 14:38:53 crc kubenswrapper[4751]: I1203 14:38:53.204663 4751 generic.go:334] "Generic (PLEG): container finished" podID="e24e1b79-1ed2-423b-a93b-dee54c1d6b80" containerID="73ce6034792354e94f3dcd543255d0517330e1e13b4632ba3ce2b3f57fd1ca07" exitCode=2 Dec 03 14:38:53 crc kubenswrapper[4751]: I1203 14:38:53.204672 4751 generic.go:334] "Generic (PLEG): container finished" podID="e24e1b79-1ed2-423b-a93b-dee54c1d6b80" containerID="9b767235631dce9cdc3c316440d216c56ec79940e4fcf58d759f732d0d5f5ddc" exitCode=0 Dec 03 14:38:53 crc kubenswrapper[4751]: I1203 14:38:53.204740 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e24e1b79-1ed2-423b-a93b-dee54c1d6b80","Type":"ContainerDied","Data":"0a516c77bbdaaccc84431fc46ea6dc19618d923a60179baa3614c3f3c8a2d408"} Dec 03 14:38:53 crc kubenswrapper[4751]: I1203 14:38:53.204774 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e24e1b79-1ed2-423b-a93b-dee54c1d6b80","Type":"ContainerDied","Data":"73ce6034792354e94f3dcd543255d0517330e1e13b4632ba3ce2b3f57fd1ca07"} Dec 03 14:38:53 crc kubenswrapper[4751]: I1203 14:38:53.204783 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e24e1b79-1ed2-423b-a93b-dee54c1d6b80","Type":"ContainerDied","Data":"9b767235631dce9cdc3c316440d216c56ec79940e4fcf58d759f732d0d5f5ddc"} Dec 03 14:38:53 crc kubenswrapper[4751]: I1203 14:38:53.216361 4751 generic.go:334] "Generic (PLEG): container finished" podID="267c2a24-af2e-48c4-9101-fbf9bba26e67" 
containerID="a73accb674ecb494bc063d0cd0b4ea9e5abbb9f44b3344cc9a13110378ba6e51" exitCode=0 Dec 03 14:38:53 crc kubenswrapper[4751]: I1203 14:38:53.217752 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-884c8b8f5-5szns" event={"ID":"267c2a24-af2e-48c4-9101-fbf9bba26e67","Type":"ContainerDied","Data":"a73accb674ecb494bc063d0cd0b4ea9e5abbb9f44b3344cc9a13110378ba6e51"} Dec 03 14:38:53 crc kubenswrapper[4751]: I1203 14:38:53.225227 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-fpjws" podStartSLOduration=2.225187398 podStartE2EDuration="2.225187398s" podCreationTimestamp="2025-12-03 14:38:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:38:53.209219145 +0000 UTC m=+1540.197574362" watchObservedRunningTime="2025-12-03 14:38:53.225187398 +0000 UTC m=+1540.213542605" Dec 03 14:38:53 crc kubenswrapper[4751]: I1203 14:38:53.487426 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-884c8b8f5-5szns" Dec 03 14:38:53 crc kubenswrapper[4751]: I1203 14:38:53.513092 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/267c2a24-af2e-48c4-9101-fbf9bba26e67-ovsdbserver-nb\") pod \"267c2a24-af2e-48c4-9101-fbf9bba26e67\" (UID: \"267c2a24-af2e-48c4-9101-fbf9bba26e67\") " Dec 03 14:38:53 crc kubenswrapper[4751]: I1203 14:38:53.513169 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqt4t\" (UniqueName: \"kubernetes.io/projected/267c2a24-af2e-48c4-9101-fbf9bba26e67-kube-api-access-zqt4t\") pod \"267c2a24-af2e-48c4-9101-fbf9bba26e67\" (UID: \"267c2a24-af2e-48c4-9101-fbf9bba26e67\") " Dec 03 14:38:53 crc kubenswrapper[4751]: I1203 14:38:53.513314 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/267c2a24-af2e-48c4-9101-fbf9bba26e67-ovsdbserver-sb\") pod \"267c2a24-af2e-48c4-9101-fbf9bba26e67\" (UID: \"267c2a24-af2e-48c4-9101-fbf9bba26e67\") " Dec 03 14:38:53 crc kubenswrapper[4751]: I1203 14:38:53.513539 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/267c2a24-af2e-48c4-9101-fbf9bba26e67-dns-swift-storage-0\") pod \"267c2a24-af2e-48c4-9101-fbf9bba26e67\" (UID: \"267c2a24-af2e-48c4-9101-fbf9bba26e67\") " Dec 03 14:38:53 crc kubenswrapper[4751]: I1203 14:38:53.513597 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/267c2a24-af2e-48c4-9101-fbf9bba26e67-config\") pod \"267c2a24-af2e-48c4-9101-fbf9bba26e67\" (UID: \"267c2a24-af2e-48c4-9101-fbf9bba26e67\") " Dec 03 14:38:53 crc kubenswrapper[4751]: I1203 14:38:53.513625 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/267c2a24-af2e-48c4-9101-fbf9bba26e67-dns-svc\") pod \"267c2a24-af2e-48c4-9101-fbf9bba26e67\" (UID: \"267c2a24-af2e-48c4-9101-fbf9bba26e67\") " Dec 03 14:38:53 crc kubenswrapper[4751]: I1203 14:38:53.528127 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/267c2a24-af2e-48c4-9101-fbf9bba26e67-kube-api-access-zqt4t" (OuterVolumeSpecName: "kube-api-access-zqt4t") pod "267c2a24-af2e-48c4-9101-fbf9bba26e67" (UID: "267c2a24-af2e-48c4-9101-fbf9bba26e67"). InnerVolumeSpecName "kube-api-access-zqt4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:38:53 crc kubenswrapper[4751]: I1203 14:38:53.595974 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/267c2a24-af2e-48c4-9101-fbf9bba26e67-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "267c2a24-af2e-48c4-9101-fbf9bba26e67" (UID: "267c2a24-af2e-48c4-9101-fbf9bba26e67"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:38:53 crc kubenswrapper[4751]: I1203 14:38:53.604135 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/267c2a24-af2e-48c4-9101-fbf9bba26e67-config" (OuterVolumeSpecName: "config") pod "267c2a24-af2e-48c4-9101-fbf9bba26e67" (UID: "267c2a24-af2e-48c4-9101-fbf9bba26e67"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:38:53 crc kubenswrapper[4751]: I1203 14:38:53.619661 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqt4t\" (UniqueName: \"kubernetes.io/projected/267c2a24-af2e-48c4-9101-fbf9bba26e67-kube-api-access-zqt4t\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:53 crc kubenswrapper[4751]: I1203 14:38:53.619960 4751 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/267c2a24-af2e-48c4-9101-fbf9bba26e67-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:53 crc kubenswrapper[4751]: I1203 14:38:53.620033 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/267c2a24-af2e-48c4-9101-fbf9bba26e67-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:53 crc kubenswrapper[4751]: I1203 14:38:53.628362 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/267c2a24-af2e-48c4-9101-fbf9bba26e67-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "267c2a24-af2e-48c4-9101-fbf9bba26e67" (UID: "267c2a24-af2e-48c4-9101-fbf9bba26e67"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:38:53 crc kubenswrapper[4751]: I1203 14:38:53.632834 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/267c2a24-af2e-48c4-9101-fbf9bba26e67-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "267c2a24-af2e-48c4-9101-fbf9bba26e67" (UID: "267c2a24-af2e-48c4-9101-fbf9bba26e67"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:38:53 crc kubenswrapper[4751]: I1203 14:38:53.640706 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/267c2a24-af2e-48c4-9101-fbf9bba26e67-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "267c2a24-af2e-48c4-9101-fbf9bba26e67" (UID: "267c2a24-af2e-48c4-9101-fbf9bba26e67"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:38:53 crc kubenswrapper[4751]: I1203 14:38:53.722169 4751 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/267c2a24-af2e-48c4-9101-fbf9bba26e67-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:53 crc kubenswrapper[4751]: I1203 14:38:53.722596 4751 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/267c2a24-af2e-48c4-9101-fbf9bba26e67-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:53 crc kubenswrapper[4751]: I1203 14:38:53.722613 4751 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/267c2a24-af2e-48c4-9101-fbf9bba26e67-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:54 crc kubenswrapper[4751]: I1203 14:38:54.235775 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-884c8b8f5-5szns" event={"ID":"267c2a24-af2e-48c4-9101-fbf9bba26e67","Type":"ContainerDied","Data":"0ab0346052c754a8dbdec9fba3c53bd576e0b0035e2dc135e6b034f7beb35d92"} Dec 03 14:38:54 crc kubenswrapper[4751]: I1203 14:38:54.235861 4751 scope.go:117] "RemoveContainer" containerID="a73accb674ecb494bc063d0cd0b4ea9e5abbb9f44b3344cc9a13110378ba6e51" Dec 03 14:38:54 crc kubenswrapper[4751]: I1203 14:38:54.235803 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-884c8b8f5-5szns" Dec 03 14:38:54 crc kubenswrapper[4751]: I1203 14:38:54.274526 4751 scope.go:117] "RemoveContainer" containerID="5254aef6d7a3c7ec10fed09dcced6a5e88f35fb97688379f002044ccb3d7cc69" Dec 03 14:38:54 crc kubenswrapper[4751]: I1203 14:38:54.278494 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-884c8b8f5-5szns"] Dec 03 14:38:54 crc kubenswrapper[4751]: I1203 14:38:54.300767 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-884c8b8f5-5szns"] Dec 03 14:38:55 crc kubenswrapper[4751]: I1203 14:38:55.327945 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="267c2a24-af2e-48c4-9101-fbf9bba26e67" path="/var/lib/kubelet/pods/267c2a24-af2e-48c4-9101-fbf9bba26e67/volumes" Dec 03 14:38:56 crc kubenswrapper[4751]: I1203 14:38:56.261632 4751 generic.go:334] "Generic (PLEG): container finished" podID="e24e1b79-1ed2-423b-a93b-dee54c1d6b80" containerID="426def791113e7f82f15c9cbf047c7732046965a4b51fe192603a2521bfc9503" exitCode=0 Dec 03 14:38:56 crc kubenswrapper[4751]: I1203 14:38:56.262045 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e24e1b79-1ed2-423b-a93b-dee54c1d6b80","Type":"ContainerDied","Data":"426def791113e7f82f15c9cbf047c7732046965a4b51fe192603a2521bfc9503"} Dec 03 14:38:56 crc kubenswrapper[4751]: I1203 14:38:56.262087 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e24e1b79-1ed2-423b-a93b-dee54c1d6b80","Type":"ContainerDied","Data":"99e4d3913c037deea5003d280154aa9f706b6d1f8b388efccab9a2a350631a81"} Dec 03 14:38:56 crc kubenswrapper[4751]: I1203 14:38:56.262110 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99e4d3913c037deea5003d280154aa9f706b6d1f8b388efccab9a2a350631a81" Dec 03 14:38:56 crc kubenswrapper[4751]: I1203 14:38:56.289771 4751 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:38:56 crc kubenswrapper[4751]: I1203 14:38:56.386519 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-log-httpd\") pod \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\" (UID: \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\") " Dec 03 14:38:56 crc kubenswrapper[4751]: I1203 14:38:56.386618 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-config-data\") pod \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\" (UID: \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\") " Dec 03 14:38:56 crc kubenswrapper[4751]: I1203 14:38:56.386690 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-combined-ca-bundle\") pod \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\" (UID: \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\") " Dec 03 14:38:56 crc kubenswrapper[4751]: I1203 14:38:56.386720 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-sg-core-conf-yaml\") pod \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\" (UID: \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\") " Dec 03 14:38:56 crc kubenswrapper[4751]: I1203 14:38:56.386759 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-run-httpd\") pod \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\" (UID: \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\") " Dec 03 14:38:56 crc kubenswrapper[4751]: I1203 14:38:56.386830 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-scripts\") pod \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\" (UID: \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\") " Dec 03 14:38:56 crc kubenswrapper[4751]: I1203 14:38:56.386913 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cklw6\" (UniqueName: \"kubernetes.io/projected/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-kube-api-access-cklw6\") pod \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\" (UID: \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\") " Dec 03 14:38:56 crc kubenswrapper[4751]: I1203 14:38:56.386976 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-ceilometer-tls-certs\") pod \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\" (UID: \"e24e1b79-1ed2-423b-a93b-dee54c1d6b80\") " Dec 03 14:38:56 crc kubenswrapper[4751]: I1203 14:38:56.388237 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e24e1b79-1ed2-423b-a93b-dee54c1d6b80" (UID: "e24e1b79-1ed2-423b-a93b-dee54c1d6b80"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:38:56 crc kubenswrapper[4751]: I1203 14:38:56.388383 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e24e1b79-1ed2-423b-a93b-dee54c1d6b80" (UID: "e24e1b79-1ed2-423b-a93b-dee54c1d6b80"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:38:56 crc kubenswrapper[4751]: I1203 14:38:56.395438 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-kube-api-access-cklw6" (OuterVolumeSpecName: "kube-api-access-cklw6") pod "e24e1b79-1ed2-423b-a93b-dee54c1d6b80" (UID: "e24e1b79-1ed2-423b-a93b-dee54c1d6b80"). InnerVolumeSpecName "kube-api-access-cklw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:38:56 crc kubenswrapper[4751]: I1203 14:38:56.413027 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-scripts" (OuterVolumeSpecName: "scripts") pod "e24e1b79-1ed2-423b-a93b-dee54c1d6b80" (UID: "e24e1b79-1ed2-423b-a93b-dee54c1d6b80"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:56 crc kubenswrapper[4751]: I1203 14:38:56.435680 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e24e1b79-1ed2-423b-a93b-dee54c1d6b80" (UID: "e24e1b79-1ed2-423b-a93b-dee54c1d6b80"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:56 crc kubenswrapper[4751]: I1203 14:38:56.464900 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e24e1b79-1ed2-423b-a93b-dee54c1d6b80" (UID: "e24e1b79-1ed2-423b-a93b-dee54c1d6b80"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:56 crc kubenswrapper[4751]: I1203 14:38:56.499341 4751 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:56 crc kubenswrapper[4751]: I1203 14:38:56.499421 4751 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:56 crc kubenswrapper[4751]: I1203 14:38:56.499438 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:56 crc kubenswrapper[4751]: I1203 14:38:56.499450 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cklw6\" (UniqueName: \"kubernetes.io/projected/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-kube-api-access-cklw6\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:56 crc kubenswrapper[4751]: I1203 14:38:56.499481 4751 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:56 crc kubenswrapper[4751]: I1203 14:38:56.499491 4751 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:56 crc kubenswrapper[4751]: I1203 14:38:56.528631 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e24e1b79-1ed2-423b-a93b-dee54c1d6b80" (UID: 
"e24e1b79-1ed2-423b-a93b-dee54c1d6b80"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:56 crc kubenswrapper[4751]: I1203 14:38:56.541216 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-config-data" (OuterVolumeSpecName: "config-data") pod "e24e1b79-1ed2-423b-a93b-dee54c1d6b80" (UID: "e24e1b79-1ed2-423b-a93b-dee54c1d6b80"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:56 crc kubenswrapper[4751]: I1203 14:38:56.601290 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:56 crc kubenswrapper[4751]: I1203 14:38:56.601350 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e24e1b79-1ed2-423b-a93b-dee54c1d6b80-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.271579 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.329779 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.330260 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.341343 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:38:57 crc kubenswrapper[4751]: E1203 14:38:57.341877 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e24e1b79-1ed2-423b-a93b-dee54c1d6b80" containerName="ceilometer-notification-agent" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.341901 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="e24e1b79-1ed2-423b-a93b-dee54c1d6b80" containerName="ceilometer-notification-agent" Dec 03 14:38:57 crc kubenswrapper[4751]: E1203 14:38:57.341921 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="267c2a24-af2e-48c4-9101-fbf9bba26e67" containerName="dnsmasq-dns" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.341929 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="267c2a24-af2e-48c4-9101-fbf9bba26e67" containerName="dnsmasq-dns" Dec 03 14:38:57 crc kubenswrapper[4751]: E1203 14:38:57.341958 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="267c2a24-af2e-48c4-9101-fbf9bba26e67" containerName="init" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.341967 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="267c2a24-af2e-48c4-9101-fbf9bba26e67" containerName="init" Dec 03 14:38:57 crc kubenswrapper[4751]: E1203 14:38:57.341982 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e24e1b79-1ed2-423b-a93b-dee54c1d6b80" containerName="proxy-httpd" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.341990 4751 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e24e1b79-1ed2-423b-a93b-dee54c1d6b80" containerName="proxy-httpd" Dec 03 14:38:57 crc kubenswrapper[4751]: E1203 14:38:57.342005 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e24e1b79-1ed2-423b-a93b-dee54c1d6b80" containerName="ceilometer-central-agent" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.342012 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="e24e1b79-1ed2-423b-a93b-dee54c1d6b80" containerName="ceilometer-central-agent" Dec 03 14:38:57 crc kubenswrapper[4751]: E1203 14:38:57.342028 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e24e1b79-1ed2-423b-a93b-dee54c1d6b80" containerName="sg-core" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.342036 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="e24e1b79-1ed2-423b-a93b-dee54c1d6b80" containerName="sg-core" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.342320 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="267c2a24-af2e-48c4-9101-fbf9bba26e67" containerName="dnsmasq-dns" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.342427 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="e24e1b79-1ed2-423b-a93b-dee54c1d6b80" containerName="sg-core" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.342442 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="e24e1b79-1ed2-423b-a93b-dee54c1d6b80" containerName="ceilometer-notification-agent" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.342468 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="e24e1b79-1ed2-423b-a93b-dee54c1d6b80" containerName="proxy-httpd" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.342482 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="e24e1b79-1ed2-423b-a93b-dee54c1d6b80" containerName="ceilometer-central-agent" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.344474 4751 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.346501 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.346610 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.355720 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.361181 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.420802 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\") " pod="openstack/ceilometer-0" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.420891 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-run-httpd\") pod \"ceilometer-0\" (UID: \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\") " pod="openstack/ceilometer-0" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.420949 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\") " pod="openstack/ceilometer-0" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.420981 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-log-httpd\") pod \"ceilometer-0\" (UID: \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\") " pod="openstack/ceilometer-0" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.421166 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-scripts\") pod \"ceilometer-0\" (UID: \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\") " pod="openstack/ceilometer-0" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.421277 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-config-data\") pod \"ceilometer-0\" (UID: \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\") " pod="openstack/ceilometer-0" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.421732 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\") " pod="openstack/ceilometer-0" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.421958 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqmhl\" (UniqueName: \"kubernetes.io/projected/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-kube-api-access-cqmhl\") pod \"ceilometer-0\" (UID: \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\") " pod="openstack/ceilometer-0" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.524811 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-scripts\") pod \"ceilometer-0\" (UID: 
\"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\") " pod="openstack/ceilometer-0" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.524884 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-config-data\") pod \"ceilometer-0\" (UID: \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\") " pod="openstack/ceilometer-0" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.524987 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\") " pod="openstack/ceilometer-0" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.525037 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqmhl\" (UniqueName: \"kubernetes.io/projected/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-kube-api-access-cqmhl\") pod \"ceilometer-0\" (UID: \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\") " pod="openstack/ceilometer-0" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.525088 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\") " pod="openstack/ceilometer-0" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.525119 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-run-httpd\") pod \"ceilometer-0\" (UID: \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\") " pod="openstack/ceilometer-0" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.525161 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\") " pod="openstack/ceilometer-0" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.525187 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-log-httpd\") pod \"ceilometer-0\" (UID: \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\") " pod="openstack/ceilometer-0" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.525804 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-run-httpd\") pod \"ceilometer-0\" (UID: \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\") " pod="openstack/ceilometer-0" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.525831 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-log-httpd\") pod \"ceilometer-0\" (UID: \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\") " pod="openstack/ceilometer-0" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.528540 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\") " pod="openstack/ceilometer-0" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.530536 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\") " pod="openstack/ceilometer-0" Dec 03 14:38:57 crc 
kubenswrapper[4751]: I1203 14:38:57.530894 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-scripts\") pod \"ceilometer-0\" (UID: \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\") " pod="openstack/ceilometer-0" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.531287 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-config-data\") pod \"ceilometer-0\" (UID: \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\") " pod="openstack/ceilometer-0" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.539117 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\") " pod="openstack/ceilometer-0" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.542673 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqmhl\" (UniqueName: \"kubernetes.io/projected/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-kube-api-access-cqmhl\") pod \"ceilometer-0\" (UID: \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\") " pod="openstack/ceilometer-0" Dec 03 14:38:57 crc kubenswrapper[4751]: I1203 14:38:57.665818 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:38:58 crc kubenswrapper[4751]: W1203 14:38:58.133303 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65a4cf75_e487_4f9b_989c_d6a7d6f0de30.slice/crio-16fd6e4a15234565c49006188c700e9d4a47e8982406a38eacd94b047c7a0a86 WatchSource:0}: Error finding container 16fd6e4a15234565c49006188c700e9d4a47e8982406a38eacd94b047c7a0a86: Status 404 returned error can't find the container with id 16fd6e4a15234565c49006188c700e9d4a47e8982406a38eacd94b047c7a0a86 Dec 03 14:38:58 crc kubenswrapper[4751]: I1203 14:38:58.134366 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:38:58 crc kubenswrapper[4751]: I1203 14:38:58.285263 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65a4cf75-e487-4f9b-989c-d6a7d6f0de30","Type":"ContainerStarted","Data":"16fd6e4a15234565c49006188c700e9d4a47e8982406a38eacd94b047c7a0a86"} Dec 03 14:38:58 crc kubenswrapper[4751]: I1203 14:38:58.287299 4751 generic.go:334] "Generic (PLEG): container finished" podID="12c1f3f2-32ce-4652-8962-99c0d111d953" containerID="450cb5444d1d441152daf714ab2046fad875605c51ef348e83b3599b8c6f3dc4" exitCode=0 Dec 03 14:38:58 crc kubenswrapper[4751]: I1203 14:38:58.287374 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fpjws" event={"ID":"12c1f3f2-32ce-4652-8962-99c0d111d953","Type":"ContainerDied","Data":"450cb5444d1d441152daf714ab2046fad875605c51ef348e83b3599b8c6f3dc4"} Dec 03 14:38:59 crc kubenswrapper[4751]: I1203 14:38:59.312285 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65a4cf75-e487-4f9b-989c-d6a7d6f0de30","Type":"ContainerStarted","Data":"726ff0394ceadcdd06af4efc79f4a8665066f986754e42516e62c9211c9077a6"} Dec 03 14:38:59 crc kubenswrapper[4751]: I1203 14:38:59.330838 4751 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e24e1b79-1ed2-423b-a93b-dee54c1d6b80" path="/var/lib/kubelet/pods/e24e1b79-1ed2-423b-a93b-dee54c1d6b80/volumes" Dec 03 14:38:59 crc kubenswrapper[4751]: I1203 14:38:59.830060 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fpjws" Dec 03 14:38:59 crc kubenswrapper[4751]: I1203 14:38:59.893554 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12c1f3f2-32ce-4652-8962-99c0d111d953-config-data\") pod \"12c1f3f2-32ce-4652-8962-99c0d111d953\" (UID: \"12c1f3f2-32ce-4652-8962-99c0d111d953\") " Dec 03 14:38:59 crc kubenswrapper[4751]: I1203 14:38:59.893684 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12c1f3f2-32ce-4652-8962-99c0d111d953-scripts\") pod \"12c1f3f2-32ce-4652-8962-99c0d111d953\" (UID: \"12c1f3f2-32ce-4652-8962-99c0d111d953\") " Dec 03 14:38:59 crc kubenswrapper[4751]: I1203 14:38:59.893734 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12c1f3f2-32ce-4652-8962-99c0d111d953-combined-ca-bundle\") pod \"12c1f3f2-32ce-4652-8962-99c0d111d953\" (UID: \"12c1f3f2-32ce-4652-8962-99c0d111d953\") " Dec 03 14:38:59 crc kubenswrapper[4751]: I1203 14:38:59.893812 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfj2t\" (UniqueName: \"kubernetes.io/projected/12c1f3f2-32ce-4652-8962-99c0d111d953-kube-api-access-nfj2t\") pod \"12c1f3f2-32ce-4652-8962-99c0d111d953\" (UID: \"12c1f3f2-32ce-4652-8962-99c0d111d953\") " Dec 03 14:38:59 crc kubenswrapper[4751]: I1203 14:38:59.899253 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12c1f3f2-32ce-4652-8962-99c0d111d953-scripts" 
(OuterVolumeSpecName: "scripts") pod "12c1f3f2-32ce-4652-8962-99c0d111d953" (UID: "12c1f3f2-32ce-4652-8962-99c0d111d953"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:59 crc kubenswrapper[4751]: I1203 14:38:59.908747 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12c1f3f2-32ce-4652-8962-99c0d111d953-kube-api-access-nfj2t" (OuterVolumeSpecName: "kube-api-access-nfj2t") pod "12c1f3f2-32ce-4652-8962-99c0d111d953" (UID: "12c1f3f2-32ce-4652-8962-99c0d111d953"). InnerVolumeSpecName "kube-api-access-nfj2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:38:59 crc kubenswrapper[4751]: I1203 14:38:59.925490 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12c1f3f2-32ce-4652-8962-99c0d111d953-config-data" (OuterVolumeSpecName: "config-data") pod "12c1f3f2-32ce-4652-8962-99c0d111d953" (UID: "12c1f3f2-32ce-4652-8962-99c0d111d953"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:59 crc kubenswrapper[4751]: I1203 14:38:59.933603 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12c1f3f2-32ce-4652-8962-99c0d111d953-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12c1f3f2-32ce-4652-8962-99c0d111d953" (UID: "12c1f3f2-32ce-4652-8962-99c0d111d953"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:59 crc kubenswrapper[4751]: I1203 14:38:59.997045 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfj2t\" (UniqueName: \"kubernetes.io/projected/12c1f3f2-32ce-4652-8962-99c0d111d953-kube-api-access-nfj2t\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:59 crc kubenswrapper[4751]: I1203 14:38:59.997085 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12c1f3f2-32ce-4652-8962-99c0d111d953-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:59 crc kubenswrapper[4751]: I1203 14:38:59.997098 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12c1f3f2-32ce-4652-8962-99c0d111d953-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:59 crc kubenswrapper[4751]: I1203 14:38:59.997108 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12c1f3f2-32ce-4652-8962-99c0d111d953-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:39:00 crc kubenswrapper[4751]: I1203 14:39:00.358883 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fpjws" Dec 03 14:39:00 crc kubenswrapper[4751]: I1203 14:39:00.358880 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fpjws" event={"ID":"12c1f3f2-32ce-4652-8962-99c0d111d953","Type":"ContainerDied","Data":"9ad73af35d64915ac0de5cffbb38be2e7d6a01f7bceb11ebff5afc556da6c427"} Dec 03 14:39:00 crc kubenswrapper[4751]: I1203 14:39:00.359066 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ad73af35d64915ac0de5cffbb38be2e7d6a01f7bceb11ebff5afc556da6c427" Dec 03 14:39:00 crc kubenswrapper[4751]: I1203 14:39:00.365008 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65a4cf75-e487-4f9b-989c-d6a7d6f0de30","Type":"ContainerStarted","Data":"dd824341d02ef5aabaf33619a77ee4ccd0677628a1d37b03f92e85328b516228"} Dec 03 14:39:00 crc kubenswrapper[4751]: I1203 14:39:00.365073 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65a4cf75-e487-4f9b-989c-d6a7d6f0de30","Type":"ContainerStarted","Data":"6ee958e311e919d9575be0a8203c2a9edd82b4dcc13b156a4075163e67a62f31"} Dec 03 14:39:00 crc kubenswrapper[4751]: I1203 14:39:00.520538 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 14:39:00 crc kubenswrapper[4751]: I1203 14:39:00.521253 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="aba7b141-640d-4504-bf82-90d304479537" containerName="nova-api-log" containerID="cri-o://1cbc5a93eb80f7a9da58de63b22264b91f7544620affc2f92d7fd3c154f9b5f2" gracePeriod=30 Dec 03 14:39:00 crc kubenswrapper[4751]: I1203 14:39:00.521366 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="aba7b141-640d-4504-bf82-90d304479537" containerName="nova-api-api" 
containerID="cri-o://741bbf19c3092902ec5da2466342d3ee77cff6ec99b9d91da3be06bb213f6436" gracePeriod=30 Dec 03 14:39:00 crc kubenswrapper[4751]: I1203 14:39:00.536587 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 14:39:00 crc kubenswrapper[4751]: I1203 14:39:00.536937 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="3451b19b-19d8-4a15-8bd7-920525f3335f" containerName="nova-scheduler-scheduler" containerID="cri-o://3a704fbee886f7b3a6aaddbec352b42e09fb12bb46e20a03e223b223bf68c150" gracePeriod=30 Dec 03 14:39:00 crc kubenswrapper[4751]: I1203 14:39:00.558081 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 14:39:00 crc kubenswrapper[4751]: I1203 14:39:00.558382 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0a0018be-7a12-4006-ae3c-0d5b60837a95" containerName="nova-metadata-log" containerID="cri-o://28ceaa88466c0c30df0cce89a61465f5879b57437200204b85f7092668787149" gracePeriod=30 Dec 03 14:39:00 crc kubenswrapper[4751]: I1203 14:39:00.558540 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0a0018be-7a12-4006-ae3c-0d5b60837a95" containerName="nova-metadata-metadata" containerID="cri-o://cadfa921d66bd20ee697ec890d6a9c90ade6f816a31c6012b9ec7a167cd2fb29" gracePeriod=30 Dec 03 14:39:00 crc kubenswrapper[4751]: E1203 14:39:00.744238 4751 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12c1f3f2_32ce_4652_8962_99c0d111d953.slice\": RecentStats: unable to find data in memory cache]" Dec 03 14:39:01 crc kubenswrapper[4751]: I1203 14:39:01.378554 4751 generic.go:334] "Generic (PLEG): container finished" podID="aba7b141-640d-4504-bf82-90d304479537" 
containerID="741bbf19c3092902ec5da2466342d3ee77cff6ec99b9d91da3be06bb213f6436" exitCode=0 Dec 03 14:39:01 crc kubenswrapper[4751]: I1203 14:39:01.379558 4751 generic.go:334] "Generic (PLEG): container finished" podID="aba7b141-640d-4504-bf82-90d304479537" containerID="1cbc5a93eb80f7a9da58de63b22264b91f7544620affc2f92d7fd3c154f9b5f2" exitCode=143 Dec 03 14:39:01 crc kubenswrapper[4751]: I1203 14:39:01.379656 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aba7b141-640d-4504-bf82-90d304479537","Type":"ContainerDied","Data":"741bbf19c3092902ec5da2466342d3ee77cff6ec99b9d91da3be06bb213f6436"} Dec 03 14:39:01 crc kubenswrapper[4751]: I1203 14:39:01.379729 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aba7b141-640d-4504-bf82-90d304479537","Type":"ContainerDied","Data":"1cbc5a93eb80f7a9da58de63b22264b91f7544620affc2f92d7fd3c154f9b5f2"} Dec 03 14:39:01 crc kubenswrapper[4751]: I1203 14:39:01.382571 4751 generic.go:334] "Generic (PLEG): container finished" podID="0a0018be-7a12-4006-ae3c-0d5b60837a95" containerID="28ceaa88466c0c30df0cce89a61465f5879b57437200204b85f7092668787149" exitCode=143 Dec 03 14:39:01 crc kubenswrapper[4751]: I1203 14:39:01.382610 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0a0018be-7a12-4006-ae3c-0d5b60837a95","Type":"ContainerDied","Data":"28ceaa88466c0c30df0cce89a61465f5879b57437200204b85f7092668787149"} Dec 03 14:39:01 crc kubenswrapper[4751]: I1203 14:39:01.950235 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.044234 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bw4jw\" (UniqueName: \"kubernetes.io/projected/aba7b141-640d-4504-bf82-90d304479537-kube-api-access-bw4jw\") pod \"aba7b141-640d-4504-bf82-90d304479537\" (UID: \"aba7b141-640d-4504-bf82-90d304479537\") " Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.044346 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aba7b141-640d-4504-bf82-90d304479537-combined-ca-bundle\") pod \"aba7b141-640d-4504-bf82-90d304479537\" (UID: \"aba7b141-640d-4504-bf82-90d304479537\") " Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.044378 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aba7b141-640d-4504-bf82-90d304479537-logs\") pod \"aba7b141-640d-4504-bf82-90d304479537\" (UID: \"aba7b141-640d-4504-bf82-90d304479537\") " Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.044442 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aba7b141-640d-4504-bf82-90d304479537-internal-tls-certs\") pod \"aba7b141-640d-4504-bf82-90d304479537\" (UID: \"aba7b141-640d-4504-bf82-90d304479537\") " Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.044572 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aba7b141-640d-4504-bf82-90d304479537-config-data\") pod \"aba7b141-640d-4504-bf82-90d304479537\" (UID: \"aba7b141-640d-4504-bf82-90d304479537\") " Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.044749 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/aba7b141-640d-4504-bf82-90d304479537-public-tls-certs\") pod \"aba7b141-640d-4504-bf82-90d304479537\" (UID: \"aba7b141-640d-4504-bf82-90d304479537\") " Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.045293 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aba7b141-640d-4504-bf82-90d304479537-logs" (OuterVolumeSpecName: "logs") pod "aba7b141-640d-4504-bf82-90d304479537" (UID: "aba7b141-640d-4504-bf82-90d304479537"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.049433 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aba7b141-640d-4504-bf82-90d304479537-kube-api-access-bw4jw" (OuterVolumeSpecName: "kube-api-access-bw4jw") pod "aba7b141-640d-4504-bf82-90d304479537" (UID: "aba7b141-640d-4504-bf82-90d304479537"). InnerVolumeSpecName "kube-api-access-bw4jw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.076760 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aba7b141-640d-4504-bf82-90d304479537-config-data" (OuterVolumeSpecName: "config-data") pod "aba7b141-640d-4504-bf82-90d304479537" (UID: "aba7b141-640d-4504-bf82-90d304479537"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.081427 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aba7b141-640d-4504-bf82-90d304479537-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aba7b141-640d-4504-bf82-90d304479537" (UID: "aba7b141-640d-4504-bf82-90d304479537"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.103090 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aba7b141-640d-4504-bf82-90d304479537-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "aba7b141-640d-4504-bf82-90d304479537" (UID: "aba7b141-640d-4504-bf82-90d304479537"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.103845 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aba7b141-640d-4504-bf82-90d304479537-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "aba7b141-640d-4504-bf82-90d304479537" (UID: "aba7b141-640d-4504-bf82-90d304479537"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.146987 4751 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aba7b141-640d-4504-bf82-90d304479537-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.147025 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bw4jw\" (UniqueName: \"kubernetes.io/projected/aba7b141-640d-4504-bf82-90d304479537-kube-api-access-bw4jw\") on node \"crc\" DevicePath \"\"" Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.147038 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aba7b141-640d-4504-bf82-90d304479537-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.147047 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aba7b141-640d-4504-bf82-90d304479537-logs\") on node \"crc\" 
DevicePath \"\""
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.147058 4751 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aba7b141-640d-4504-bf82-90d304479537-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.147067 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aba7b141-640d-4504-bf82-90d304479537-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.394945 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65a4cf75-e487-4f9b-989c-d6a7d6f0de30","Type":"ContainerStarted","Data":"e755eca85429f83c1622e6577de9b3998331f07b78b14d6c316ea880040973e7"}
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.395126 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.397787 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aba7b141-640d-4504-bf82-90d304479537","Type":"ContainerDied","Data":"e1ee653e4568461cd4d08901161a835a2733d90d5d427bdb37c3b601200e2253"}
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.397851 4751 scope.go:117] "RemoveContainer" containerID="741bbf19c3092902ec5da2466342d3ee77cff6ec99b9d91da3be06bb213f6436"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.397863 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.428395 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.101120167 podStartE2EDuration="5.428367486s" podCreationTimestamp="2025-12-03 14:38:57 +0000 UTC" firstStartedPulling="2025-12-03 14:38:58.137805486 +0000 UTC m=+1545.126160703" lastFinishedPulling="2025-12-03 14:39:01.465052805 +0000 UTC m=+1548.453408022" observedRunningTime="2025-12-03 14:39:02.421172915 +0000 UTC m=+1549.409528142" watchObservedRunningTime="2025-12-03 14:39:02.428367486 +0000 UTC m=+1549.416722713"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.434827 4751 scope.go:117] "RemoveContainer" containerID="1cbc5a93eb80f7a9da58de63b22264b91f7544620affc2f92d7fd3c154f9b5f2"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.465102 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.477760 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.516394 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 03 14:39:02 crc kubenswrapper[4751]: E1203 14:39:02.516923 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aba7b141-640d-4504-bf82-90d304479537" containerName="nova-api-api"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.516946 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="aba7b141-640d-4504-bf82-90d304479537" containerName="nova-api-api"
Dec 03 14:39:02 crc kubenswrapper[4751]: E1203 14:39:02.516976 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12c1f3f2-32ce-4652-8962-99c0d111d953" containerName="nova-manage"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.516983 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="12c1f3f2-32ce-4652-8962-99c0d111d953" containerName="nova-manage"
Dec 03 14:39:02 crc kubenswrapper[4751]: E1203 14:39:02.517016 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aba7b141-640d-4504-bf82-90d304479537" containerName="nova-api-log"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.517024 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="aba7b141-640d-4504-bf82-90d304479537" containerName="nova-api-log"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.517226 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="aba7b141-640d-4504-bf82-90d304479537" containerName="nova-api-api"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.517247 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="12c1f3f2-32ce-4652-8962-99c0d111d953" containerName="nova-manage"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.517266 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="aba7b141-640d-4504-bf82-90d304479537" containerName="nova-api-log"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.518637 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.520405 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.521676 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.526585 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.554264 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzwx4\" (UniqueName: \"kubernetes.io/projected/d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d-kube-api-access-vzwx4\") pod \"nova-api-0\" (UID: \"d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d\") " pod="openstack/nova-api-0"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.554356 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d-config-data\") pod \"nova-api-0\" (UID: \"d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d\") " pod="openstack/nova-api-0"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.554459 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d\") " pod="openstack/nova-api-0"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.554477 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d\") " pod="openstack/nova-api-0"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.554515 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d-logs\") pod \"nova-api-0\" (UID: \"d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d\") " pod="openstack/nova-api-0"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.554535 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d-public-tls-certs\") pod \"nova-api-0\" (UID: \"d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d\") " pod="openstack/nova-api-0"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.557645 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.658809 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d\") " pod="openstack/nova-api-0"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.658897 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d\") " pod="openstack/nova-api-0"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.659111 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d-logs\") pod \"nova-api-0\" (UID: \"d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d\") " pod="openstack/nova-api-0"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.659198 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d-public-tls-certs\") pod \"nova-api-0\" (UID: \"d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d\") " pod="openstack/nova-api-0"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.660089 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzwx4\" (UniqueName: \"kubernetes.io/projected/d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d-kube-api-access-vzwx4\") pod \"nova-api-0\" (UID: \"d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d\") " pod="openstack/nova-api-0"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.660175 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d-config-data\") pod \"nova-api-0\" (UID: \"d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d\") " pod="openstack/nova-api-0"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.660239 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d-logs\") pod \"nova-api-0\" (UID: \"d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d\") " pod="openstack/nova-api-0"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.664480 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d-config-data\") pod \"nova-api-0\" (UID: \"d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d\") " pod="openstack/nova-api-0"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.664530 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d\") " pod="openstack/nova-api-0"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.665114 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d-public-tls-certs\") pod \"nova-api-0\" (UID: \"d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d\") " pod="openstack/nova-api-0"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.665164 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d\") " pod="openstack/nova-api-0"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.678903 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzwx4\" (UniqueName: \"kubernetes.io/projected/d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d-kube-api-access-vzwx4\") pod \"nova-api-0\" (UID: \"d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d\") " pod="openstack/nova-api-0"
Dec 03 14:39:02 crc kubenswrapper[4751]: I1203 14:39:02.870818 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 03 14:39:03 crc kubenswrapper[4751]: I1203 14:39:03.167691 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 03 14:39:03 crc kubenswrapper[4751]: I1203 14:39:03.331387 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aba7b141-640d-4504-bf82-90d304479537" path="/var/lib/kubelet/pods/aba7b141-640d-4504-bf82-90d304479537/volumes"
Dec 03 14:39:03 crc kubenswrapper[4751]: I1203 14:39:03.421619 4751 generic.go:334] "Generic (PLEG): container finished" podID="3451b19b-19d8-4a15-8bd7-920525f3335f" containerID="3a704fbee886f7b3a6aaddbec352b42e09fb12bb46e20a03e223b223bf68c150" exitCode=0
Dec 03 14:39:03 crc kubenswrapper[4751]: I1203 14:39:03.421686 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3451b19b-19d8-4a15-8bd7-920525f3335f","Type":"ContainerDied","Data":"3a704fbee886f7b3a6aaddbec352b42e09fb12bb46e20a03e223b223bf68c150"}
Dec 03 14:39:03 crc kubenswrapper[4751]: I1203 14:39:03.424725 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d","Type":"ContainerStarted","Data":"6061c983e81d29039ea32f21b442b9da1e0ad86c2ad697a83223abcc21b31a72"}
Dec 03 14:39:03 crc kubenswrapper[4751]: I1203 14:39:03.424758 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d","Type":"ContainerStarted","Data":"b5bc55083ebbb1a0374ac7fde66f187eb9128978cb50c490ce4113056795b3c8"}
Dec 03 14:39:03 crc kubenswrapper[4751]: I1203 14:39:03.719775 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 03 14:39:03 crc kubenswrapper[4751]: I1203 14:39:03.786270 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3451b19b-19d8-4a15-8bd7-920525f3335f-combined-ca-bundle\") pod \"3451b19b-19d8-4a15-8bd7-920525f3335f\" (UID: \"3451b19b-19d8-4a15-8bd7-920525f3335f\") "
Dec 03 14:39:03 crc kubenswrapper[4751]: I1203 14:39:03.786442 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3451b19b-19d8-4a15-8bd7-920525f3335f-config-data\") pod \"3451b19b-19d8-4a15-8bd7-920525f3335f\" (UID: \"3451b19b-19d8-4a15-8bd7-920525f3335f\") "
Dec 03 14:39:03 crc kubenswrapper[4751]: I1203 14:39:03.786521 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hrwp\" (UniqueName: \"kubernetes.io/projected/3451b19b-19d8-4a15-8bd7-920525f3335f-kube-api-access-9hrwp\") pod \"3451b19b-19d8-4a15-8bd7-920525f3335f\" (UID: \"3451b19b-19d8-4a15-8bd7-920525f3335f\") "
Dec 03 14:39:03 crc kubenswrapper[4751]: I1203 14:39:03.790360 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3451b19b-19d8-4a15-8bd7-920525f3335f-kube-api-access-9hrwp" (OuterVolumeSpecName: "kube-api-access-9hrwp") pod "3451b19b-19d8-4a15-8bd7-920525f3335f" (UID: "3451b19b-19d8-4a15-8bd7-920525f3335f"). InnerVolumeSpecName "kube-api-access-9hrwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 14:39:03 crc kubenswrapper[4751]: I1203 14:39:03.819115 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3451b19b-19d8-4a15-8bd7-920525f3335f-config-data" (OuterVolumeSpecName: "config-data") pod "3451b19b-19d8-4a15-8bd7-920525f3335f" (UID: "3451b19b-19d8-4a15-8bd7-920525f3335f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:39:03 crc kubenswrapper[4751]: I1203 14:39:03.820249 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3451b19b-19d8-4a15-8bd7-920525f3335f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3451b19b-19d8-4a15-8bd7-920525f3335f" (UID: "3451b19b-19d8-4a15-8bd7-920525f3335f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:39:03 crc kubenswrapper[4751]: I1203 14:39:03.889045 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3451b19b-19d8-4a15-8bd7-920525f3335f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 14:39:03 crc kubenswrapper[4751]: I1203 14:39:03.889074 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3451b19b-19d8-4a15-8bd7-920525f3335f-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 14:39:03 crc kubenswrapper[4751]: I1203 14:39:03.889084 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hrwp\" (UniqueName: \"kubernetes.io/projected/3451b19b-19d8-4a15-8bd7-920525f3335f-kube-api-access-9hrwp\") on node \"crc\" DevicePath \"\""
Dec 03 14:39:03 crc kubenswrapper[4751]: I1203 14:39:03.940411 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="0a0018be-7a12-4006-ae3c-0d5b60837a95" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.221:8775/\": read tcp 10.217.0.2:60782->10.217.0.221:8775: read: connection reset by peer"
Dec 03 14:39:03 crc kubenswrapper[4751]: I1203 14:39:03.940444 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="0a0018be-7a12-4006-ae3c-0d5b60837a95" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.221:8775/\": read tcp 10.217.0.2:60780->10.217.0.221:8775: read: connection reset by peer"
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.439853 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d","Type":"ContainerStarted","Data":"32bf28e0f9137edf3238b382e0efa1e5e0a0619857fa3a454e485f13075b75d0"}
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.445802 4751 generic.go:334] "Generic (PLEG): container finished" podID="0a0018be-7a12-4006-ae3c-0d5b60837a95" containerID="cadfa921d66bd20ee697ec890d6a9c90ade6f816a31c6012b9ec7a167cd2fb29" exitCode=0
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.445885 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0a0018be-7a12-4006-ae3c-0d5b60837a95","Type":"ContainerDied","Data":"cadfa921d66bd20ee697ec890d6a9c90ade6f816a31c6012b9ec7a167cd2fb29"}
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.447820 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3451b19b-19d8-4a15-8bd7-920525f3335f","Type":"ContainerDied","Data":"c2b0c2aca4694db51ab09b6e38b07b9115189630864288bf12fb18916cf53ee8"}
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.447897 4751 scope.go:117] "RemoveContainer" containerID="3a704fbee886f7b3a6aaddbec352b42e09fb12bb46e20a03e223b223bf68c150"
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.448000 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.471146 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.471120697 podStartE2EDuration="2.471120697s" podCreationTimestamp="2025-12-03 14:39:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:39:04.46974973 +0000 UTC m=+1551.458104947" watchObservedRunningTime="2025-12-03 14:39:04.471120697 +0000 UTC m=+1551.459475914"
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.585180 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.595580 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.604911 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a0018be-7a12-4006-ae3c-0d5b60837a95-config-data\") pod \"0a0018be-7a12-4006-ae3c-0d5b60837a95\" (UID: \"0a0018be-7a12-4006-ae3c-0d5b60837a95\") "
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.604995 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a0018be-7a12-4006-ae3c-0d5b60837a95-logs\") pod \"0a0018be-7a12-4006-ae3c-0d5b60837a95\" (UID: \"0a0018be-7a12-4006-ae3c-0d5b60837a95\") "
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.605038 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmdc6\" (UniqueName: \"kubernetes.io/projected/0a0018be-7a12-4006-ae3c-0d5b60837a95-kube-api-access-zmdc6\") pod \"0a0018be-7a12-4006-ae3c-0d5b60837a95\" (UID: \"0a0018be-7a12-4006-ae3c-0d5b60837a95\") "
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.605259 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a0018be-7a12-4006-ae3c-0d5b60837a95-combined-ca-bundle\") pod \"0a0018be-7a12-4006-ae3c-0d5b60837a95\" (UID: \"0a0018be-7a12-4006-ae3c-0d5b60837a95\") "
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.605481 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a0018be-7a12-4006-ae3c-0d5b60837a95-logs" (OuterVolumeSpecName: "logs") pod "0a0018be-7a12-4006-ae3c-0d5b60837a95" (UID: "0a0018be-7a12-4006-ae3c-0d5b60837a95"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.605729 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a0018be-7a12-4006-ae3c-0d5b60837a95-nova-metadata-tls-certs\") pod \"0a0018be-7a12-4006-ae3c-0d5b60837a95\" (UID: \"0a0018be-7a12-4006-ae3c-0d5b60837a95\") "
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.606616 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a0018be-7a12-4006-ae3c-0d5b60837a95-logs\") on node \"crc\" DevicePath \"\""
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.611786 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a0018be-7a12-4006-ae3c-0d5b60837a95-kube-api-access-zmdc6" (OuterVolumeSpecName: "kube-api-access-zmdc6") pod "0a0018be-7a12-4006-ae3c-0d5b60837a95" (UID: "0a0018be-7a12-4006-ae3c-0d5b60837a95"). InnerVolumeSpecName "kube-api-access-zmdc6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.611939 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.638021 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Dec 03 14:39:04 crc kubenswrapper[4751]: E1203 14:39:04.638516 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3451b19b-19d8-4a15-8bd7-920525f3335f" containerName="nova-scheduler-scheduler"
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.638536 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="3451b19b-19d8-4a15-8bd7-920525f3335f" containerName="nova-scheduler-scheduler"
Dec 03 14:39:04 crc kubenswrapper[4751]: E1203 14:39:04.638550 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a0018be-7a12-4006-ae3c-0d5b60837a95" containerName="nova-metadata-metadata"
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.638557 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a0018be-7a12-4006-ae3c-0d5b60837a95" containerName="nova-metadata-metadata"
Dec 03 14:39:04 crc kubenswrapper[4751]: E1203 14:39:04.638570 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a0018be-7a12-4006-ae3c-0d5b60837a95" containerName="nova-metadata-log"
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.638576 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a0018be-7a12-4006-ae3c-0d5b60837a95" containerName="nova-metadata-log"
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.638755 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a0018be-7a12-4006-ae3c-0d5b60837a95" containerName="nova-metadata-metadata"
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.638773 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a0018be-7a12-4006-ae3c-0d5b60837a95" containerName="nova-metadata-log"
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.638789 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="3451b19b-19d8-4a15-8bd7-920525f3335f" containerName="nova-scheduler-scheduler"
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.639647 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.646744 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.653668 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a0018be-7a12-4006-ae3c-0d5b60837a95-config-data" (OuterVolumeSpecName: "config-data") pod "0a0018be-7a12-4006-ae3c-0d5b60837a95" (UID: "0a0018be-7a12-4006-ae3c-0d5b60837a95"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.661169 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.697525 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a0018be-7a12-4006-ae3c-0d5b60837a95-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "0a0018be-7a12-4006-ae3c-0d5b60837a95" (UID: "0a0018be-7a12-4006-ae3c-0d5b60837a95"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.704729 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a0018be-7a12-4006-ae3c-0d5b60837a95-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a0018be-7a12-4006-ae3c-0d5b60837a95" (UID: "0a0018be-7a12-4006-ae3c-0d5b60837a95"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.707922 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbe3db84-a6ac-4b03-999b-1d2663641afa-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dbe3db84-a6ac-4b03-999b-1d2663641afa\") " pod="openstack/nova-scheduler-0"
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.708080 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbe3db84-a6ac-4b03-999b-1d2663641afa-config-data\") pod \"nova-scheduler-0\" (UID: \"dbe3db84-a6ac-4b03-999b-1d2663641afa\") " pod="openstack/nova-scheduler-0"
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.708230 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbhn4\" (UniqueName: \"kubernetes.io/projected/dbe3db84-a6ac-4b03-999b-1d2663641afa-kube-api-access-dbhn4\") pod \"nova-scheduler-0\" (UID: \"dbe3db84-a6ac-4b03-999b-1d2663641afa\") " pod="openstack/nova-scheduler-0"
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.708590 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmdc6\" (UniqueName: \"kubernetes.io/projected/0a0018be-7a12-4006-ae3c-0d5b60837a95-kube-api-access-zmdc6\") on node \"crc\" DevicePath \"\""
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.708676 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a0018be-7a12-4006-ae3c-0d5b60837a95-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.708751 4751 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a0018be-7a12-4006-ae3c-0d5b60837a95-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.708826 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a0018be-7a12-4006-ae3c-0d5b60837a95-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.813274 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbe3db84-a6ac-4b03-999b-1d2663641afa-config-data\") pod \"nova-scheduler-0\" (UID: \"dbe3db84-a6ac-4b03-999b-1d2663641afa\") " pod="openstack/nova-scheduler-0"
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.813487 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbhn4\" (UniqueName: \"kubernetes.io/projected/dbe3db84-a6ac-4b03-999b-1d2663641afa-kube-api-access-dbhn4\") pod \"nova-scheduler-0\" (UID: \"dbe3db84-a6ac-4b03-999b-1d2663641afa\") " pod="openstack/nova-scheduler-0"
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.813642 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbe3db84-a6ac-4b03-999b-1d2663641afa-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dbe3db84-a6ac-4b03-999b-1d2663641afa\") " pod="openstack/nova-scheduler-0"
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.819295 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbe3db84-a6ac-4b03-999b-1d2663641afa-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dbe3db84-a6ac-4b03-999b-1d2663641afa\") " pod="openstack/nova-scheduler-0"
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.822369 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbe3db84-a6ac-4b03-999b-1d2663641afa-config-data\") pod \"nova-scheduler-0\" (UID: \"dbe3db84-a6ac-4b03-999b-1d2663641afa\") " pod="openstack/nova-scheduler-0"
Dec 03 14:39:04 crc kubenswrapper[4751]: I1203 14:39:04.838170 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbhn4\" (UniqueName: \"kubernetes.io/projected/dbe3db84-a6ac-4b03-999b-1d2663641afa-kube-api-access-dbhn4\") pod \"nova-scheduler-0\" (UID: \"dbe3db84-a6ac-4b03-999b-1d2663641afa\") " pod="openstack/nova-scheduler-0"
Dec 03 14:39:05 crc kubenswrapper[4751]: I1203 14:39:05.115947 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 03 14:39:05 crc kubenswrapper[4751]: I1203 14:39:05.331716 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3451b19b-19d8-4a15-8bd7-920525f3335f" path="/var/lib/kubelet/pods/3451b19b-19d8-4a15-8bd7-920525f3335f/volumes"
Dec 03 14:39:05 crc kubenswrapper[4751]: I1203 14:39:05.460873 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0a0018be-7a12-4006-ae3c-0d5b60837a95","Type":"ContainerDied","Data":"8a0d6f9780502bbd5df54061dc7d90e092573634bc4d7e2a0a2dad7f695bf441"}
Dec 03 14:39:05 crc kubenswrapper[4751]: I1203 14:39:05.460933 4751 scope.go:117] "RemoveContainer" containerID="cadfa921d66bd20ee697ec890d6a9c90ade6f816a31c6012b9ec7a167cd2fb29"
Dec 03 14:39:05 crc kubenswrapper[4751]: I1203 14:39:05.461106 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 03 14:39:05 crc kubenswrapper[4751]: I1203 14:39:05.498282 4751 scope.go:117] "RemoveContainer" containerID="28ceaa88466c0c30df0cce89a61465f5879b57437200204b85f7092668787149"
Dec 03 14:39:05 crc kubenswrapper[4751]: I1203 14:39:05.511082 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 03 14:39:05 crc kubenswrapper[4751]: I1203 14:39:05.521673 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Dec 03 14:39:05 crc kubenswrapper[4751]: I1203 14:39:05.550276 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 03 14:39:05 crc kubenswrapper[4751]: I1203 14:39:05.552579 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 03 14:39:05 crc kubenswrapper[4751]: I1203 14:39:05.557560 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 03 14:39:05 crc kubenswrapper[4751]: I1203 14:39:05.558202 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Dec 03 14:39:05 crc kubenswrapper[4751]: I1203 14:39:05.567400 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 03 14:39:05 crc kubenswrapper[4751]: I1203 14:39:05.615744 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 03 14:39:05 crc kubenswrapper[4751]: I1203 14:39:05.634646 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47692780-6643-491b-8d92-c181c82d4ce6-logs\") pod \"nova-metadata-0\" (UID: \"47692780-6643-491b-8d92-c181c82d4ce6\") " pod="openstack/nova-metadata-0"
Dec 03 14:39:05 crc kubenswrapper[4751]: I1203 14:39:05.634721 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw8tw\" (UniqueName: \"kubernetes.io/projected/47692780-6643-491b-8d92-c181c82d4ce6-kube-api-access-lw8tw\") pod \"nova-metadata-0\" (UID: \"47692780-6643-491b-8d92-c181c82d4ce6\") " pod="openstack/nova-metadata-0"
Dec 03 14:39:05 crc kubenswrapper[4751]: I1203 14:39:05.634817 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/47692780-6643-491b-8d92-c181c82d4ce6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"47692780-6643-491b-8d92-c181c82d4ce6\") " pod="openstack/nova-metadata-0"
Dec 03 14:39:05 crc kubenswrapper[4751]: I1203 14:39:05.634955 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47692780-6643-491b-8d92-c181c82d4ce6-config-data\") pod \"nova-metadata-0\" (UID: \"47692780-6643-491b-8d92-c181c82d4ce6\") " pod="openstack/nova-metadata-0"
Dec 03 14:39:05 crc kubenswrapper[4751]: I1203 14:39:05.634999 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47692780-6643-491b-8d92-c181c82d4ce6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"47692780-6643-491b-8d92-c181c82d4ce6\") " pod="openstack/nova-metadata-0"
Dec 03 14:39:05 crc kubenswrapper[4751]: I1203 14:39:05.736753 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47692780-6643-491b-8d92-c181c82d4ce6-config-data\") pod \"nova-metadata-0\" (UID: \"47692780-6643-491b-8d92-c181c82d4ce6\") " pod="openstack/nova-metadata-0"
Dec 03 14:39:05 crc kubenswrapper[4751]: I1203 14:39:05.736790 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47692780-6643-491b-8d92-c181c82d4ce6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"47692780-6643-491b-8d92-c181c82d4ce6\") " pod="openstack/nova-metadata-0"
Dec 03 14:39:05 crc kubenswrapper[4751]: I1203 14:39:05.736853 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47692780-6643-491b-8d92-c181c82d4ce6-logs\") pod \"nova-metadata-0\" (UID: \"47692780-6643-491b-8d92-c181c82d4ce6\") " pod="openstack/nova-metadata-0"
Dec 03 14:39:05 crc kubenswrapper[4751]: I1203 14:39:05.736878 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw8tw\" (UniqueName: \"kubernetes.io/projected/47692780-6643-491b-8d92-c181c82d4ce6-kube-api-access-lw8tw\") pod \"nova-metadata-0\" (UID: \"47692780-6643-491b-8d92-c181c82d4ce6\") " pod="openstack/nova-metadata-0"
Dec 03 14:39:05 crc kubenswrapper[4751]: I1203 14:39:05.736933 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/47692780-6643-491b-8d92-c181c82d4ce6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"47692780-6643-491b-8d92-c181c82d4ce6\") " pod="openstack/nova-metadata-0"
Dec 03 14:39:05 crc kubenswrapper[4751]: I1203 14:39:05.738023 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47692780-6643-491b-8d92-c181c82d4ce6-logs\") pod \"nova-metadata-0\" (UID: \"47692780-6643-491b-8d92-c181c82d4ce6\") " pod="openstack/nova-metadata-0"
Dec 03 14:39:05 crc kubenswrapper[4751]: I1203 14:39:05.741048 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47692780-6643-491b-8d92-c181c82d4ce6-config-data\") pod \"nova-metadata-0\" (UID: \"47692780-6643-491b-8d92-c181c82d4ce6\") " pod="openstack/nova-metadata-0"
Dec 03 14:39:05 crc kubenswrapper[4751]: I1203 14:39:05.741630 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47692780-6643-491b-8d92-c181c82d4ce6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"47692780-6643-491b-8d92-c181c82d4ce6\") " pod="openstack/nova-metadata-0"
Dec 03 14:39:05 crc kubenswrapper[4751]: I1203 14:39:05.742478 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/47692780-6643-491b-8d92-c181c82d4ce6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"47692780-6643-491b-8d92-c181c82d4ce6\") " pod="openstack/nova-metadata-0"
Dec 03 14:39:05 crc kubenswrapper[4751]: I1203 14:39:05.754431 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw8tw\" (UniqueName: \"kubernetes.io/projected/47692780-6643-491b-8d92-c181c82d4ce6-kube-api-access-lw8tw\") pod \"nova-metadata-0\" (UID: \"47692780-6643-491b-8d92-c181c82d4ce6\") " pod="openstack/nova-metadata-0"
Dec 03 14:39:05 crc kubenswrapper[4751]: I1203 14:39:05.878832 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 14:39:06 crc kubenswrapper[4751]: W1203 14:39:06.320301 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47692780_6643_491b_8d92_c181c82d4ce6.slice/crio-3373ae9f9bb21f4a433109f2cb980f0210e193a316e36f6b06a30ad4f0e0ff1c WatchSource:0}: Error finding container 3373ae9f9bb21f4a433109f2cb980f0210e193a316e36f6b06a30ad4f0e0ff1c: Status 404 returned error can't find the container with id 3373ae9f9bb21f4a433109f2cb980f0210e193a316e36f6b06a30ad4f0e0ff1c Dec 03 14:39:06 crc kubenswrapper[4751]: I1203 14:39:06.321917 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 14:39:06 crc kubenswrapper[4751]: I1203 14:39:06.492439 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dbe3db84-a6ac-4b03-999b-1d2663641afa","Type":"ContainerStarted","Data":"7439f7ca9b736f5837473a61a6f6cccb19b8ceb7bbb31a94848f18030c840983"} Dec 03 14:39:06 crc kubenswrapper[4751]: I1203 14:39:06.493349 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dbe3db84-a6ac-4b03-999b-1d2663641afa","Type":"ContainerStarted","Data":"4c4339f5bb009bb33e4d95a3d2191708b4a19b43048bdcb57fa4535531102081"} Dec 03 14:39:06 crc kubenswrapper[4751]: I1203 14:39:06.495457 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"47692780-6643-491b-8d92-c181c82d4ce6","Type":"ContainerStarted","Data":"3373ae9f9bb21f4a433109f2cb980f0210e193a316e36f6b06a30ad4f0e0ff1c"} Dec 03 14:39:06 crc kubenswrapper[4751]: I1203 14:39:06.515234 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.51520909 podStartE2EDuration="2.51520909s" podCreationTimestamp="2025-12-03 14:39:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:39:06.511974033 +0000 UTC m=+1553.500329240" watchObservedRunningTime="2025-12-03 14:39:06.51520909 +0000 UTC m=+1553.503564327" Dec 03 14:39:07 crc kubenswrapper[4751]: I1203 14:39:07.328992 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a0018be-7a12-4006-ae3c-0d5b60837a95" path="/var/lib/kubelet/pods/0a0018be-7a12-4006-ae3c-0d5b60837a95/volumes" Dec 03 14:39:07 crc kubenswrapper[4751]: I1203 14:39:07.507921 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"47692780-6643-491b-8d92-c181c82d4ce6","Type":"ContainerStarted","Data":"c798d32db4acf3d8b39f36af3049e3b2a15d5bf4c6eae059598fd25ea9136097"} Dec 03 14:39:07 crc kubenswrapper[4751]: I1203 14:39:07.508200 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"47692780-6643-491b-8d92-c181c82d4ce6","Type":"ContainerStarted","Data":"75083b269e8e33b8cf96ce4ba3d64207eb5917fd5dc5aa429a0b0bcaaebb98d2"} Dec 03 14:39:07 crc kubenswrapper[4751]: I1203 14:39:07.535922 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.535901394 podStartE2EDuration="2.535901394s" podCreationTimestamp="2025-12-03 14:39:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:39:07.523283052 +0000 UTC m=+1554.511638269" watchObservedRunningTime="2025-12-03 14:39:07.535901394 +0000 UTC m=+1554.524256611" Dec 03 14:39:10 crc kubenswrapper[4751]: I1203 14:39:10.116504 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 14:39:10 crc kubenswrapper[4751]: I1203 14:39:10.880368 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 14:39:10 crc kubenswrapper[4751]: 
I1203 14:39:10.880489 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 14:39:12 crc kubenswrapper[4751]: I1203 14:39:12.871017 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 14:39:12 crc kubenswrapper[4751]: I1203 14:39:12.871301 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 14:39:13 crc kubenswrapper[4751]: I1203 14:39:13.882593 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.232:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 14:39:13 crc kubenswrapper[4751]: I1203 14:39:13.882983 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.232:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 14:39:15 crc kubenswrapper[4751]: I1203 14:39:15.116469 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 14:39:15 crc kubenswrapper[4751]: I1203 14:39:15.153361 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 03 14:39:15 crc kubenswrapper[4751]: I1203 14:39:15.631554 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 14:39:15 crc kubenswrapper[4751]: I1203 14:39:15.880031 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 14:39:15 crc kubenswrapper[4751]: I1203 14:39:15.880297 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 14:39:16 crc kubenswrapper[4751]: I1203 14:39:16.894563 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="47692780-6643-491b-8d92-c181c82d4ce6" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.234:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 14:39:16 crc kubenswrapper[4751]: I1203 14:39:16.894563 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="47692780-6643-491b-8d92-c181c82d4ce6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.234:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 14:39:22 crc kubenswrapper[4751]: I1203 14:39:22.881037 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 14:39:22 crc kubenswrapper[4751]: I1203 14:39:22.881841 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 14:39:22 crc kubenswrapper[4751]: I1203 14:39:22.887344 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 14:39:22 crc kubenswrapper[4751]: I1203 14:39:22.888341 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 14:39:23 crc kubenswrapper[4751]: I1203 14:39:23.687472 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 14:39:23 crc kubenswrapper[4751]: I1203 14:39:23.713311 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 14:39:25 crc kubenswrapper[4751]: I1203 14:39:25.886662 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 14:39:25 crc kubenswrapper[4751]: I1203 
14:39:25.889310 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 14:39:25 crc kubenswrapper[4751]: I1203 14:39:25.894205 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 14:39:26 crc kubenswrapper[4751]: I1203 14:39:26.721950 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 14:39:27 crc kubenswrapper[4751]: I1203 14:39:27.683637 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 14:39:39 crc kubenswrapper[4751]: I1203 14:39:39.785711 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-bgs8x"] Dec 03 14:39:39 crc kubenswrapper[4751]: I1203 14:39:39.795410 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-bgs8x"] Dec 03 14:39:39 crc kubenswrapper[4751]: I1203 14:39:39.899780 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-m2bmn"] Dec 03 14:39:39 crc kubenswrapper[4751]: I1203 14:39:39.901110 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-m2bmn" Dec 03 14:39:39 crc kubenswrapper[4751]: I1203 14:39:39.904638 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 14:39:39 crc kubenswrapper[4751]: I1203 14:39:39.913088 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-m2bmn"] Dec 03 14:39:39 crc kubenswrapper[4751]: I1203 14:39:39.984684 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znnqk\" (UniqueName: \"kubernetes.io/projected/acdd8764-947f-44ce-a5bd-4f3c139d581c-kube-api-access-znnqk\") pod \"cloudkitty-db-sync-m2bmn\" (UID: \"acdd8764-947f-44ce-a5bd-4f3c139d581c\") " pod="openstack/cloudkitty-db-sync-m2bmn" Dec 03 14:39:39 crc kubenswrapper[4751]: I1203 14:39:39.984737 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acdd8764-947f-44ce-a5bd-4f3c139d581c-config-data\") pod \"cloudkitty-db-sync-m2bmn\" (UID: \"acdd8764-947f-44ce-a5bd-4f3c139d581c\") " pod="openstack/cloudkitty-db-sync-m2bmn" Dec 03 14:39:39 crc kubenswrapper[4751]: I1203 14:39:39.984763 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acdd8764-947f-44ce-a5bd-4f3c139d581c-scripts\") pod \"cloudkitty-db-sync-m2bmn\" (UID: \"acdd8764-947f-44ce-a5bd-4f3c139d581c\") " pod="openstack/cloudkitty-db-sync-m2bmn" Dec 03 14:39:39 crc kubenswrapper[4751]: I1203 14:39:39.984806 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acdd8764-947f-44ce-a5bd-4f3c139d581c-combined-ca-bundle\") pod \"cloudkitty-db-sync-m2bmn\" (UID: \"acdd8764-947f-44ce-a5bd-4f3c139d581c\") " pod="openstack/cloudkitty-db-sync-m2bmn" Dec 03 14:39:39 crc 
kubenswrapper[4751]: I1203 14:39:39.985149 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/acdd8764-947f-44ce-a5bd-4f3c139d581c-certs\") pod \"cloudkitty-db-sync-m2bmn\" (UID: \"acdd8764-947f-44ce-a5bd-4f3c139d581c\") " pod="openstack/cloudkitty-db-sync-m2bmn" Dec 03 14:39:40 crc kubenswrapper[4751]: I1203 14:39:40.087536 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/acdd8764-947f-44ce-a5bd-4f3c139d581c-certs\") pod \"cloudkitty-db-sync-m2bmn\" (UID: \"acdd8764-947f-44ce-a5bd-4f3c139d581c\") " pod="openstack/cloudkitty-db-sync-m2bmn" Dec 03 14:39:40 crc kubenswrapper[4751]: I1203 14:39:40.087675 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znnqk\" (UniqueName: \"kubernetes.io/projected/acdd8764-947f-44ce-a5bd-4f3c139d581c-kube-api-access-znnqk\") pod \"cloudkitty-db-sync-m2bmn\" (UID: \"acdd8764-947f-44ce-a5bd-4f3c139d581c\") " pod="openstack/cloudkitty-db-sync-m2bmn" Dec 03 14:39:40 crc kubenswrapper[4751]: I1203 14:39:40.087705 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acdd8764-947f-44ce-a5bd-4f3c139d581c-config-data\") pod \"cloudkitty-db-sync-m2bmn\" (UID: \"acdd8764-947f-44ce-a5bd-4f3c139d581c\") " pod="openstack/cloudkitty-db-sync-m2bmn" Dec 03 14:39:40 crc kubenswrapper[4751]: I1203 14:39:40.087726 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acdd8764-947f-44ce-a5bd-4f3c139d581c-scripts\") pod \"cloudkitty-db-sync-m2bmn\" (UID: \"acdd8764-947f-44ce-a5bd-4f3c139d581c\") " pod="openstack/cloudkitty-db-sync-m2bmn" Dec 03 14:39:40 crc kubenswrapper[4751]: I1203 14:39:40.087763 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acdd8764-947f-44ce-a5bd-4f3c139d581c-combined-ca-bundle\") pod \"cloudkitty-db-sync-m2bmn\" (UID: \"acdd8764-947f-44ce-a5bd-4f3c139d581c\") " pod="openstack/cloudkitty-db-sync-m2bmn" Dec 03 14:39:40 crc kubenswrapper[4751]: I1203 14:39:40.093627 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/acdd8764-947f-44ce-a5bd-4f3c139d581c-certs\") pod \"cloudkitty-db-sync-m2bmn\" (UID: \"acdd8764-947f-44ce-a5bd-4f3c139d581c\") " pod="openstack/cloudkitty-db-sync-m2bmn" Dec 03 14:39:40 crc kubenswrapper[4751]: I1203 14:39:40.094145 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acdd8764-947f-44ce-a5bd-4f3c139d581c-config-data\") pod \"cloudkitty-db-sync-m2bmn\" (UID: \"acdd8764-947f-44ce-a5bd-4f3c139d581c\") " pod="openstack/cloudkitty-db-sync-m2bmn" Dec 03 14:39:40 crc kubenswrapper[4751]: I1203 14:39:40.094909 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acdd8764-947f-44ce-a5bd-4f3c139d581c-scripts\") pod \"cloudkitty-db-sync-m2bmn\" (UID: \"acdd8764-947f-44ce-a5bd-4f3c139d581c\") " pod="openstack/cloudkitty-db-sync-m2bmn" Dec 03 14:39:40 crc kubenswrapper[4751]: I1203 14:39:40.112135 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acdd8764-947f-44ce-a5bd-4f3c139d581c-combined-ca-bundle\") pod \"cloudkitty-db-sync-m2bmn\" (UID: \"acdd8764-947f-44ce-a5bd-4f3c139d581c\") " pod="openstack/cloudkitty-db-sync-m2bmn" Dec 03 14:39:40 crc kubenswrapper[4751]: I1203 14:39:40.117090 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znnqk\" (UniqueName: \"kubernetes.io/projected/acdd8764-947f-44ce-a5bd-4f3c139d581c-kube-api-access-znnqk\") pod \"cloudkitty-db-sync-m2bmn\" (UID: 
\"acdd8764-947f-44ce-a5bd-4f3c139d581c\") " pod="openstack/cloudkitty-db-sync-m2bmn" Dec 03 14:39:40 crc kubenswrapper[4751]: I1203 14:39:40.222760 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-m2bmn" Dec 03 14:39:40 crc kubenswrapper[4751]: I1203 14:39:40.807607 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-m2bmn"] Dec 03 14:39:40 crc kubenswrapper[4751]: I1203 14:39:40.882523 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-m2bmn" event={"ID":"acdd8764-947f-44ce-a5bd-4f3c139d581c","Type":"ContainerStarted","Data":"9a20d373fcbcb3c4c7a8e9e94f422c3624a279146b94b28f36b3440d62f6565b"} Dec 03 14:39:41 crc kubenswrapper[4751]: I1203 14:39:41.327987 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4df7d14f-4a52-43be-9877-c5df9c015cc7" path="/var/lib/kubelet/pods/4df7d14f-4a52-43be-9877-c5df9c015cc7/volumes" Dec 03 14:39:42 crc kubenswrapper[4751]: I1203 14:39:42.006747 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:39:42 crc kubenswrapper[4751]: I1203 14:39:42.009426 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="65a4cf75-e487-4f9b-989c-d6a7d6f0de30" containerName="ceilometer-central-agent" containerID="cri-o://726ff0394ceadcdd06af4efc79f4a8665066f986754e42516e62c9211c9077a6" gracePeriod=30 Dec 03 14:39:42 crc kubenswrapper[4751]: I1203 14:39:42.009850 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="65a4cf75-e487-4f9b-989c-d6a7d6f0de30" containerName="proxy-httpd" containerID="cri-o://e755eca85429f83c1622e6577de9b3998331f07b78b14d6c316ea880040973e7" gracePeriod=30 Dec 03 14:39:42 crc kubenswrapper[4751]: I1203 14:39:42.010182 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="65a4cf75-e487-4f9b-989c-d6a7d6f0de30" containerName="ceilometer-notification-agent" containerID="cri-o://6ee958e311e919d9575be0a8203c2a9edd82b4dcc13b156a4075163e67a62f31" gracePeriod=30 Dec 03 14:39:42 crc kubenswrapper[4751]: I1203 14:39:42.010270 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="65a4cf75-e487-4f9b-989c-d6a7d6f0de30" containerName="sg-core" containerID="cri-o://dd824341d02ef5aabaf33619a77ee4ccd0677628a1d37b03f92e85328b516228" gracePeriod=30 Dec 03 14:39:42 crc kubenswrapper[4751]: I1203 14:39:42.104229 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 14:39:42 crc kubenswrapper[4751]: I1203 14:39:42.924488 4751 generic.go:334] "Generic (PLEG): container finished" podID="65a4cf75-e487-4f9b-989c-d6a7d6f0de30" containerID="e755eca85429f83c1622e6577de9b3998331f07b78b14d6c316ea880040973e7" exitCode=0 Dec 03 14:39:42 crc kubenswrapper[4751]: I1203 14:39:42.924782 4751 generic.go:334] "Generic (PLEG): container finished" podID="65a4cf75-e487-4f9b-989c-d6a7d6f0de30" containerID="dd824341d02ef5aabaf33619a77ee4ccd0677628a1d37b03f92e85328b516228" exitCode=2 Dec 03 14:39:42 crc kubenswrapper[4751]: I1203 14:39:42.924793 4751 generic.go:334] "Generic (PLEG): container finished" podID="65a4cf75-e487-4f9b-989c-d6a7d6f0de30" containerID="726ff0394ceadcdd06af4efc79f4a8665066f986754e42516e62c9211c9077a6" exitCode=0 Dec 03 14:39:42 crc kubenswrapper[4751]: I1203 14:39:42.924823 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65a4cf75-e487-4f9b-989c-d6a7d6f0de30","Type":"ContainerDied","Data":"e755eca85429f83c1622e6577de9b3998331f07b78b14d6c316ea880040973e7"} Dec 03 14:39:42 crc kubenswrapper[4751]: I1203 14:39:42.924848 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"65a4cf75-e487-4f9b-989c-d6a7d6f0de30","Type":"ContainerDied","Data":"dd824341d02ef5aabaf33619a77ee4ccd0677628a1d37b03f92e85328b516228"} Dec 03 14:39:42 crc kubenswrapper[4751]: I1203 14:39:42.924858 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65a4cf75-e487-4f9b-989c-d6a7d6f0de30","Type":"ContainerDied","Data":"726ff0394ceadcdd06af4efc79f4a8665066f986754e42516e62c9211c9077a6"} Dec 03 14:39:43 crc kubenswrapper[4751]: I1203 14:39:43.135948 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 14:39:44 crc kubenswrapper[4751]: I1203 14:39:44.952816 4751 generic.go:334] "Generic (PLEG): container finished" podID="65a4cf75-e487-4f9b-989c-d6a7d6f0de30" containerID="6ee958e311e919d9575be0a8203c2a9edd82b4dcc13b156a4075163e67a62f31" exitCode=0 Dec 03 14:39:44 crc kubenswrapper[4751]: I1203 14:39:44.952866 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65a4cf75-e487-4f9b-989c-d6a7d6f0de30","Type":"ContainerDied","Data":"6ee958e311e919d9575be0a8203c2a9edd82b4dcc13b156a4075163e67a62f31"} Dec 03 14:39:46 crc kubenswrapper[4751]: I1203 14:39:46.572892 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:39:46 crc kubenswrapper[4751]: I1203 14:39:46.633074 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-log-httpd\") pod \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\" (UID: \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\") " Dec 03 14:39:46 crc kubenswrapper[4751]: I1203 14:39:46.633298 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-sg-core-conf-yaml\") pod \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\" (UID: \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\") " Dec 03 14:39:46 crc kubenswrapper[4751]: I1203 14:39:46.633422 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-ceilometer-tls-certs\") pod \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\" (UID: \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\") " Dec 03 14:39:46 crc kubenswrapper[4751]: I1203 14:39:46.633541 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-combined-ca-bundle\") pod \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\" (UID: \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\") " Dec 03 14:39:46 crc kubenswrapper[4751]: I1203 14:39:46.633668 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-run-httpd\") pod \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\" (UID: \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\") " Dec 03 14:39:46 crc kubenswrapper[4751]: I1203 14:39:46.633815 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqmhl\" 
(UniqueName: \"kubernetes.io/projected/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-kube-api-access-cqmhl\") pod \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\" (UID: \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\") " Dec 03 14:39:46 crc kubenswrapper[4751]: I1203 14:39:46.634001 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-scripts\") pod \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\" (UID: \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\") " Dec 03 14:39:46 crc kubenswrapper[4751]: I1203 14:39:46.634071 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-config-data\") pod \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\" (UID: \"65a4cf75-e487-4f9b-989c-d6a7d6f0de30\") " Dec 03 14:39:46 crc kubenswrapper[4751]: I1203 14:39:46.643157 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "65a4cf75-e487-4f9b-989c-d6a7d6f0de30" (UID: "65a4cf75-e487-4f9b-989c-d6a7d6f0de30"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:39:46 crc kubenswrapper[4751]: I1203 14:39:46.649774 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-scripts" (OuterVolumeSpecName: "scripts") pod "65a4cf75-e487-4f9b-989c-d6a7d6f0de30" (UID: "65a4cf75-e487-4f9b-989c-d6a7d6f0de30"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:39:46 crc kubenswrapper[4751]: I1203 14:39:46.650255 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "65a4cf75-e487-4f9b-989c-d6a7d6f0de30" (UID: "65a4cf75-e487-4f9b-989c-d6a7d6f0de30"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:39:46 crc kubenswrapper[4751]: I1203 14:39:46.693354 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-kube-api-access-cqmhl" (OuterVolumeSpecName: "kube-api-access-cqmhl") pod "65a4cf75-e487-4f9b-989c-d6a7d6f0de30" (UID: "65a4cf75-e487-4f9b-989c-d6a7d6f0de30"). InnerVolumeSpecName "kube-api-access-cqmhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:39:46 crc kubenswrapper[4751]: I1203 14:39:46.710249 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "65a4cf75-e487-4f9b-989c-d6a7d6f0de30" (UID: "65a4cf75-e487-4f9b-989c-d6a7d6f0de30"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:39:46 crc kubenswrapper[4751]: I1203 14:39:46.736951 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:39:46 crc kubenswrapper[4751]: I1203 14:39:46.737110 4751 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 14:39:46 crc kubenswrapper[4751]: I1203 14:39:46.737336 4751 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 14:39:46 crc kubenswrapper[4751]: I1203 14:39:46.737478 4751 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 14:39:46 crc kubenswrapper[4751]: I1203 14:39:46.737574 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqmhl\" (UniqueName: \"kubernetes.io/projected/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-kube-api-access-cqmhl\") on node \"crc\" DevicePath \"\"" Dec 03 14:39:46 crc kubenswrapper[4751]: I1203 14:39:46.776517 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "65a4cf75-e487-4f9b-989c-d6a7d6f0de30" (UID: "65a4cf75-e487-4f9b-989c-d6a7d6f0de30"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:39:46 crc kubenswrapper[4751]: I1203 14:39:46.779240 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65a4cf75-e487-4f9b-989c-d6a7d6f0de30" (UID: "65a4cf75-e487-4f9b-989c-d6a7d6f0de30"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:39:46 crc kubenswrapper[4751]: I1203 14:39:46.826901 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-config-data" (OuterVolumeSpecName: "config-data") pod "65a4cf75-e487-4f9b-989c-d6a7d6f0de30" (UID: "65a4cf75-e487-4f9b-989c-d6a7d6f0de30"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:39:46 crc kubenswrapper[4751]: I1203 14:39:46.839492 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:39:46 crc kubenswrapper[4751]: I1203 14:39:46.839727 4751 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:39:46 crc kubenswrapper[4751]: I1203 14:39:46.839830 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a4cf75-e487-4f9b-989c-d6a7d6f0de30-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:39:46 crc kubenswrapper[4751]: I1203 14:39:46.987886 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"65a4cf75-e487-4f9b-989c-d6a7d6f0de30","Type":"ContainerDied","Data":"16fd6e4a15234565c49006188c700e9d4a47e8982406a38eacd94b047c7a0a86"} Dec 03 14:39:46 crc kubenswrapper[4751]: I1203 14:39:46.987939 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:39:46 crc kubenswrapper[4751]: I1203 14:39:46.987965 4751 scope.go:117] "RemoveContainer" containerID="e755eca85429f83c1622e6577de9b3998331f07b78b14d6c316ea880040973e7" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.028918 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="47b63367-ad69-4428-9c79-8eee86b817ac" containerName="rabbitmq" containerID="cri-o://e31623ebc89c82e42314f91e2333489512ed54bb1026dab75b7fff94b80c1f8a" gracePeriod=604796 Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.029371 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.030827 4751 scope.go:117] "RemoveContainer" containerID="dd824341d02ef5aabaf33619a77ee4ccd0677628a1d37b03f92e85328b516228" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.041181 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.068817 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:39:47 crc kubenswrapper[4751]: E1203 14:39:47.069313 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a4cf75-e487-4f9b-989c-d6a7d6f0de30" containerName="ceilometer-central-agent" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.069356 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a4cf75-e487-4f9b-989c-d6a7d6f0de30" containerName="ceilometer-central-agent" Dec 03 14:39:47 crc kubenswrapper[4751]: E1203 14:39:47.069378 4751 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="65a4cf75-e487-4f9b-989c-d6a7d6f0de30" containerName="ceilometer-notification-agent" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.069385 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a4cf75-e487-4f9b-989c-d6a7d6f0de30" containerName="ceilometer-notification-agent" Dec 03 14:39:47 crc kubenswrapper[4751]: E1203 14:39:47.069402 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a4cf75-e487-4f9b-989c-d6a7d6f0de30" containerName="sg-core" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.069416 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a4cf75-e487-4f9b-989c-d6a7d6f0de30" containerName="sg-core" Dec 03 14:39:47 crc kubenswrapper[4751]: E1203 14:39:47.069429 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a4cf75-e487-4f9b-989c-d6a7d6f0de30" containerName="proxy-httpd" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.069437 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a4cf75-e487-4f9b-989c-d6a7d6f0de30" containerName="proxy-httpd" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.069648 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="65a4cf75-e487-4f9b-989c-d6a7d6f0de30" containerName="ceilometer-notification-agent" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.069676 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="65a4cf75-e487-4f9b-989c-d6a7d6f0de30" containerName="sg-core" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.069692 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="65a4cf75-e487-4f9b-989c-d6a7d6f0de30" containerName="proxy-httpd" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.069708 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="65a4cf75-e487-4f9b-989c-d6a7d6f0de30" containerName="ceilometer-central-agent" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.071753 4751 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.084579 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.084960 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.085160 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.103572 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.111587 4751 scope.go:117] "RemoveContainer" containerID="6ee958e311e919d9575be0a8203c2a9edd82b4dcc13b156a4075163e67a62f31" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.142052 4751 scope.go:117] "RemoveContainer" containerID="726ff0394ceadcdd06af4efc79f4a8665066f986754e42516e62c9211c9077a6" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.147138 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991-scripts\") pod \"ceilometer-0\" (UID: \"ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991\") " pod="openstack/ceilometer-0" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.147210 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991-log-httpd\") pod \"ceilometer-0\" (UID: \"ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991\") " pod="openstack/ceilometer-0" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.147238 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991-config-data\") pod \"ceilometer-0\" (UID: \"ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991\") " pod="openstack/ceilometer-0" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.147492 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991\") " pod="openstack/ceilometer-0" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.147651 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f58f6\" (UniqueName: \"kubernetes.io/projected/ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991-kube-api-access-f58f6\") pod \"ceilometer-0\" (UID: \"ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991\") " pod="openstack/ceilometer-0" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.147692 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991-run-httpd\") pod \"ceilometer-0\" (UID: \"ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991\") " pod="openstack/ceilometer-0" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.147833 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991\") " pod="openstack/ceilometer-0" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.147880 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991-ceilometer-tls-certs\") pod 
\"ceilometer-0\" (UID: \"ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991\") " pod="openstack/ceilometer-0" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.249537 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f58f6\" (UniqueName: \"kubernetes.io/projected/ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991-kube-api-access-f58f6\") pod \"ceilometer-0\" (UID: \"ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991\") " pod="openstack/ceilometer-0" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.249594 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991-run-httpd\") pod \"ceilometer-0\" (UID: \"ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991\") " pod="openstack/ceilometer-0" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.249697 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991\") " pod="openstack/ceilometer-0" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.249738 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991\") " pod="openstack/ceilometer-0" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.249762 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991-scripts\") pod \"ceilometer-0\" (UID: \"ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991\") " pod="openstack/ceilometer-0" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.249809 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991-log-httpd\") pod \"ceilometer-0\" (UID: \"ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991\") " pod="openstack/ceilometer-0" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.249829 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991-config-data\") pod \"ceilometer-0\" (UID: \"ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991\") " pod="openstack/ceilometer-0" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.249871 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991\") " pod="openstack/ceilometer-0" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.250769 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991-log-httpd\") pod \"ceilometer-0\" (UID: \"ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991\") " pod="openstack/ceilometer-0" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.250980 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991-run-httpd\") pod \"ceilometer-0\" (UID: \"ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991\") " pod="openstack/ceilometer-0" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.255012 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991-scripts\") pod \"ceilometer-0\" (UID: \"ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991\") " pod="openstack/ceilometer-0" Dec 03 14:39:47 crc 
kubenswrapper[4751]: I1203 14:39:47.255039 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991\") " pod="openstack/ceilometer-0" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.255145 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991\") " pod="openstack/ceilometer-0" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.255586 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991\") " pod="openstack/ceilometer-0" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.260774 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991-config-data\") pod \"ceilometer-0\" (UID: \"ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991\") " pod="openstack/ceilometer-0" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.269135 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f58f6\" (UniqueName: \"kubernetes.io/projected/ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991-kube-api-access-f58f6\") pod \"ceilometer-0\" (UID: \"ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991\") " pod="openstack/ceilometer-0" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.326898 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65a4cf75-e487-4f9b-989c-d6a7d6f0de30" path="/var/lib/kubelet/pods/65a4cf75-e487-4f9b-989c-d6a7d6f0de30/volumes" 
Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.408393 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:39:47 crc kubenswrapper[4751]: I1203 14:39:47.939843 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:39:48 crc kubenswrapper[4751]: I1203 14:39:48.018039 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991","Type":"ContainerStarted","Data":"fa7a6f66d96c0c5bc69408acd884c21c28f5f733646b180a83080d4b6ec04922"} Dec 03 14:39:48 crc kubenswrapper[4751]: I1203 14:39:48.956805 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="e1da8e9b-0799-4327-9e24-216c4a51fde2" containerName="rabbitmq" containerID="cri-o://a8c83b1d5f1a85ae4bfd25ddb57357a3145271a6c2fb53b90694087de6ba2f1d" gracePeriod=604795 Dec 03 14:39:50 crc kubenswrapper[4751]: I1203 14:39:50.853014 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="47b63367-ad69-4428-9c79-8eee86b817ac" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.108:5671: connect: connection refused" Dec 03 14:39:51 crc kubenswrapper[4751]: I1203 14:39:51.211765 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="e1da8e9b-0799-4327-9e24-216c4a51fde2" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.109:5671: connect: connection refused" Dec 03 14:40:00 crc kubenswrapper[4751]: I1203 14:40:00.853000 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="47b63367-ad69-4428-9c79-8eee86b817ac" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.108:5671: connect: connection refused" Dec 03 14:40:01 crc kubenswrapper[4751]: I1203 14:40:01.211765 4751 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="e1da8e9b-0799-4327-9e24-216c4a51fde2" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.109:5671: connect: connection refused" Dec 03 14:40:02 crc kubenswrapper[4751]: I1203 14:40:02.179798 4751 generic.go:334] "Generic (PLEG): container finished" podID="47b63367-ad69-4428-9c79-8eee86b817ac" containerID="e31623ebc89c82e42314f91e2333489512ed54bb1026dab75b7fff94b80c1f8a" exitCode=0 Dec 03 14:40:02 crc kubenswrapper[4751]: I1203 14:40:02.179850 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"47b63367-ad69-4428-9c79-8eee86b817ac","Type":"ContainerDied","Data":"e31623ebc89c82e42314f91e2333489512ed54bb1026dab75b7fff94b80c1f8a"} Dec 03 14:40:02 crc kubenswrapper[4751]: I1203 14:40:02.398790 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-dc7c944bf-gf5gz"] Dec 03 14:40:02 crc kubenswrapper[4751]: I1203 14:40:02.402170 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" Dec 03 14:40:02 crc kubenswrapper[4751]: I1203 14:40:02.404895 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 03 14:40:02 crc kubenswrapper[4751]: I1203 14:40:02.432043 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dc7c944bf-gf5gz"] Dec 03 14:40:02 crc kubenswrapper[4751]: I1203 14:40:02.481196 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-openstack-edpm-ipam\") pod \"dnsmasq-dns-dc7c944bf-gf5gz\" (UID: \"35c811c6-3da1-42ec-a2e0-78afb7711252\") " pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" Dec 03 14:40:02 crc kubenswrapper[4751]: I1203 14:40:02.481254 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh26s\" (UniqueName: \"kubernetes.io/projected/35c811c6-3da1-42ec-a2e0-78afb7711252-kube-api-access-kh26s\") pod \"dnsmasq-dns-dc7c944bf-gf5gz\" (UID: \"35c811c6-3da1-42ec-a2e0-78afb7711252\") " pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" Dec 03 14:40:02 crc kubenswrapper[4751]: I1203 14:40:02.481393 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-config\") pod \"dnsmasq-dns-dc7c944bf-gf5gz\" (UID: \"35c811c6-3da1-42ec-a2e0-78afb7711252\") " pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" Dec 03 14:40:02 crc kubenswrapper[4751]: I1203 14:40:02.481497 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-ovsdbserver-sb\") pod \"dnsmasq-dns-dc7c944bf-gf5gz\" (UID: \"35c811c6-3da1-42ec-a2e0-78afb7711252\") " 
pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" Dec 03 14:40:02 crc kubenswrapper[4751]: I1203 14:40:02.481673 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-ovsdbserver-nb\") pod \"dnsmasq-dns-dc7c944bf-gf5gz\" (UID: \"35c811c6-3da1-42ec-a2e0-78afb7711252\") " pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" Dec 03 14:40:02 crc kubenswrapper[4751]: I1203 14:40:02.481728 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-dns-svc\") pod \"dnsmasq-dns-dc7c944bf-gf5gz\" (UID: \"35c811c6-3da1-42ec-a2e0-78afb7711252\") " pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" Dec 03 14:40:02 crc kubenswrapper[4751]: I1203 14:40:02.481812 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-dns-swift-storage-0\") pod \"dnsmasq-dns-dc7c944bf-gf5gz\" (UID: \"35c811c6-3da1-42ec-a2e0-78afb7711252\") " pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" Dec 03 14:40:02 crc kubenswrapper[4751]: I1203 14:40:02.583683 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-ovsdbserver-sb\") pod \"dnsmasq-dns-dc7c944bf-gf5gz\" (UID: \"35c811c6-3da1-42ec-a2e0-78afb7711252\") " pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" Dec 03 14:40:02 crc kubenswrapper[4751]: I1203 14:40:02.583760 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-ovsdbserver-nb\") pod \"dnsmasq-dns-dc7c944bf-gf5gz\" (UID: \"35c811c6-3da1-42ec-a2e0-78afb7711252\") " 
pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" Dec 03 14:40:02 crc kubenswrapper[4751]: I1203 14:40:02.584123 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-dns-svc\") pod \"dnsmasq-dns-dc7c944bf-gf5gz\" (UID: \"35c811c6-3da1-42ec-a2e0-78afb7711252\") " pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" Dec 03 14:40:02 crc kubenswrapper[4751]: I1203 14:40:02.584178 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-dns-swift-storage-0\") pod \"dnsmasq-dns-dc7c944bf-gf5gz\" (UID: \"35c811c6-3da1-42ec-a2e0-78afb7711252\") " pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" Dec 03 14:40:02 crc kubenswrapper[4751]: I1203 14:40:02.584221 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-openstack-edpm-ipam\") pod \"dnsmasq-dns-dc7c944bf-gf5gz\" (UID: \"35c811c6-3da1-42ec-a2e0-78afb7711252\") " pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" Dec 03 14:40:02 crc kubenswrapper[4751]: I1203 14:40:02.584263 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh26s\" (UniqueName: \"kubernetes.io/projected/35c811c6-3da1-42ec-a2e0-78afb7711252-kube-api-access-kh26s\") pod \"dnsmasq-dns-dc7c944bf-gf5gz\" (UID: \"35c811c6-3da1-42ec-a2e0-78afb7711252\") " pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" Dec 03 14:40:02 crc kubenswrapper[4751]: I1203 14:40:02.584430 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-config\") pod \"dnsmasq-dns-dc7c944bf-gf5gz\" (UID: \"35c811c6-3da1-42ec-a2e0-78afb7711252\") " pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" Dec 03 14:40:02 
crc kubenswrapper[4751]: I1203 14:40:02.585256 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-config\") pod \"dnsmasq-dns-dc7c944bf-gf5gz\" (UID: \"35c811c6-3da1-42ec-a2e0-78afb7711252\") " pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" Dec 03 14:40:02 crc kubenswrapper[4751]: I1203 14:40:02.585405 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-openstack-edpm-ipam\") pod \"dnsmasq-dns-dc7c944bf-gf5gz\" (UID: \"35c811c6-3da1-42ec-a2e0-78afb7711252\") " pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" Dec 03 14:40:02 crc kubenswrapper[4751]: I1203 14:40:02.585482 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-dns-svc\") pod \"dnsmasq-dns-dc7c944bf-gf5gz\" (UID: \"35c811c6-3da1-42ec-a2e0-78afb7711252\") " pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" Dec 03 14:40:02 crc kubenswrapper[4751]: I1203 14:40:02.585979 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-dns-swift-storage-0\") pod \"dnsmasq-dns-dc7c944bf-gf5gz\" (UID: \"35c811c6-3da1-42ec-a2e0-78afb7711252\") " pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" Dec 03 14:40:02 crc kubenswrapper[4751]: I1203 14:40:02.586268 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-ovsdbserver-sb\") pod \"dnsmasq-dns-dc7c944bf-gf5gz\" (UID: \"35c811c6-3da1-42ec-a2e0-78afb7711252\") " pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" Dec 03 14:40:02 crc kubenswrapper[4751]: I1203 14:40:02.586426 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-ovsdbserver-nb\") pod \"dnsmasq-dns-dc7c944bf-gf5gz\" (UID: \"35c811c6-3da1-42ec-a2e0-78afb7711252\") " pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" Dec 03 14:40:02 crc kubenswrapper[4751]: I1203 14:40:02.629579 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh26s\" (UniqueName: \"kubernetes.io/projected/35c811c6-3da1-42ec-a2e0-78afb7711252-kube-api-access-kh26s\") pod \"dnsmasq-dns-dc7c944bf-gf5gz\" (UID: \"35c811c6-3da1-42ec-a2e0-78afb7711252\") " pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" Dec 03 14:40:02 crc kubenswrapper[4751]: I1203 14:40:02.721777 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" Dec 03 14:40:03 crc kubenswrapper[4751]: I1203 14:40:03.191934 4751 generic.go:334] "Generic (PLEG): container finished" podID="e1da8e9b-0799-4327-9e24-216c4a51fde2" containerID="a8c83b1d5f1a85ae4bfd25ddb57357a3145271a6c2fb53b90694087de6ba2f1d" exitCode=0 Dec 03 14:40:03 crc kubenswrapper[4751]: I1203 14:40:03.191991 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e1da8e9b-0799-4327-9e24-216c4a51fde2","Type":"ContainerDied","Data":"a8c83b1d5f1a85ae4bfd25ddb57357a3145271a6c2fb53b90694087de6ba2f1d"} Dec 03 14:40:11 crc kubenswrapper[4751]: I1203 14:40:11.212635 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="e1da8e9b-0799-4327-9e24-216c4a51fde2" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.109:5671: connect: connection refused" Dec 03 14:40:11 crc kubenswrapper[4751]: I1203 14:40:11.213571 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:13 crc kubenswrapper[4751]: I1203 14:40:13.769805 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 14:40:13 crc kubenswrapper[4751]: I1203 14:40:13.842378 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz4z2\" (UniqueName: \"kubernetes.io/projected/47b63367-ad69-4428-9c79-8eee86b817ac-kube-api-access-bz4z2\") pod \"47b63367-ad69-4428-9c79-8eee86b817ac\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " Dec 03 14:40:13 crc kubenswrapper[4751]: I1203 14:40:13.842684 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/47b63367-ad69-4428-9c79-8eee86b817ac-plugins-conf\") pod \"47b63367-ad69-4428-9c79-8eee86b817ac\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " Dec 03 14:40:13 crc kubenswrapper[4751]: I1203 14:40:13.842824 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/47b63367-ad69-4428-9c79-8eee86b817ac-erlang-cookie-secret\") pod \"47b63367-ad69-4428-9c79-8eee86b817ac\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " Dec 03 14:40:13 crc kubenswrapper[4751]: I1203 14:40:13.842954 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/47b63367-ad69-4428-9c79-8eee86b817ac-rabbitmq-plugins\") pod \"47b63367-ad69-4428-9c79-8eee86b817ac\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " Dec 03 14:40:13 crc kubenswrapper[4751]: I1203 14:40:13.843058 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/47b63367-ad69-4428-9c79-8eee86b817ac-rabbitmq-confd\") pod \"47b63367-ad69-4428-9c79-8eee86b817ac\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " Dec 03 14:40:13 crc kubenswrapper[4751]: I1203 14:40:13.843165 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/47b63367-ad69-4428-9c79-8eee86b817ac-rabbitmq-tls\") pod \"47b63367-ad69-4428-9c79-8eee86b817ac\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " Dec 03 14:40:13 crc kubenswrapper[4751]: I1203 14:40:13.883025 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-303f7ec4-fbf6-4051-991e-c91365045b60\") pod \"47b63367-ad69-4428-9c79-8eee86b817ac\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " Dec 03 14:40:13 crc kubenswrapper[4751]: I1203 14:40:13.883262 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/47b63367-ad69-4428-9c79-8eee86b817ac-rabbitmq-erlang-cookie\") pod \"47b63367-ad69-4428-9c79-8eee86b817ac\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " Dec 03 14:40:13 crc kubenswrapper[4751]: I1203 14:40:13.883450 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/47b63367-ad69-4428-9c79-8eee86b817ac-server-conf\") pod \"47b63367-ad69-4428-9c79-8eee86b817ac\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " Dec 03 14:40:13 crc kubenswrapper[4751]: I1203 14:40:13.883584 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/47b63367-ad69-4428-9c79-8eee86b817ac-pod-info\") pod \"47b63367-ad69-4428-9c79-8eee86b817ac\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " Dec 03 14:40:13 crc kubenswrapper[4751]: I1203 14:40:13.883897 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/47b63367-ad69-4428-9c79-8eee86b817ac-config-data\") pod \"47b63367-ad69-4428-9c79-8eee86b817ac\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") " Dec 03 14:40:13 
crc kubenswrapper[4751]: I1203 14:40:13.885564 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47b63367-ad69-4428-9c79-8eee86b817ac-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "47b63367-ad69-4428-9c79-8eee86b817ac" (UID: "47b63367-ad69-4428-9c79-8eee86b817ac"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:40:13 crc kubenswrapper[4751]: I1203 14:40:13.885739 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47b63367-ad69-4428-9c79-8eee86b817ac-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "47b63367-ad69-4428-9c79-8eee86b817ac" (UID: "47b63367-ad69-4428-9c79-8eee86b817ac"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:40:13 crc kubenswrapper[4751]: I1203 14:40:13.885986 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47b63367-ad69-4428-9c79-8eee86b817ac-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "47b63367-ad69-4428-9c79-8eee86b817ac" (UID: "47b63367-ad69-4428-9c79-8eee86b817ac"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:40:13 crc kubenswrapper[4751]: I1203 14:40:13.900210 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47b63367-ad69-4428-9c79-8eee86b817ac-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "47b63367-ad69-4428-9c79-8eee86b817ac" (UID: "47b63367-ad69-4428-9c79-8eee86b817ac"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:40:13 crc kubenswrapper[4751]: I1203 14:40:13.987194 4751 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/47b63367-ad69-4428-9c79-8eee86b817ac-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:13 crc kubenswrapper[4751]: I1203 14:40:13.987612 4751 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/47b63367-ad69-4428-9c79-8eee86b817ac-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:13 crc kubenswrapper[4751]: I1203 14:40:13.987626 4751 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/47b63367-ad69-4428-9c79-8eee86b817ac-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:13 crc kubenswrapper[4751]: I1203 14:40:13.987640 4751 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/47b63367-ad69-4428-9c79-8eee86b817ac-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:14 crc kubenswrapper[4751]: I1203 14:40:14.322813 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"47b63367-ad69-4428-9c79-8eee86b817ac","Type":"ContainerDied","Data":"92cf114a0ab17f58ea03830a26be5b6cc88351062787af4415153fa0b39929da"} Dec 03 14:40:14 crc kubenswrapper[4751]: I1203 14:40:14.322874 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 14:40:14 crc kubenswrapper[4751]: I1203 14:40:14.323209 4751 scope.go:117] "RemoveContainer" containerID="e31623ebc89c82e42314f91e2333489512ed54bb1026dab75b7fff94b80c1f8a" Dec 03 14:40:14 crc kubenswrapper[4751]: I1203 14:40:14.670355 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/47b63367-ad69-4428-9c79-8eee86b817ac-pod-info" (OuterVolumeSpecName: "pod-info") pod "47b63367-ad69-4428-9c79-8eee86b817ac" (UID: "47b63367-ad69-4428-9c79-8eee86b817ac"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 03 14:40:14 crc kubenswrapper[4751]: I1203 14:40:14.670440 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47b63367-ad69-4428-9c79-8eee86b817ac-config-data" (OuterVolumeSpecName: "config-data") pod "47b63367-ad69-4428-9c79-8eee86b817ac" (UID: "47b63367-ad69-4428-9c79-8eee86b817ac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:40:14 crc kubenswrapper[4751]: I1203 14:40:14.670792 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47b63367-ad69-4428-9c79-8eee86b817ac-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "47b63367-ad69-4428-9c79-8eee86b817ac" (UID: "47b63367-ad69-4428-9c79-8eee86b817ac"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:40:14 crc kubenswrapper[4751]: I1203 14:40:14.670979 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47b63367-ad69-4428-9c79-8eee86b817ac-kube-api-access-bz4z2" (OuterVolumeSpecName: "kube-api-access-bz4z2") pod "47b63367-ad69-4428-9c79-8eee86b817ac" (UID: "47b63367-ad69-4428-9c79-8eee86b817ac"). InnerVolumeSpecName "kube-api-access-bz4z2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:40:14 crc kubenswrapper[4751]: I1203 14:40:14.671053 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47b63367-ad69-4428-9c79-8eee86b817ac-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "47b63367-ad69-4428-9c79-8eee86b817ac" (UID: "47b63367-ad69-4428-9c79-8eee86b817ac"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:40:14 crc kubenswrapper[4751]: I1203 14:40:14.671410 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47b63367-ad69-4428-9c79-8eee86b817ac-server-conf" (OuterVolumeSpecName: "server-conf") pod "47b63367-ad69-4428-9c79-8eee86b817ac" (UID: "47b63367-ad69-4428-9c79-8eee86b817ac"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:40:14 crc kubenswrapper[4751]: E1203 14:40:14.701648 4751 reconciler_common.go:156] "operationExecutor.UnmountVolume failed (controllerAttachDetachEnabled true) for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-303f7ec4-fbf6-4051-991e-c91365045b60\") pod \"47b63367-ad69-4428-9c79-8eee86b817ac\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") : UnmountVolume.NewUnmounter failed for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-303f7ec4-fbf6-4051-991e-c91365045b60\") pod \"47b63367-ad69-4428-9c79-8eee86b817ac\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") : kubernetes.io/csi: unmounter failed to load volume data file [/var/lib/kubelet/pods/47b63367-ad69-4428-9c79-8eee86b817ac/volumes/kubernetes.io~csi/pvc-303f7ec4-fbf6-4051-991e-c91365045b60/mount]: kubernetes.io/csi: failed to open volume data file [/var/lib/kubelet/pods/47b63367-ad69-4428-9c79-8eee86b817ac/volumes/kubernetes.io~csi/pvc-303f7ec4-fbf6-4051-991e-c91365045b60/vol_data.json]: open 
/var/lib/kubelet/pods/47b63367-ad69-4428-9c79-8eee86b817ac/volumes/kubernetes.io~csi/pvc-303f7ec4-fbf6-4051-991e-c91365045b60/vol_data.json: no such file or directory" err="UnmountVolume.NewUnmounter failed for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-303f7ec4-fbf6-4051-991e-c91365045b60\") pod \"47b63367-ad69-4428-9c79-8eee86b817ac\" (UID: \"47b63367-ad69-4428-9c79-8eee86b817ac\") : kubernetes.io/csi: unmounter failed to load volume data file [/var/lib/kubelet/pods/47b63367-ad69-4428-9c79-8eee86b817ac/volumes/kubernetes.io~csi/pvc-303f7ec4-fbf6-4051-991e-c91365045b60/mount]: kubernetes.io/csi: failed to open volume data file [/var/lib/kubelet/pods/47b63367-ad69-4428-9c79-8eee86b817ac/volumes/kubernetes.io~csi/pvc-303f7ec4-fbf6-4051-991e-c91365045b60/vol_data.json]: open /var/lib/kubelet/pods/47b63367-ad69-4428-9c79-8eee86b817ac/volumes/kubernetes.io~csi/pvc-303f7ec4-fbf6-4051-991e-c91365045b60/vol_data.json: no such file or directory" Dec 03 14:40:14 crc kubenswrapper[4751]: I1203 14:40:14.702873 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz4z2\" (UniqueName: \"kubernetes.io/projected/47b63367-ad69-4428-9c79-8eee86b817ac-kube-api-access-bz4z2\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:14 crc kubenswrapper[4751]: I1203 14:40:14.703047 4751 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/47b63367-ad69-4428-9c79-8eee86b817ac-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:14 crc kubenswrapper[4751]: I1203 14:40:14.703185 4751 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/47b63367-ad69-4428-9c79-8eee86b817ac-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:14 crc kubenswrapper[4751]: I1203 14:40:14.703309 4751 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/47b63367-ad69-4428-9c79-8eee86b817ac-server-conf\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:14 crc kubenswrapper[4751]: I1203 14:40:14.703624 4751 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/47b63367-ad69-4428-9c79-8eee86b817ac-pod-info\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:14 crc kubenswrapper[4751]: I1203 14:40:14.703766 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/47b63367-ad69-4428-9c79-8eee86b817ac-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:14 crc kubenswrapper[4751]: I1203 14:40:14.746119 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-303f7ec4-fbf6-4051-991e-c91365045b60" (OuterVolumeSpecName: "persistence") pod "47b63367-ad69-4428-9c79-8eee86b817ac" (UID: "47b63367-ad69-4428-9c79-8eee86b817ac"). InnerVolumeSpecName "pvc-303f7ec4-fbf6-4051-991e-c91365045b60". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 14:40:14 crc kubenswrapper[4751]: I1203 14:40:14.805652 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-303f7ec4-fbf6-4051-991e-c91365045b60\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-303f7ec4-fbf6-4051-991e-c91365045b60\") on node \"crc\" " Dec 03 14:40:14 crc kubenswrapper[4751]: I1203 14:40:14.884220 4751 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 03 14:40:14 crc kubenswrapper[4751]: I1203 14:40:14.884388 4751 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-303f7ec4-fbf6-4051-991e-c91365045b60" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-303f7ec4-fbf6-4051-991e-c91365045b60") on node "crc" Dec 03 14:40:14 crc kubenswrapper[4751]: I1203 14:40:14.907917 4751 reconciler_common.go:293] "Volume detached for volume \"pvc-303f7ec4-fbf6-4051-991e-c91365045b60\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-303f7ec4-fbf6-4051-991e-c91365045b60\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:14 crc kubenswrapper[4751]: I1203 14:40:14.969280 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 14:40:14 crc kubenswrapper[4751]: I1203 14:40:14.982537 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.192702 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 14:40:15 crc kubenswrapper[4751]: E1203 14:40:15.193180 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47b63367-ad69-4428-9c79-8eee86b817ac" containerName="setup-container" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.193203 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="47b63367-ad69-4428-9c79-8eee86b817ac" containerName="setup-container" Dec 03 14:40:15 crc kubenswrapper[4751]: E1203 14:40:15.193227 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47b63367-ad69-4428-9c79-8eee86b817ac" containerName="rabbitmq" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.193234 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="47b63367-ad69-4428-9c79-8eee86b817ac" containerName="rabbitmq" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.193480 4751 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="47b63367-ad69-4428-9c79-8eee86b817ac" containerName="rabbitmq" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.195102 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.199206 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.199727 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.199804 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.199741 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-4kzr6" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.200032 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.200036 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.201220 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.238952 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.315083 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4760c776-9212-42af-8bf2-928c79417922-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4760c776-9212-42af-8bf2-928c79417922\") " pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc 
kubenswrapper[4751]: I1203 14:40:15.315189 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4760c776-9212-42af-8bf2-928c79417922-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4760c776-9212-42af-8bf2-928c79417922\") " pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.315245 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4760c776-9212-42af-8bf2-928c79417922-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4760c776-9212-42af-8bf2-928c79417922\") " pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.315296 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4760c776-9212-42af-8bf2-928c79417922-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4760c776-9212-42af-8bf2-928c79417922\") " pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.315375 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4760c776-9212-42af-8bf2-928c79417922-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4760c776-9212-42af-8bf2-928c79417922\") " pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.315512 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-303f7ec4-fbf6-4051-991e-c91365045b60\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-303f7ec4-fbf6-4051-991e-c91365045b60\") pod \"rabbitmq-server-0\" (UID: \"4760c776-9212-42af-8bf2-928c79417922\") " pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.315559 
4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4760c776-9212-42af-8bf2-928c79417922-config-data\") pod \"rabbitmq-server-0\" (UID: \"4760c776-9212-42af-8bf2-928c79417922\") " pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.315589 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftgnf\" (UniqueName: \"kubernetes.io/projected/4760c776-9212-42af-8bf2-928c79417922-kube-api-access-ftgnf\") pod \"rabbitmq-server-0\" (UID: \"4760c776-9212-42af-8bf2-928c79417922\") " pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.315615 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4760c776-9212-42af-8bf2-928c79417922-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4760c776-9212-42af-8bf2-928c79417922\") " pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.315683 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4760c776-9212-42af-8bf2-928c79417922-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4760c776-9212-42af-8bf2-928c79417922\") " pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.315748 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4760c776-9212-42af-8bf2-928c79417922-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4760c776-9212-42af-8bf2-928c79417922\") " pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.337678 4751 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="47b63367-ad69-4428-9c79-8eee86b817ac" path="/var/lib/kubelet/pods/47b63367-ad69-4428-9c79-8eee86b817ac/volumes" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.417667 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4760c776-9212-42af-8bf2-928c79417922-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4760c776-9212-42af-8bf2-928c79417922\") " pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.418164 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-303f7ec4-fbf6-4051-991e-c91365045b60\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-303f7ec4-fbf6-4051-991e-c91365045b60\") pod \"rabbitmq-server-0\" (UID: \"4760c776-9212-42af-8bf2-928c79417922\") " pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.418951 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4760c776-9212-42af-8bf2-928c79417922-config-data\") pod \"rabbitmq-server-0\" (UID: \"4760c776-9212-42af-8bf2-928c79417922\") " pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.419840 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftgnf\" (UniqueName: \"kubernetes.io/projected/4760c776-9212-42af-8bf2-928c79417922-kube-api-access-ftgnf\") pod \"rabbitmq-server-0\" (UID: \"4760c776-9212-42af-8bf2-928c79417922\") " pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.420280 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4760c776-9212-42af-8bf2-928c79417922-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: 
\"4760c776-9212-42af-8bf2-928c79417922\") " pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.419791 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4760c776-9212-42af-8bf2-928c79417922-config-data\") pod \"rabbitmq-server-0\" (UID: \"4760c776-9212-42af-8bf2-928c79417922\") " pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.420623 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4760c776-9212-42af-8bf2-928c79417922-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4760c776-9212-42af-8bf2-928c79417922\") " pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.420956 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4760c776-9212-42af-8bf2-928c79417922-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4760c776-9212-42af-8bf2-928c79417922\") " pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.422016 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4760c776-9212-42af-8bf2-928c79417922-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4760c776-9212-42af-8bf2-928c79417922\") " pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.422635 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4760c776-9212-42af-8bf2-928c79417922-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4760c776-9212-42af-8bf2-928c79417922\") " pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.422756 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4760c776-9212-42af-8bf2-928c79417922-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4760c776-9212-42af-8bf2-928c79417922\") " pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.422883 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4760c776-9212-42af-8bf2-928c79417922-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4760c776-9212-42af-8bf2-928c79417922\") " pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.422987 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4760c776-9212-42af-8bf2-928c79417922-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4760c776-9212-42af-8bf2-928c79417922\") " pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.423074 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4760c776-9212-42af-8bf2-928c79417922-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4760c776-9212-42af-8bf2-928c79417922\") " pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.423237 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4760c776-9212-42af-8bf2-928c79417922-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4760c776-9212-42af-8bf2-928c79417922\") " pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.423813 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4760c776-9212-42af-8bf2-928c79417922-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"4760c776-9212-42af-8bf2-928c79417922\") " pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.424194 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4760c776-9212-42af-8bf2-928c79417922-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4760c776-9212-42af-8bf2-928c79417922\") " pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.426766 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4760c776-9212-42af-8bf2-928c79417922-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4760c776-9212-42af-8bf2-928c79417922\") " pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.427025 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4760c776-9212-42af-8bf2-928c79417922-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4760c776-9212-42af-8bf2-928c79417922\") " pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.427398 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4760c776-9212-42af-8bf2-928c79417922-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4760c776-9212-42af-8bf2-928c79417922\") " pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.427909 4751 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.428001 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-303f7ec4-fbf6-4051-991e-c91365045b60\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-303f7ec4-fbf6-4051-991e-c91365045b60\") pod \"rabbitmq-server-0\" (UID: \"4760c776-9212-42af-8bf2-928c79417922\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/db3758ae0881ee46f2f95476f1e867b818829daba4b9b83c7181ebfa4809f516/globalmount\"" pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.455257 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftgnf\" (UniqueName: \"kubernetes.io/projected/4760c776-9212-42af-8bf2-928c79417922-kube-api-access-ftgnf\") pod \"rabbitmq-server-0\" (UID: \"4760c776-9212-42af-8bf2-928c79417922\") " pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.820771 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-303f7ec4-fbf6-4051-991e-c91365045b60\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-303f7ec4-fbf6-4051-991e-c91365045b60\") pod \"rabbitmq-server-0\" (UID: \"4760c776-9212-42af-8bf2-928c79417922\") " pod="openstack/rabbitmq-server-0" Dec 03 14:40:15 crc kubenswrapper[4751]: I1203 14:40:15.853432 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="47b63367-ad69-4428-9c79-8eee86b817ac" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.108:5671: i/o timeout" Dec 03 14:40:16 crc kubenswrapper[4751]: I1203 14:40:16.112636 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.202115 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.338250 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd51951f-d36a-49ac-969d-1444603d75a6\") pod \"e1da8e9b-0799-4327-9e24-216c4a51fde2\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.338299 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e1da8e9b-0799-4327-9e24-216c4a51fde2-config-data\") pod \"e1da8e9b-0799-4327-9e24-216c4a51fde2\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.338412 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e1da8e9b-0799-4327-9e24-216c4a51fde2-server-conf\") pod \"e1da8e9b-0799-4327-9e24-216c4a51fde2\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.338494 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e1da8e9b-0799-4327-9e24-216c4a51fde2-rabbitmq-erlang-cookie\") pod \"e1da8e9b-0799-4327-9e24-216c4a51fde2\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.338549 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e1da8e9b-0799-4327-9e24-216c4a51fde2-erlang-cookie-secret\") pod \"e1da8e9b-0799-4327-9e24-216c4a51fde2\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.338585 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e1da8e9b-0799-4327-9e24-216c4a51fde2-rabbitmq-plugins\") pod \"e1da8e9b-0799-4327-9e24-216c4a51fde2\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.338619 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e1da8e9b-0799-4327-9e24-216c4a51fde2-rabbitmq-confd\") pod \"e1da8e9b-0799-4327-9e24-216c4a51fde2\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.338677 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e1da8e9b-0799-4327-9e24-216c4a51fde2-rabbitmq-tls\") pod \"e1da8e9b-0799-4327-9e24-216c4a51fde2\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.338735 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e1da8e9b-0799-4327-9e24-216c4a51fde2-pod-info\") pod \"e1da8e9b-0799-4327-9e24-216c4a51fde2\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.338841 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zn2xq\" (UniqueName: \"kubernetes.io/projected/e1da8e9b-0799-4327-9e24-216c4a51fde2-kube-api-access-zn2xq\") pod \"e1da8e9b-0799-4327-9e24-216c4a51fde2\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.338903 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e1da8e9b-0799-4327-9e24-216c4a51fde2-plugins-conf\") pod \"e1da8e9b-0799-4327-9e24-216c4a51fde2\" (UID: \"e1da8e9b-0799-4327-9e24-216c4a51fde2\") " Dec 03 
14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.339576 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1da8e9b-0799-4327-9e24-216c4a51fde2-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e1da8e9b-0799-4327-9e24-216c4a51fde2" (UID: "e1da8e9b-0799-4327-9e24-216c4a51fde2"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.340024 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1da8e9b-0799-4327-9e24-216c4a51fde2-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e1da8e9b-0799-4327-9e24-216c4a51fde2" (UID: "e1da8e9b-0799-4327-9e24-216c4a51fde2"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.341349 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1da8e9b-0799-4327-9e24-216c4a51fde2-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e1da8e9b-0799-4327-9e24-216c4a51fde2" (UID: "e1da8e9b-0799-4327-9e24-216c4a51fde2"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.346756 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1da8e9b-0799-4327-9e24-216c4a51fde2-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e1da8e9b-0799-4327-9e24-216c4a51fde2" (UID: "e1da8e9b-0799-4327-9e24-216c4a51fde2"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.346782 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e1da8e9b-0799-4327-9e24-216c4a51fde2-pod-info" (OuterVolumeSpecName: "pod-info") pod "e1da8e9b-0799-4327-9e24-216c4a51fde2" (UID: "e1da8e9b-0799-4327-9e24-216c4a51fde2"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.357147 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1da8e9b-0799-4327-9e24-216c4a51fde2-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "e1da8e9b-0799-4327-9e24-216c4a51fde2" (UID: "e1da8e9b-0799-4327-9e24-216c4a51fde2"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.357620 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1da8e9b-0799-4327-9e24-216c4a51fde2-kube-api-access-zn2xq" (OuterVolumeSpecName: "kube-api-access-zn2xq") pod "e1da8e9b-0799-4327-9e24-216c4a51fde2" (UID: "e1da8e9b-0799-4327-9e24-216c4a51fde2"). InnerVolumeSpecName "kube-api-access-zn2xq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.361445 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd51951f-d36a-49ac-969d-1444603d75a6" (OuterVolumeSpecName: "persistence") pod "e1da8e9b-0799-4327-9e24-216c4a51fde2" (UID: "e1da8e9b-0799-4327-9e24-216c4a51fde2"). InnerVolumeSpecName "pvc-cd51951f-d36a-49ac-969d-1444603d75a6". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.372835 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1da8e9b-0799-4327-9e24-216c4a51fde2-config-data" (OuterVolumeSpecName: "config-data") pod "e1da8e9b-0799-4327-9e24-216c4a51fde2" (UID: "e1da8e9b-0799-4327-9e24-216c4a51fde2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.402612 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1da8e9b-0799-4327-9e24-216c4a51fde2-server-conf" (OuterVolumeSpecName: "server-conf") pod "e1da8e9b-0799-4327-9e24-216c4a51fde2" (UID: "e1da8e9b-0799-4327-9e24-216c4a51fde2"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.402830 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e1da8e9b-0799-4327-9e24-216c4a51fde2","Type":"ContainerDied","Data":"716307fec7da66fb79e65eeee577d64d532055f9324eb1c9e6b495b8e85e94df"} Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.402884 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.441525 4751 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e1da8e9b-0799-4327-9e24-216c4a51fde2-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.441593 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-cd51951f-d36a-49ac-969d-1444603d75a6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd51951f-d36a-49ac-969d-1444603d75a6\") on node \"crc\" " Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.441611 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e1da8e9b-0799-4327-9e24-216c4a51fde2-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.441625 4751 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e1da8e9b-0799-4327-9e24-216c4a51fde2-server-conf\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.441637 4751 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e1da8e9b-0799-4327-9e24-216c4a51fde2-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.441651 4751 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e1da8e9b-0799-4327-9e24-216c4a51fde2-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.441665 4751 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e1da8e9b-0799-4327-9e24-216c4a51fde2-rabbitmq-plugins\") on node \"crc\" DevicePath 
\"\"" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.441676 4751 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e1da8e9b-0799-4327-9e24-216c4a51fde2-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.441688 4751 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e1da8e9b-0799-4327-9e24-216c4a51fde2-pod-info\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.441700 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zn2xq\" (UniqueName: \"kubernetes.io/projected/e1da8e9b-0799-4327-9e24-216c4a51fde2-kube-api-access-zn2xq\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.462522 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1da8e9b-0799-4327-9e24-216c4a51fde2-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e1da8e9b-0799-4327-9e24-216c4a51fde2" (UID: "e1da8e9b-0799-4327-9e24-216c4a51fde2"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.471724 4751 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.471924 4751 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-cd51951f-d36a-49ac-969d-1444603d75a6" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd51951f-d36a-49ac-969d-1444603d75a6") on node "crc" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.542642 4751 reconciler_common.go:293] "Volume detached for volume \"pvc-cd51951f-d36a-49ac-969d-1444603d75a6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd51951f-d36a-49ac-969d-1444603d75a6\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.542693 4751 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e1da8e9b-0799-4327-9e24-216c4a51fde2-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:19 crc kubenswrapper[4751]: E1203 14:40:19.721545 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current-tested" Dec 03 14:40:19 crc kubenswrapper[4751]: E1203 14:40:19.721595 4751 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current-tested" Dec 03 14:40:19 crc kubenswrapper[4751]: E1203 14:40:19.721728 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current-tested,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-znnqk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:
false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-m2bmn_openstack(acdd8764-947f-44ce-a5bd-4f3c139d581c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:40:19 crc kubenswrapper[4751]: E1203 14:40:19.723083 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cloudkitty-db-sync-m2bmn" podUID="acdd8764-947f-44ce-a5bd-4f3c139d581c" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.755393 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.774461 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.792029 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 14:40:19 crc kubenswrapper[4751]: E1203 14:40:19.793492 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1da8e9b-0799-4327-9e24-216c4a51fde2" containerName="setup-container" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.793516 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1da8e9b-0799-4327-9e24-216c4a51fde2" containerName="setup-container" Dec 03 14:40:19 crc kubenswrapper[4751]: E1203 14:40:19.793558 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1da8e9b-0799-4327-9e24-216c4a51fde2" containerName="rabbitmq" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.793567 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1da8e9b-0799-4327-9e24-216c4a51fde2" 
containerName="rabbitmq" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.793834 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1da8e9b-0799-4327-9e24-216c4a51fde2" containerName="rabbitmq" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.795954 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.798621 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.798819 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.798980 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.799466 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.799534 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.799697 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-df6ct" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.801433 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.804356 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.952736 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/d5fd5425-70e4-4a79-8ea7-3326cae3908d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.952820 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d5fd5425-70e4-4a79-8ea7-3326cae3908d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.952880 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d5fd5425-70e4-4a79-8ea7-3326cae3908d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.952911 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d5fd5425-70e4-4a79-8ea7-3326cae3908d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.952945 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d5fd5425-70e4-4a79-8ea7-3326cae3908d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.953146 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rq48h\" (UniqueName: \"kubernetes.io/projected/d5fd5425-70e4-4a79-8ea7-3326cae3908d-kube-api-access-rq48h\") pod \"rabbitmq-cell1-server-0\" (UID: \"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.953309 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d5fd5425-70e4-4a79-8ea7-3326cae3908d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.953481 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-cd51951f-d36a-49ac-969d-1444603d75a6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd51951f-d36a-49ac-969d-1444603d75a6\") pod \"rabbitmq-cell1-server-0\" (UID: \"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.953535 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d5fd5425-70e4-4a79-8ea7-3326cae3908d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.953633 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d5fd5425-70e4-4a79-8ea7-3326cae3908d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:19 crc kubenswrapper[4751]: I1203 14:40:19.953670 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d5fd5425-70e4-4a79-8ea7-3326cae3908d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:20 crc kubenswrapper[4751]: I1203 14:40:20.055252 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d5fd5425-70e4-4a79-8ea7-3326cae3908d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:20 crc kubenswrapper[4751]: I1203 14:40:20.055319 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d5fd5425-70e4-4a79-8ea7-3326cae3908d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:20 crc kubenswrapper[4751]: I1203 14:40:20.055378 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d5fd5425-70e4-4a79-8ea7-3326cae3908d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:20 crc kubenswrapper[4751]: I1203 14:40:20.055444 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq48h\" (UniqueName: \"kubernetes.io/projected/d5fd5425-70e4-4a79-8ea7-3326cae3908d-kube-api-access-rq48h\") pod \"rabbitmq-cell1-server-0\" (UID: \"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:20 crc kubenswrapper[4751]: I1203 14:40:20.055912 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d5fd5425-70e4-4a79-8ea7-3326cae3908d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:20 crc kubenswrapper[4751]: I1203 14:40:20.055944 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d5fd5425-70e4-4a79-8ea7-3326cae3908d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:20 crc kubenswrapper[4751]: I1203 14:40:20.056208 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d5fd5425-70e4-4a79-8ea7-3326cae3908d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:20 crc kubenswrapper[4751]: I1203 14:40:20.056394 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-cd51951f-d36a-49ac-969d-1444603d75a6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd51951f-d36a-49ac-969d-1444603d75a6\") pod \"rabbitmq-cell1-server-0\" (UID: \"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:20 crc kubenswrapper[4751]: I1203 14:40:20.056432 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d5fd5425-70e4-4a79-8ea7-3326cae3908d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:20 crc kubenswrapper[4751]: I1203 14:40:20.056517 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/d5fd5425-70e4-4a79-8ea7-3326cae3908d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:20 crc kubenswrapper[4751]: I1203 14:40:20.056537 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d5fd5425-70e4-4a79-8ea7-3326cae3908d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:20 crc kubenswrapper[4751]: I1203 14:40:20.056595 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d5fd5425-70e4-4a79-8ea7-3326cae3908d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:20 crc kubenswrapper[4751]: I1203 14:40:20.056937 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d5fd5425-70e4-4a79-8ea7-3326cae3908d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:20 crc kubenswrapper[4751]: I1203 14:40:20.057262 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d5fd5425-70e4-4a79-8ea7-3326cae3908d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:20 crc kubenswrapper[4751]: I1203 14:40:20.058124 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d5fd5425-70e4-4a79-8ea7-3326cae3908d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:20 crc kubenswrapper[4751]: I1203 14:40:20.058180 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d5fd5425-70e4-4a79-8ea7-3326cae3908d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:20 crc kubenswrapper[4751]: I1203 14:40:20.062098 4751 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 14:40:20 crc kubenswrapper[4751]: I1203 14:40:20.062142 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-cd51951f-d36a-49ac-969d-1444603d75a6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd51951f-d36a-49ac-969d-1444603d75a6\") pod \"rabbitmq-cell1-server-0\" (UID: \"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ee16707d3681dd52a4653a253214830682d6f3bd45ca60a3e117a974c1854fca/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:20 crc kubenswrapper[4751]: I1203 14:40:20.062167 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d5fd5425-70e4-4a79-8ea7-3326cae3908d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:20 crc kubenswrapper[4751]: I1203 14:40:20.062296 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d5fd5425-70e4-4a79-8ea7-3326cae3908d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:20 crc 
kubenswrapper[4751]: I1203 14:40:20.062791 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d5fd5425-70e4-4a79-8ea7-3326cae3908d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:20 crc kubenswrapper[4751]: I1203 14:40:20.064688 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d5fd5425-70e4-4a79-8ea7-3326cae3908d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:20 crc kubenswrapper[4751]: I1203 14:40:20.073415 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq48h\" (UniqueName: \"kubernetes.io/projected/d5fd5425-70e4-4a79-8ea7-3326cae3908d-kube-api-access-rq48h\") pod \"rabbitmq-cell1-server-0\" (UID: \"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:20 crc kubenswrapper[4751]: I1203 14:40:20.101414 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-cd51951f-d36a-49ac-969d-1444603d75a6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd51951f-d36a-49ac-969d-1444603d75a6\") pod \"rabbitmq-cell1-server-0\" (UID: \"d5fd5425-70e4-4a79-8ea7-3326cae3908d\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:20 crc kubenswrapper[4751]: I1203 14:40:20.126621 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:20 crc kubenswrapper[4751]: I1203 14:40:20.171161 4751 scope.go:117] "RemoveContainer" containerID="dc5a97df8731908a544fde7ee0074065ca6236a194a83bc37dc69e7162e2641e" Dec 03 14:40:20 crc kubenswrapper[4751]: E1203 14:40:20.198509 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Dec 03 14:40:20 crc kubenswrapper[4751]: E1203 14:40:20.198565 4751 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Dec 03 14:40:20 crc kubenswrapper[4751]: E1203 14:40:20.198689 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n547h574h5ddh8dhdfh59chb8h64bh5ffh57ch84hb7h55dh68ch8fh65fh677h5b8h5fbh567hc8h698hb6h77h544h549h58h7fhdfhch7chcdq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f58f6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:40:20 crc kubenswrapper[4751]: I1203 14:40:20.249108 4751 scope.go:117] "RemoveContainer" containerID="a8c83b1d5f1a85ae4bfd25ddb57357a3145271a6c2fb53b90694087de6ba2f1d" Dec 03 14:40:20 crc kubenswrapper[4751]: I1203 14:40:20.429317 4751 scope.go:117] "RemoveContainer" containerID="5f35e9d06325ea3615762c01e04cf56308efaa687e0631b5a77396e8c64782f7" Dec 03 14:40:20 crc kubenswrapper[4751]: E1203 14:40:20.436425 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current-tested\\\"\"" pod="openstack/cloudkitty-db-sync-m2bmn" podUID="acdd8764-947f-44ce-a5bd-4f3c139d581c" Dec 03 14:40:20 crc kubenswrapper[4751]: I1203 14:40:20.720027 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dc7c944bf-gf5gz"] Dec 03 14:40:20 crc kubenswrapper[4751]: W1203 14:40:20.724437 4751 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35c811c6_3da1_42ec_a2e0_78afb7711252.slice/crio-47df04945183e332f185c276ffa490c878f909038bb54eb678dbdb8f93d13843 WatchSource:0}: Error finding container 47df04945183e332f185c276ffa490c878f909038bb54eb678dbdb8f93d13843: Status 404 returned error can't find the container with id 47df04945183e332f185c276ffa490c878f909038bb54eb678dbdb8f93d13843 Dec 03 14:40:20 crc kubenswrapper[4751]: W1203 14:40:20.847498 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5fd5425_70e4_4a79_8ea7_3326cae3908d.slice/crio-4a7890e560e4d8dfa93b444ea3b1fe3954783921a3992d2ed5fbd4a742485409 WatchSource:0}: Error finding container 4a7890e560e4d8dfa93b444ea3b1fe3954783921a3992d2ed5fbd4a742485409: Status 404 returned error can't find the container with id 4a7890e560e4d8dfa93b444ea3b1fe3954783921a3992d2ed5fbd4a742485409 Dec 03 14:40:20 crc kubenswrapper[4751]: I1203 14:40:20.852484 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 14:40:20 crc kubenswrapper[4751]: I1203 14:40:20.863717 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 14:40:20 crc kubenswrapper[4751]: W1203 14:40:20.866169 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4760c776_9212_42af_8bf2_928c79417922.slice/crio-f051aa30cf9ef09f3c4a6249f0d31e43fe01432aaff3a2f76b7d17d62ea42a81 WatchSource:0}: Error finding container f051aa30cf9ef09f3c4a6249f0d31e43fe01432aaff3a2f76b7d17d62ea42a81: Status 404 returned error can't find the container with id f051aa30cf9ef09f3c4a6249f0d31e43fe01432aaff3a2f76b7d17d62ea42a81 Dec 03 14:40:21 crc kubenswrapper[4751]: I1203 14:40:21.340720 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e1da8e9b-0799-4327-9e24-216c4a51fde2" path="/var/lib/kubelet/pods/e1da8e9b-0799-4327-9e24-216c4a51fde2/volumes" Dec 03 14:40:21 crc kubenswrapper[4751]: I1203 14:40:21.444380 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4760c776-9212-42af-8bf2-928c79417922","Type":"ContainerStarted","Data":"f051aa30cf9ef09f3c4a6249f0d31e43fe01432aaff3a2f76b7d17d62ea42a81"} Dec 03 14:40:21 crc kubenswrapper[4751]: I1203 14:40:21.451484 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991","Type":"ContainerStarted","Data":"513f69a8a3af975c03de90408fdf4a4a794083da84aca38a3196c9b31a655521"} Dec 03 14:40:21 crc kubenswrapper[4751]: I1203 14:40:21.455173 4751 generic.go:334] "Generic (PLEG): container finished" podID="35c811c6-3da1-42ec-a2e0-78afb7711252" containerID="f290b968b362b86d0e7067c5e786600bad9ca833582ce9539419239bb7a8c920" exitCode=0 Dec 03 14:40:21 crc kubenswrapper[4751]: I1203 14:40:21.455230 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" event={"ID":"35c811c6-3da1-42ec-a2e0-78afb7711252","Type":"ContainerDied","Data":"f290b968b362b86d0e7067c5e786600bad9ca833582ce9539419239bb7a8c920"} Dec 03 14:40:21 crc kubenswrapper[4751]: I1203 14:40:21.455252 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" event={"ID":"35c811c6-3da1-42ec-a2e0-78afb7711252","Type":"ContainerStarted","Data":"47df04945183e332f185c276ffa490c878f909038bb54eb678dbdb8f93d13843"} Dec 03 14:40:21 crc kubenswrapper[4751]: I1203 14:40:21.458749 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d5fd5425-70e4-4a79-8ea7-3326cae3908d","Type":"ContainerStarted","Data":"4a7890e560e4d8dfa93b444ea3b1fe3954783921a3992d2ed5fbd4a742485409"} Dec 03 14:40:22 crc kubenswrapper[4751]: I1203 14:40:22.472928 4751 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991","Type":"ContainerStarted","Data":"28d1b1357456a852a8847db036da26da583b5b87a889bf39751ee610a9a7d099"} Dec 03 14:40:22 crc kubenswrapper[4751]: I1203 14:40:22.475533 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" event={"ID":"35c811c6-3da1-42ec-a2e0-78afb7711252","Type":"ContainerStarted","Data":"3f3e760cf3eaf775b67c78a432568faf845840f8dbb215e81f6281adba8780ce"} Dec 03 14:40:22 crc kubenswrapper[4751]: I1203 14:40:22.475670 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" Dec 03 14:40:22 crc kubenswrapper[4751]: I1203 14:40:22.499338 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" podStartSLOduration=20.499305325999998 podStartE2EDuration="20.499305326s" podCreationTimestamp="2025-12-03 14:40:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:40:22.492959844 +0000 UTC m=+1629.481315061" watchObservedRunningTime="2025-12-03 14:40:22.499305326 +0000 UTC m=+1629.487660543" Dec 03 14:40:23 crc kubenswrapper[4751]: E1203 14:40:23.172366 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991" Dec 03 14:40:23 crc kubenswrapper[4751]: I1203 14:40:23.492022 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4760c776-9212-42af-8bf2-928c79417922","Type":"ContainerStarted","Data":"a9a61c8d66a8e3560cc611afcc95fc6409e45746f4e74ff2f502db7e3ccfd7e4"} Dec 03 14:40:23 crc kubenswrapper[4751]: I1203 14:40:23.499156 
4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991","Type":"ContainerStarted","Data":"799b2462fd24565add91b6d49f4f7f14bc6b2c66205754fde7d7a6c4ef438896"} Dec 03 14:40:23 crc kubenswrapper[4751]: I1203 14:40:23.502630 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d5fd5425-70e4-4a79-8ea7-3326cae3908d","Type":"ContainerStarted","Data":"3a3e35c139f0ebc5543a5bce979a4196e8bdae6d84fef27b6163ea481e57d247"} Dec 03 14:40:23 crc kubenswrapper[4751]: E1203 14:40:23.503692 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991" Dec 03 14:40:24 crc kubenswrapper[4751]: I1203 14:40:24.513534 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 14:40:24 crc kubenswrapper[4751]: E1203 14:40:24.514104 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991" Dec 03 14:40:25 crc kubenswrapper[4751]: E1203 14:40:25.526908 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991" Dec 03 14:40:27 crc kubenswrapper[4751]: I1203 14:40:27.724483 
4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" Dec 03 14:40:27 crc kubenswrapper[4751]: I1203 14:40:27.796815 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54dd998c-glbss"] Dec 03 14:40:27 crc kubenswrapper[4751]: I1203 14:40:27.797137 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54dd998c-glbss" podUID="8481969b-2092-4b8c-9a57-9f83972d0997" containerName="dnsmasq-dns" containerID="cri-o://6de6c6aa98bde00ac9c3b5900850b5e4ad63b65ad7ddd09237c20a155560191c" gracePeriod=10 Dec 03 14:40:27 crc kubenswrapper[4751]: I1203 14:40:27.978644 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c4b758ff5-hjq5l"] Dec 03 14:40:27 crc kubenswrapper[4751]: I1203 14:40:27.981271 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c4b758ff5-hjq5l" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.007458 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c4b758ff5-hjq5l"] Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.141101 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/941e6cf3-002b-476c-8347-dfc11a32b067-openstack-edpm-ipam\") pod \"dnsmasq-dns-c4b758ff5-hjq5l\" (UID: \"941e6cf3-002b-476c-8347-dfc11a32b067\") " pod="openstack/dnsmasq-dns-c4b758ff5-hjq5l" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.141239 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/941e6cf3-002b-476c-8347-dfc11a32b067-ovsdbserver-sb\") pod \"dnsmasq-dns-c4b758ff5-hjq5l\" (UID: \"941e6cf3-002b-476c-8347-dfc11a32b067\") " pod="openstack/dnsmasq-dns-c4b758ff5-hjq5l" Dec 03 14:40:28 crc 
kubenswrapper[4751]: I1203 14:40:28.141392 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/941e6cf3-002b-476c-8347-dfc11a32b067-dns-svc\") pod \"dnsmasq-dns-c4b758ff5-hjq5l\" (UID: \"941e6cf3-002b-476c-8347-dfc11a32b067\") " pod="openstack/dnsmasq-dns-c4b758ff5-hjq5l" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.141454 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/941e6cf3-002b-476c-8347-dfc11a32b067-config\") pod \"dnsmasq-dns-c4b758ff5-hjq5l\" (UID: \"941e6cf3-002b-476c-8347-dfc11a32b067\") " pod="openstack/dnsmasq-dns-c4b758ff5-hjq5l" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.141952 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5dnn\" (UniqueName: \"kubernetes.io/projected/941e6cf3-002b-476c-8347-dfc11a32b067-kube-api-access-m5dnn\") pod \"dnsmasq-dns-c4b758ff5-hjq5l\" (UID: \"941e6cf3-002b-476c-8347-dfc11a32b067\") " pod="openstack/dnsmasq-dns-c4b758ff5-hjq5l" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.142024 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/941e6cf3-002b-476c-8347-dfc11a32b067-ovsdbserver-nb\") pod \"dnsmasq-dns-c4b758ff5-hjq5l\" (UID: \"941e6cf3-002b-476c-8347-dfc11a32b067\") " pod="openstack/dnsmasq-dns-c4b758ff5-hjq5l" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.142280 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/941e6cf3-002b-476c-8347-dfc11a32b067-dns-swift-storage-0\") pod \"dnsmasq-dns-c4b758ff5-hjq5l\" (UID: \"941e6cf3-002b-476c-8347-dfc11a32b067\") " 
pod="openstack/dnsmasq-dns-c4b758ff5-hjq5l" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.249064 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/941e6cf3-002b-476c-8347-dfc11a32b067-ovsdbserver-sb\") pod \"dnsmasq-dns-c4b758ff5-hjq5l\" (UID: \"941e6cf3-002b-476c-8347-dfc11a32b067\") " pod="openstack/dnsmasq-dns-c4b758ff5-hjq5l" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.249358 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/941e6cf3-002b-476c-8347-dfc11a32b067-dns-svc\") pod \"dnsmasq-dns-c4b758ff5-hjq5l\" (UID: \"941e6cf3-002b-476c-8347-dfc11a32b067\") " pod="openstack/dnsmasq-dns-c4b758ff5-hjq5l" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.249418 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/941e6cf3-002b-476c-8347-dfc11a32b067-config\") pod \"dnsmasq-dns-c4b758ff5-hjq5l\" (UID: \"941e6cf3-002b-476c-8347-dfc11a32b067\") " pod="openstack/dnsmasq-dns-c4b758ff5-hjq5l" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.249559 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5dnn\" (UniqueName: \"kubernetes.io/projected/941e6cf3-002b-476c-8347-dfc11a32b067-kube-api-access-m5dnn\") pod \"dnsmasq-dns-c4b758ff5-hjq5l\" (UID: \"941e6cf3-002b-476c-8347-dfc11a32b067\") " pod="openstack/dnsmasq-dns-c4b758ff5-hjq5l" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.249597 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/941e6cf3-002b-476c-8347-dfc11a32b067-ovsdbserver-nb\") pod \"dnsmasq-dns-c4b758ff5-hjq5l\" (UID: \"941e6cf3-002b-476c-8347-dfc11a32b067\") " pod="openstack/dnsmasq-dns-c4b758ff5-hjq5l" Dec 03 14:40:28 crc 
kubenswrapper[4751]: I1203 14:40:28.249676 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/941e6cf3-002b-476c-8347-dfc11a32b067-dns-swift-storage-0\") pod \"dnsmasq-dns-c4b758ff5-hjq5l\" (UID: \"941e6cf3-002b-476c-8347-dfc11a32b067\") " pod="openstack/dnsmasq-dns-c4b758ff5-hjq5l" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.249755 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/941e6cf3-002b-476c-8347-dfc11a32b067-openstack-edpm-ipam\") pod \"dnsmasq-dns-c4b758ff5-hjq5l\" (UID: \"941e6cf3-002b-476c-8347-dfc11a32b067\") " pod="openstack/dnsmasq-dns-c4b758ff5-hjq5l" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.250157 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/941e6cf3-002b-476c-8347-dfc11a32b067-ovsdbserver-sb\") pod \"dnsmasq-dns-c4b758ff5-hjq5l\" (UID: \"941e6cf3-002b-476c-8347-dfc11a32b067\") " pod="openstack/dnsmasq-dns-c4b758ff5-hjq5l" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.250353 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/941e6cf3-002b-476c-8347-dfc11a32b067-dns-svc\") pod \"dnsmasq-dns-c4b758ff5-hjq5l\" (UID: \"941e6cf3-002b-476c-8347-dfc11a32b067\") " pod="openstack/dnsmasq-dns-c4b758ff5-hjq5l" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.250415 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/941e6cf3-002b-476c-8347-dfc11a32b067-config\") pod \"dnsmasq-dns-c4b758ff5-hjq5l\" (UID: \"941e6cf3-002b-476c-8347-dfc11a32b067\") " pod="openstack/dnsmasq-dns-c4b758ff5-hjq5l" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.250700 4751 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/941e6cf3-002b-476c-8347-dfc11a32b067-openstack-edpm-ipam\") pod \"dnsmasq-dns-c4b758ff5-hjq5l\" (UID: \"941e6cf3-002b-476c-8347-dfc11a32b067\") " pod="openstack/dnsmasq-dns-c4b758ff5-hjq5l" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.250873 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/941e6cf3-002b-476c-8347-dfc11a32b067-ovsdbserver-nb\") pod \"dnsmasq-dns-c4b758ff5-hjq5l\" (UID: \"941e6cf3-002b-476c-8347-dfc11a32b067\") " pod="openstack/dnsmasq-dns-c4b758ff5-hjq5l" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.250997 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/941e6cf3-002b-476c-8347-dfc11a32b067-dns-swift-storage-0\") pod \"dnsmasq-dns-c4b758ff5-hjq5l\" (UID: \"941e6cf3-002b-476c-8347-dfc11a32b067\") " pod="openstack/dnsmasq-dns-c4b758ff5-hjq5l" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.279836 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5dnn\" (UniqueName: \"kubernetes.io/projected/941e6cf3-002b-476c-8347-dfc11a32b067-kube-api-access-m5dnn\") pod \"dnsmasq-dns-c4b758ff5-hjq5l\" (UID: \"941e6cf3-002b-476c-8347-dfc11a32b067\") " pod="openstack/dnsmasq-dns-c4b758ff5-hjq5l" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.304338 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c4b758ff5-hjq5l" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.511870 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54dd998c-glbss" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.556967 4751 generic.go:334] "Generic (PLEG): container finished" podID="8481969b-2092-4b8c-9a57-9f83972d0997" containerID="6de6c6aa98bde00ac9c3b5900850b5e4ad63b65ad7ddd09237c20a155560191c" exitCode=0 Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.557249 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54dd998c-glbss" event={"ID":"8481969b-2092-4b8c-9a57-9f83972d0997","Type":"ContainerDied","Data":"6de6c6aa98bde00ac9c3b5900850b5e4ad63b65ad7ddd09237c20a155560191c"} Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.557369 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54dd998c-glbss" event={"ID":"8481969b-2092-4b8c-9a57-9f83972d0997","Type":"ContainerDied","Data":"fd343a6666b1fb43ca535ad571a21407d8ce81bda3f667bd00a7b665045cd773"} Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.557493 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54dd998c-glbss" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.557503 4751 scope.go:117] "RemoveContainer" containerID="6de6c6aa98bde00ac9c3b5900850b5e4ad63b65ad7ddd09237c20a155560191c" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.611064 4751 scope.go:117] "RemoveContainer" containerID="6607fad5f11063f6ca36fb380ba79923ed4a863415c9076733af94fec9a507ca" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.636009 4751 scope.go:117] "RemoveContainer" containerID="6de6c6aa98bde00ac9c3b5900850b5e4ad63b65ad7ddd09237c20a155560191c" Dec 03 14:40:28 crc kubenswrapper[4751]: E1203 14:40:28.636564 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6de6c6aa98bde00ac9c3b5900850b5e4ad63b65ad7ddd09237c20a155560191c\": container with ID starting with 6de6c6aa98bde00ac9c3b5900850b5e4ad63b65ad7ddd09237c20a155560191c not found: ID does not exist" containerID="6de6c6aa98bde00ac9c3b5900850b5e4ad63b65ad7ddd09237c20a155560191c" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.636619 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6de6c6aa98bde00ac9c3b5900850b5e4ad63b65ad7ddd09237c20a155560191c"} err="failed to get container status \"6de6c6aa98bde00ac9c3b5900850b5e4ad63b65ad7ddd09237c20a155560191c\": rpc error: code = NotFound desc = could not find container \"6de6c6aa98bde00ac9c3b5900850b5e4ad63b65ad7ddd09237c20a155560191c\": container with ID starting with 6de6c6aa98bde00ac9c3b5900850b5e4ad63b65ad7ddd09237c20a155560191c not found: ID does not exist" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.636653 4751 scope.go:117] "RemoveContainer" containerID="6607fad5f11063f6ca36fb380ba79923ed4a863415c9076733af94fec9a507ca" Dec 03 14:40:28 crc kubenswrapper[4751]: E1203 14:40:28.637166 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"6607fad5f11063f6ca36fb380ba79923ed4a863415c9076733af94fec9a507ca\": container with ID starting with 6607fad5f11063f6ca36fb380ba79923ed4a863415c9076733af94fec9a507ca not found: ID does not exist" containerID="6607fad5f11063f6ca36fb380ba79923ed4a863415c9076733af94fec9a507ca" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.637193 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6607fad5f11063f6ca36fb380ba79923ed4a863415c9076733af94fec9a507ca"} err="failed to get container status \"6607fad5f11063f6ca36fb380ba79923ed4a863415c9076733af94fec9a507ca\": rpc error: code = NotFound desc = could not find container \"6607fad5f11063f6ca36fb380ba79923ed4a863415c9076733af94fec9a507ca\": container with ID starting with 6607fad5f11063f6ca36fb380ba79923ed4a863415c9076733af94fec9a507ca not found: ID does not exist" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.659693 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8481969b-2092-4b8c-9a57-9f83972d0997-dns-swift-storage-0\") pod \"8481969b-2092-4b8c-9a57-9f83972d0997\" (UID: \"8481969b-2092-4b8c-9a57-9f83972d0997\") " Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.659850 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8481969b-2092-4b8c-9a57-9f83972d0997-ovsdbserver-nb\") pod \"8481969b-2092-4b8c-9a57-9f83972d0997\" (UID: \"8481969b-2092-4b8c-9a57-9f83972d0997\") " Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.659986 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8481969b-2092-4b8c-9a57-9f83972d0997-ovsdbserver-sb\") pod \"8481969b-2092-4b8c-9a57-9f83972d0997\" (UID: \"8481969b-2092-4b8c-9a57-9f83972d0997\") " Dec 03 14:40:28 crc 
kubenswrapper[4751]: I1203 14:40:28.660072 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8481969b-2092-4b8c-9a57-9f83972d0997-dns-svc\") pod \"8481969b-2092-4b8c-9a57-9f83972d0997\" (UID: \"8481969b-2092-4b8c-9a57-9f83972d0997\") " Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.660108 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8481969b-2092-4b8c-9a57-9f83972d0997-config\") pod \"8481969b-2092-4b8c-9a57-9f83972d0997\" (UID: \"8481969b-2092-4b8c-9a57-9f83972d0997\") " Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.660175 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzfz6\" (UniqueName: \"kubernetes.io/projected/8481969b-2092-4b8c-9a57-9f83972d0997-kube-api-access-mzfz6\") pod \"8481969b-2092-4b8c-9a57-9f83972d0997\" (UID: \"8481969b-2092-4b8c-9a57-9f83972d0997\") " Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.669520 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8481969b-2092-4b8c-9a57-9f83972d0997-kube-api-access-mzfz6" (OuterVolumeSpecName: "kube-api-access-mzfz6") pod "8481969b-2092-4b8c-9a57-9f83972d0997" (UID: "8481969b-2092-4b8c-9a57-9f83972d0997"). InnerVolumeSpecName "kube-api-access-mzfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.722557 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8481969b-2092-4b8c-9a57-9f83972d0997-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8481969b-2092-4b8c-9a57-9f83972d0997" (UID: "8481969b-2092-4b8c-9a57-9f83972d0997"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.723953 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8481969b-2092-4b8c-9a57-9f83972d0997-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8481969b-2092-4b8c-9a57-9f83972d0997" (UID: "8481969b-2092-4b8c-9a57-9f83972d0997"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.726588 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8481969b-2092-4b8c-9a57-9f83972d0997-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8481969b-2092-4b8c-9a57-9f83972d0997" (UID: "8481969b-2092-4b8c-9a57-9f83972d0997"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.729731 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8481969b-2092-4b8c-9a57-9f83972d0997-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8481969b-2092-4b8c-9a57-9f83972d0997" (UID: "8481969b-2092-4b8c-9a57-9f83972d0997"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.739541 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8481969b-2092-4b8c-9a57-9f83972d0997-config" (OuterVolumeSpecName: "config") pod "8481969b-2092-4b8c-9a57-9f83972d0997" (UID: "8481969b-2092-4b8c-9a57-9f83972d0997"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.764405 4751 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8481969b-2092-4b8c-9a57-9f83972d0997-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.764439 4751 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8481969b-2092-4b8c-9a57-9f83972d0997-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.764449 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8481969b-2092-4b8c-9a57-9f83972d0997-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.764459 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzfz6\" (UniqueName: \"kubernetes.io/projected/8481969b-2092-4b8c-9a57-9f83972d0997-kube-api-access-mzfz6\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.764471 4751 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8481969b-2092-4b8c-9a57-9f83972d0997-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.764480 4751 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8481969b-2092-4b8c-9a57-9f83972d0997-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.824240 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c4b758ff5-hjq5l"] Dec 03 14:40:28 crc kubenswrapper[4751]: I1203 14:40:28.896384 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54dd998c-glbss"] Dec 03 14:40:28 
crc kubenswrapper[4751]: I1203 14:40:28.916820 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54dd998c-glbss"] Dec 03 14:40:29 crc kubenswrapper[4751]: I1203 14:40:29.339140 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8481969b-2092-4b8c-9a57-9f83972d0997" path="/var/lib/kubelet/pods/8481969b-2092-4b8c-9a57-9f83972d0997/volumes" Dec 03 14:40:29 crc kubenswrapper[4751]: I1203 14:40:29.568462 4751 generic.go:334] "Generic (PLEG): container finished" podID="941e6cf3-002b-476c-8347-dfc11a32b067" containerID="19f76790e17a2c6027109b5b613e78006382a9144ef4954d95296bf0dd72cac0" exitCode=0 Dec 03 14:40:29 crc kubenswrapper[4751]: I1203 14:40:29.568536 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c4b758ff5-hjq5l" event={"ID":"941e6cf3-002b-476c-8347-dfc11a32b067","Type":"ContainerDied","Data":"19f76790e17a2c6027109b5b613e78006382a9144ef4954d95296bf0dd72cac0"} Dec 03 14:40:29 crc kubenswrapper[4751]: I1203 14:40:29.568566 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c4b758ff5-hjq5l" event={"ID":"941e6cf3-002b-476c-8347-dfc11a32b067","Type":"ContainerStarted","Data":"8880aedcc5ebd0e7af4f46f0b18647bc2393635faeb026d6a3ab23f42744d1f6"} Dec 03 14:40:30 crc kubenswrapper[4751]: I1203 14:40:30.582704 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c4b758ff5-hjq5l" event={"ID":"941e6cf3-002b-476c-8347-dfc11a32b067","Type":"ContainerStarted","Data":"4de92eb2a99df60286e08871e244b26489c1ffe28bacac716830111b9829f0c7"} Dec 03 14:40:30 crc kubenswrapper[4751]: I1203 14:40:30.583200 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-c4b758ff5-hjq5l" Dec 03 14:40:30 crc kubenswrapper[4751]: I1203 14:40:30.608812 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-c4b758ff5-hjq5l" podStartSLOduration=3.608792966 
podStartE2EDuration="3.608792966s" podCreationTimestamp="2025-12-03 14:40:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:40:30.598152217 +0000 UTC m=+1637.586507434" watchObservedRunningTime="2025-12-03 14:40:30.608792966 +0000 UTC m=+1637.597148193" Dec 03 14:40:34 crc kubenswrapper[4751]: I1203 14:40:34.843147 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 14:40:35 crc kubenswrapper[4751]: I1203 14:40:35.643016 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-m2bmn" event={"ID":"acdd8764-947f-44ce-a5bd-4f3c139d581c","Type":"ContainerStarted","Data":"7e02254cca8413a97e385dc466a9fb31f3a4c304fbee5d201a8c28057a264138"} Dec 03 14:40:35 crc kubenswrapper[4751]: I1203 14:40:35.666424 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-m2bmn" podStartSLOduration=2.593183236 podStartE2EDuration="56.66640429s" podCreationTimestamp="2025-12-03 14:39:39 +0000 UTC" firstStartedPulling="2025-12-03 14:39:40.765764711 +0000 UTC m=+1587.754119928" lastFinishedPulling="2025-12-03 14:40:34.838985765 +0000 UTC m=+1641.827340982" observedRunningTime="2025-12-03 14:40:35.659699718 +0000 UTC m=+1642.648054955" watchObservedRunningTime="2025-12-03 14:40:35.66640429 +0000 UTC m=+1642.654759507" Dec 03 14:40:35 crc kubenswrapper[4751]: I1203 14:40:35.821796 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:40:35 crc kubenswrapper[4751]: I1203 14:40:35.821856 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" 
podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:40:38 crc kubenswrapper[4751]: I1203 14:40:38.306245 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-c4b758ff5-hjq5l" Dec 03 14:40:38 crc kubenswrapper[4751]: I1203 14:40:38.386059 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dc7c944bf-gf5gz"] Dec 03 14:40:38 crc kubenswrapper[4751]: I1203 14:40:38.386429 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" podUID="35c811c6-3da1-42ec-a2e0-78afb7711252" containerName="dnsmasq-dns" containerID="cri-o://3f3e760cf3eaf775b67c78a432568faf845840f8dbb215e81f6281adba8780ce" gracePeriod=10 Dec 03 14:40:38 crc kubenswrapper[4751]: I1203 14:40:38.671002 4751 generic.go:334] "Generic (PLEG): container finished" podID="acdd8764-947f-44ce-a5bd-4f3c139d581c" containerID="7e02254cca8413a97e385dc466a9fb31f3a4c304fbee5d201a8c28057a264138" exitCode=0 Dec 03 14:40:38 crc kubenswrapper[4751]: I1203 14:40:38.671223 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-m2bmn" event={"ID":"acdd8764-947f-44ce-a5bd-4f3c139d581c","Type":"ContainerDied","Data":"7e02254cca8413a97e385dc466a9fb31f3a4c304fbee5d201a8c28057a264138"} Dec 03 14:40:39 crc kubenswrapper[4751]: I1203 14:40:39.479654 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" Dec 03 14:40:39 crc kubenswrapper[4751]: I1203 14:40:39.626306 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-ovsdbserver-sb\") pod \"35c811c6-3da1-42ec-a2e0-78afb7711252\" (UID: \"35c811c6-3da1-42ec-a2e0-78afb7711252\") " Dec 03 14:40:39 crc kubenswrapper[4751]: I1203 14:40:39.626705 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-ovsdbserver-nb\") pod \"35c811c6-3da1-42ec-a2e0-78afb7711252\" (UID: \"35c811c6-3da1-42ec-a2e0-78afb7711252\") " Dec 03 14:40:39 crc kubenswrapper[4751]: I1203 14:40:39.626870 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-dns-svc\") pod \"35c811c6-3da1-42ec-a2e0-78afb7711252\" (UID: \"35c811c6-3da1-42ec-a2e0-78afb7711252\") " Dec 03 14:40:39 crc kubenswrapper[4751]: I1203 14:40:39.626903 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-openstack-edpm-ipam\") pod \"35c811c6-3da1-42ec-a2e0-78afb7711252\" (UID: \"35c811c6-3da1-42ec-a2e0-78afb7711252\") " Dec 03 14:40:39 crc kubenswrapper[4751]: I1203 14:40:39.626938 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-dns-swift-storage-0\") pod \"35c811c6-3da1-42ec-a2e0-78afb7711252\" (UID: \"35c811c6-3da1-42ec-a2e0-78afb7711252\") " Dec 03 14:40:39 crc kubenswrapper[4751]: I1203 14:40:39.627020 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-config\") pod \"35c811c6-3da1-42ec-a2e0-78afb7711252\" (UID: \"35c811c6-3da1-42ec-a2e0-78afb7711252\") " Dec 03 14:40:39 crc kubenswrapper[4751]: I1203 14:40:39.627046 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kh26s\" (UniqueName: \"kubernetes.io/projected/35c811c6-3da1-42ec-a2e0-78afb7711252-kube-api-access-kh26s\") pod \"35c811c6-3da1-42ec-a2e0-78afb7711252\" (UID: \"35c811c6-3da1-42ec-a2e0-78afb7711252\") " Dec 03 14:40:39 crc kubenswrapper[4751]: I1203 14:40:39.634306 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35c811c6-3da1-42ec-a2e0-78afb7711252-kube-api-access-kh26s" (OuterVolumeSpecName: "kube-api-access-kh26s") pod "35c811c6-3da1-42ec-a2e0-78afb7711252" (UID: "35c811c6-3da1-42ec-a2e0-78afb7711252"). InnerVolumeSpecName "kube-api-access-kh26s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:40:39 crc kubenswrapper[4751]: I1203 14:40:39.685717 4751 generic.go:334] "Generic (PLEG): container finished" podID="35c811c6-3da1-42ec-a2e0-78afb7711252" containerID="3f3e760cf3eaf775b67c78a432568faf845840f8dbb215e81f6281adba8780ce" exitCode=0 Dec 03 14:40:39 crc kubenswrapper[4751]: I1203 14:40:39.685931 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" Dec 03 14:40:39 crc kubenswrapper[4751]: I1203 14:40:39.686857 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" event={"ID":"35c811c6-3da1-42ec-a2e0-78afb7711252","Type":"ContainerDied","Data":"3f3e760cf3eaf775b67c78a432568faf845840f8dbb215e81f6281adba8780ce"} Dec 03 14:40:39 crc kubenswrapper[4751]: I1203 14:40:39.686891 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dc7c944bf-gf5gz" event={"ID":"35c811c6-3da1-42ec-a2e0-78afb7711252","Type":"ContainerDied","Data":"47df04945183e332f185c276ffa490c878f909038bb54eb678dbdb8f93d13843"} Dec 03 14:40:39 crc kubenswrapper[4751]: I1203 14:40:39.686907 4751 scope.go:117] "RemoveContainer" containerID="3f3e760cf3eaf775b67c78a432568faf845840f8dbb215e81f6281adba8780ce" Dec 03 14:40:39 crc kubenswrapper[4751]: I1203 14:40:39.697394 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "35c811c6-3da1-42ec-a2e0-78afb7711252" (UID: "35c811c6-3da1-42ec-a2e0-78afb7711252"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:40:39 crc kubenswrapper[4751]: I1203 14:40:39.698472 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "35c811c6-3da1-42ec-a2e0-78afb7711252" (UID: "35c811c6-3da1-42ec-a2e0-78afb7711252"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:40:39 crc kubenswrapper[4751]: I1203 14:40:39.699557 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "35c811c6-3da1-42ec-a2e0-78afb7711252" (UID: "35c811c6-3da1-42ec-a2e0-78afb7711252"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:40:39 crc kubenswrapper[4751]: I1203 14:40:39.711928 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "35c811c6-3da1-42ec-a2e0-78afb7711252" (UID: "35c811c6-3da1-42ec-a2e0-78afb7711252"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:40:39 crc kubenswrapper[4751]: I1203 14:40:39.723725 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-config" (OuterVolumeSpecName: "config") pod "35c811c6-3da1-42ec-a2e0-78afb7711252" (UID: "35c811c6-3da1-42ec-a2e0-78afb7711252"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:40:39 crc kubenswrapper[4751]: I1203 14:40:39.729619 4751 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:39 crc kubenswrapper[4751]: I1203 14:40:39.729648 4751 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:39 crc kubenswrapper[4751]: I1203 14:40:39.729658 4751 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:39 crc kubenswrapper[4751]: I1203 14:40:39.729666 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:39 crc kubenswrapper[4751]: I1203 14:40:39.729675 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kh26s\" (UniqueName: \"kubernetes.io/projected/35c811c6-3da1-42ec-a2e0-78afb7711252-kube-api-access-kh26s\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:39 crc kubenswrapper[4751]: I1203 14:40:39.729683 4751 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:39 crc kubenswrapper[4751]: I1203 14:40:39.762529 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "35c811c6-3da1-42ec-a2e0-78afb7711252" (UID: 
"35c811c6-3da1-42ec-a2e0-78afb7711252"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:40:39 crc kubenswrapper[4751]: I1203 14:40:39.831955 4751 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/35c811c6-3da1-42ec-a2e0-78afb7711252-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:39 crc kubenswrapper[4751]: I1203 14:40:39.863411 4751 scope.go:117] "RemoveContainer" containerID="f290b968b362b86d0e7067c5e786600bad9ca833582ce9539419239bb7a8c920" Dec 03 14:40:39 crc kubenswrapper[4751]: I1203 14:40:39.916545 4751 scope.go:117] "RemoveContainer" containerID="3f3e760cf3eaf775b67c78a432568faf845840f8dbb215e81f6281adba8780ce" Dec 03 14:40:39 crc kubenswrapper[4751]: E1203 14:40:39.928435 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f3e760cf3eaf775b67c78a432568faf845840f8dbb215e81f6281adba8780ce\": container with ID starting with 3f3e760cf3eaf775b67c78a432568faf845840f8dbb215e81f6281adba8780ce not found: ID does not exist" containerID="3f3e760cf3eaf775b67c78a432568faf845840f8dbb215e81f6281adba8780ce" Dec 03 14:40:39 crc kubenswrapper[4751]: I1203 14:40:39.928485 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f3e760cf3eaf775b67c78a432568faf845840f8dbb215e81f6281adba8780ce"} err="failed to get container status \"3f3e760cf3eaf775b67c78a432568faf845840f8dbb215e81f6281adba8780ce\": rpc error: code = NotFound desc = could not find container \"3f3e760cf3eaf775b67c78a432568faf845840f8dbb215e81f6281adba8780ce\": container with ID starting with 3f3e760cf3eaf775b67c78a432568faf845840f8dbb215e81f6281adba8780ce not found: ID does not exist" Dec 03 14:40:39 crc kubenswrapper[4751]: I1203 14:40:39.928511 4751 scope.go:117] "RemoveContainer" containerID="f290b968b362b86d0e7067c5e786600bad9ca833582ce9539419239bb7a8c920" 
Dec 03 14:40:39 crc kubenswrapper[4751]: E1203 14:40:39.928947 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f290b968b362b86d0e7067c5e786600bad9ca833582ce9539419239bb7a8c920\": container with ID starting with f290b968b362b86d0e7067c5e786600bad9ca833582ce9539419239bb7a8c920 not found: ID does not exist" containerID="f290b968b362b86d0e7067c5e786600bad9ca833582ce9539419239bb7a8c920" Dec 03 14:40:39 crc kubenswrapper[4751]: I1203 14:40:39.928968 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f290b968b362b86d0e7067c5e786600bad9ca833582ce9539419239bb7a8c920"} err="failed to get container status \"f290b968b362b86d0e7067c5e786600bad9ca833582ce9539419239bb7a8c920\": rpc error: code = NotFound desc = could not find container \"f290b968b362b86d0e7067c5e786600bad9ca833582ce9539419239bb7a8c920\": container with ID starting with f290b968b362b86d0e7067c5e786600bad9ca833582ce9539419239bb7a8c920 not found: ID does not exist" Dec 03 14:40:40 crc kubenswrapper[4751]: I1203 14:40:40.031305 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dc7c944bf-gf5gz"] Dec 03 14:40:40 crc kubenswrapper[4751]: I1203 14:40:40.042412 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-dc7c944bf-gf5gz"] Dec 03 14:40:40 crc kubenswrapper[4751]: I1203 14:40:40.171908 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-m2bmn" Dec 03 14:40:40 crc kubenswrapper[4751]: I1203 14:40:40.341134 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znnqk\" (UniqueName: \"kubernetes.io/projected/acdd8764-947f-44ce-a5bd-4f3c139d581c-kube-api-access-znnqk\") pod \"acdd8764-947f-44ce-a5bd-4f3c139d581c\" (UID: \"acdd8764-947f-44ce-a5bd-4f3c139d581c\") " Dec 03 14:40:40 crc kubenswrapper[4751]: I1203 14:40:40.341202 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acdd8764-947f-44ce-a5bd-4f3c139d581c-config-data\") pod \"acdd8764-947f-44ce-a5bd-4f3c139d581c\" (UID: \"acdd8764-947f-44ce-a5bd-4f3c139d581c\") " Dec 03 14:40:40 crc kubenswrapper[4751]: I1203 14:40:40.341227 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acdd8764-947f-44ce-a5bd-4f3c139d581c-scripts\") pod \"acdd8764-947f-44ce-a5bd-4f3c139d581c\" (UID: \"acdd8764-947f-44ce-a5bd-4f3c139d581c\") " Dec 03 14:40:40 crc kubenswrapper[4751]: I1203 14:40:40.341382 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/acdd8764-947f-44ce-a5bd-4f3c139d581c-certs\") pod \"acdd8764-947f-44ce-a5bd-4f3c139d581c\" (UID: \"acdd8764-947f-44ce-a5bd-4f3c139d581c\") " Dec 03 14:40:40 crc kubenswrapper[4751]: I1203 14:40:40.341487 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acdd8764-947f-44ce-a5bd-4f3c139d581c-combined-ca-bundle\") pod \"acdd8764-947f-44ce-a5bd-4f3c139d581c\" (UID: \"acdd8764-947f-44ce-a5bd-4f3c139d581c\") " Dec 03 14:40:40 crc kubenswrapper[4751]: I1203 14:40:40.346278 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/acdd8764-947f-44ce-a5bd-4f3c139d581c-kube-api-access-znnqk" (OuterVolumeSpecName: "kube-api-access-znnqk") pod "acdd8764-947f-44ce-a5bd-4f3c139d581c" (UID: "acdd8764-947f-44ce-a5bd-4f3c139d581c"). InnerVolumeSpecName "kube-api-access-znnqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:40:40 crc kubenswrapper[4751]: I1203 14:40:40.346886 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acdd8764-947f-44ce-a5bd-4f3c139d581c-scripts" (OuterVolumeSpecName: "scripts") pod "acdd8764-947f-44ce-a5bd-4f3c139d581c" (UID: "acdd8764-947f-44ce-a5bd-4f3c139d581c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:40:40 crc kubenswrapper[4751]: I1203 14:40:40.347701 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 14:40:40 crc kubenswrapper[4751]: I1203 14:40:40.350495 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acdd8764-947f-44ce-a5bd-4f3c139d581c-certs" (OuterVolumeSpecName: "certs") pod "acdd8764-947f-44ce-a5bd-4f3c139d581c" (UID: "acdd8764-947f-44ce-a5bd-4f3c139d581c"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:40:40 crc kubenswrapper[4751]: I1203 14:40:40.379678 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acdd8764-947f-44ce-a5bd-4f3c139d581c-config-data" (OuterVolumeSpecName: "config-data") pod "acdd8764-947f-44ce-a5bd-4f3c139d581c" (UID: "acdd8764-947f-44ce-a5bd-4f3c139d581c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:40:40 crc kubenswrapper[4751]: I1203 14:40:40.395790 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acdd8764-947f-44ce-a5bd-4f3c139d581c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "acdd8764-947f-44ce-a5bd-4f3c139d581c" (UID: "acdd8764-947f-44ce-a5bd-4f3c139d581c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:40:40 crc kubenswrapper[4751]: I1203 14:40:40.444480 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acdd8764-947f-44ce-a5bd-4f3c139d581c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:40 crc kubenswrapper[4751]: I1203 14:40:40.444507 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znnqk\" (UniqueName: \"kubernetes.io/projected/acdd8764-947f-44ce-a5bd-4f3c139d581c-kube-api-access-znnqk\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:40 crc kubenswrapper[4751]: I1203 14:40:40.444516 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acdd8764-947f-44ce-a5bd-4f3c139d581c-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:40 crc kubenswrapper[4751]: I1203 14:40:40.444524 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acdd8764-947f-44ce-a5bd-4f3c139d581c-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:40 crc kubenswrapper[4751]: I1203 14:40:40.444533 4751 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/acdd8764-947f-44ce-a5bd-4f3c139d581c-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:40 crc kubenswrapper[4751]: I1203 14:40:40.697083 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-m2bmn" Dec 03 14:40:40 crc kubenswrapper[4751]: I1203 14:40:40.697287 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-m2bmn" event={"ID":"acdd8764-947f-44ce-a5bd-4f3c139d581c","Type":"ContainerDied","Data":"9a20d373fcbcb3c4c7a8e9e94f422c3624a279146b94b28f36b3440d62f6565b"} Dec 03 14:40:40 crc kubenswrapper[4751]: I1203 14:40:40.697666 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a20d373fcbcb3c4c7a8e9e94f422c3624a279146b94b28f36b3440d62f6565b" Dec 03 14:40:40 crc kubenswrapper[4751]: I1203 14:40:40.874462 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-n9kjc"] Dec 03 14:40:40 crc kubenswrapper[4751]: I1203 14:40:40.900701 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-n9kjc"] Dec 03 14:40:40 crc kubenswrapper[4751]: I1203 14:40:40.998690 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-wh7hc"] Dec 03 14:40:40 crc kubenswrapper[4751]: E1203 14:40:40.999387 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8481969b-2092-4b8c-9a57-9f83972d0997" containerName="dnsmasq-dns" Dec 03 14:40:40 crc kubenswrapper[4751]: I1203 14:40:40.999456 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="8481969b-2092-4b8c-9a57-9f83972d0997" containerName="dnsmasq-dns" Dec 03 14:40:40 crc kubenswrapper[4751]: E1203 14:40:40.999511 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8481969b-2092-4b8c-9a57-9f83972d0997" containerName="init" Dec 03 14:40:40 crc kubenswrapper[4751]: I1203 14:40:40.999575 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="8481969b-2092-4b8c-9a57-9f83972d0997" containerName="init" Dec 03 14:40:40 crc kubenswrapper[4751]: E1203 14:40:40.999633 4751 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="35c811c6-3da1-42ec-a2e0-78afb7711252" containerName="init" Dec 03 14:40:40 crc kubenswrapper[4751]: I1203 14:40:40.999689 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="35c811c6-3da1-42ec-a2e0-78afb7711252" containerName="init" Dec 03 14:40:40 crc kubenswrapper[4751]: E1203 14:40:40.999747 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35c811c6-3da1-42ec-a2e0-78afb7711252" containerName="dnsmasq-dns" Dec 03 14:40:40 crc kubenswrapper[4751]: I1203 14:40:40.999795 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="35c811c6-3da1-42ec-a2e0-78afb7711252" containerName="dnsmasq-dns" Dec 03 14:40:40 crc kubenswrapper[4751]: E1203 14:40:40.999861 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acdd8764-947f-44ce-a5bd-4f3c139d581c" containerName="cloudkitty-db-sync" Dec 03 14:40:40 crc kubenswrapper[4751]: I1203 14:40:40.999922 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="acdd8764-947f-44ce-a5bd-4f3c139d581c" containerName="cloudkitty-db-sync" Dec 03 14:40:41 crc kubenswrapper[4751]: I1203 14:40:41.000165 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="35c811c6-3da1-42ec-a2e0-78afb7711252" containerName="dnsmasq-dns" Dec 03 14:40:41 crc kubenswrapper[4751]: I1203 14:40:41.000242 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="8481969b-2092-4b8c-9a57-9f83972d0997" containerName="dnsmasq-dns" Dec 03 14:40:41 crc kubenswrapper[4751]: I1203 14:40:41.000312 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="acdd8764-947f-44ce-a5bd-4f3c139d581c" containerName="cloudkitty-db-sync" Dec 03 14:40:41 crc kubenswrapper[4751]: I1203 14:40:41.001162 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-wh7hc" Dec 03 14:40:41 crc kubenswrapper[4751]: I1203 14:40:41.003435 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 14:40:41 crc kubenswrapper[4751]: I1203 14:40:41.011928 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-wh7hc"] Dec 03 14:40:41 crc kubenswrapper[4751]: I1203 14:40:41.158357 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/4c5f4b94-eb6c-4ad0-b12f-237f3d87396a-certs\") pod \"cloudkitty-storageinit-wh7hc\" (UID: \"4c5f4b94-eb6c-4ad0-b12f-237f3d87396a\") " pod="openstack/cloudkitty-storageinit-wh7hc" Dec 03 14:40:41 crc kubenswrapper[4751]: I1203 14:40:41.158484 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c5f4b94-eb6c-4ad0-b12f-237f3d87396a-combined-ca-bundle\") pod \"cloudkitty-storageinit-wh7hc\" (UID: \"4c5f4b94-eb6c-4ad0-b12f-237f3d87396a\") " pod="openstack/cloudkitty-storageinit-wh7hc" Dec 03 14:40:41 crc kubenswrapper[4751]: I1203 14:40:41.158518 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c5f4b94-eb6c-4ad0-b12f-237f3d87396a-scripts\") pod \"cloudkitty-storageinit-wh7hc\" (UID: \"4c5f4b94-eb6c-4ad0-b12f-237f3d87396a\") " pod="openstack/cloudkitty-storageinit-wh7hc" Dec 03 14:40:41 crc kubenswrapper[4751]: I1203 14:40:41.158556 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c5f4b94-eb6c-4ad0-b12f-237f3d87396a-config-data\") pod \"cloudkitty-storageinit-wh7hc\" (UID: \"4c5f4b94-eb6c-4ad0-b12f-237f3d87396a\") " pod="openstack/cloudkitty-storageinit-wh7hc" Dec 03 14:40:41 
crc kubenswrapper[4751]: I1203 14:40:41.158678 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v4kz\" (UniqueName: \"kubernetes.io/projected/4c5f4b94-eb6c-4ad0-b12f-237f3d87396a-kube-api-access-8v4kz\") pod \"cloudkitty-storageinit-wh7hc\" (UID: \"4c5f4b94-eb6c-4ad0-b12f-237f3d87396a\") " pod="openstack/cloudkitty-storageinit-wh7hc" Dec 03 14:40:41 crc kubenswrapper[4751]: I1203 14:40:41.261297 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c5f4b94-eb6c-4ad0-b12f-237f3d87396a-combined-ca-bundle\") pod \"cloudkitty-storageinit-wh7hc\" (UID: \"4c5f4b94-eb6c-4ad0-b12f-237f3d87396a\") " pod="openstack/cloudkitty-storageinit-wh7hc" Dec 03 14:40:41 crc kubenswrapper[4751]: I1203 14:40:41.261376 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v4kz\" (UniqueName: \"kubernetes.io/projected/4c5f4b94-eb6c-4ad0-b12f-237f3d87396a-kube-api-access-8v4kz\") pod \"cloudkitty-storageinit-wh7hc\" (UID: \"4c5f4b94-eb6c-4ad0-b12f-237f3d87396a\") " pod="openstack/cloudkitty-storageinit-wh7hc" Dec 03 14:40:41 crc kubenswrapper[4751]: I1203 14:40:41.261399 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c5f4b94-eb6c-4ad0-b12f-237f3d87396a-scripts\") pod \"cloudkitty-storageinit-wh7hc\" (UID: \"4c5f4b94-eb6c-4ad0-b12f-237f3d87396a\") " pod="openstack/cloudkitty-storageinit-wh7hc" Dec 03 14:40:41 crc kubenswrapper[4751]: I1203 14:40:41.261438 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c5f4b94-eb6c-4ad0-b12f-237f3d87396a-config-data\") pod \"cloudkitty-storageinit-wh7hc\" (UID: \"4c5f4b94-eb6c-4ad0-b12f-237f3d87396a\") " pod="openstack/cloudkitty-storageinit-wh7hc" Dec 03 14:40:41 crc kubenswrapper[4751]: 
I1203 14:40:41.261616 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/4c5f4b94-eb6c-4ad0-b12f-237f3d87396a-certs\") pod \"cloudkitty-storageinit-wh7hc\" (UID: \"4c5f4b94-eb6c-4ad0-b12f-237f3d87396a\") " pod="openstack/cloudkitty-storageinit-wh7hc" Dec 03 14:40:41 crc kubenswrapper[4751]: I1203 14:40:41.266648 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c5f4b94-eb6c-4ad0-b12f-237f3d87396a-scripts\") pod \"cloudkitty-storageinit-wh7hc\" (UID: \"4c5f4b94-eb6c-4ad0-b12f-237f3d87396a\") " pod="openstack/cloudkitty-storageinit-wh7hc" Dec 03 14:40:41 crc kubenswrapper[4751]: I1203 14:40:41.267001 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c5f4b94-eb6c-4ad0-b12f-237f3d87396a-combined-ca-bundle\") pod \"cloudkitty-storageinit-wh7hc\" (UID: \"4c5f4b94-eb6c-4ad0-b12f-237f3d87396a\") " pod="openstack/cloudkitty-storageinit-wh7hc" Dec 03 14:40:41 crc kubenswrapper[4751]: I1203 14:40:41.283946 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/4c5f4b94-eb6c-4ad0-b12f-237f3d87396a-certs\") pod \"cloudkitty-storageinit-wh7hc\" (UID: \"4c5f4b94-eb6c-4ad0-b12f-237f3d87396a\") " pod="openstack/cloudkitty-storageinit-wh7hc" Dec 03 14:40:41 crc kubenswrapper[4751]: I1203 14:40:41.284376 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c5f4b94-eb6c-4ad0-b12f-237f3d87396a-config-data\") pod \"cloudkitty-storageinit-wh7hc\" (UID: \"4c5f4b94-eb6c-4ad0-b12f-237f3d87396a\") " pod="openstack/cloudkitty-storageinit-wh7hc" Dec 03 14:40:41 crc kubenswrapper[4751]: I1203 14:40:41.287937 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v4kz\" (UniqueName: 
\"kubernetes.io/projected/4c5f4b94-eb6c-4ad0-b12f-237f3d87396a-kube-api-access-8v4kz\") pod \"cloudkitty-storageinit-wh7hc\" (UID: \"4c5f4b94-eb6c-4ad0-b12f-237f3d87396a\") " pod="openstack/cloudkitty-storageinit-wh7hc" Dec 03 14:40:41 crc kubenswrapper[4751]: I1203 14:40:41.325275 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35c811c6-3da1-42ec-a2e0-78afb7711252" path="/var/lib/kubelet/pods/35c811c6-3da1-42ec-a2e0-78afb7711252/volumes" Dec 03 14:40:41 crc kubenswrapper[4751]: I1203 14:40:41.326245 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8729b9a9-11fa-432d-bc45-09172fc6bbc7" path="/var/lib/kubelet/pods/8729b9a9-11fa-432d-bc45-09172fc6bbc7/volumes" Dec 03 14:40:41 crc kubenswrapper[4751]: I1203 14:40:41.333873 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-wh7hc" Dec 03 14:40:41 crc kubenswrapper[4751]: I1203 14:40:41.713602 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991","Type":"ContainerStarted","Data":"2d91066c11cbed6564b693a73e7f5a2d9d6cc97d9fe2616162ad8832a58832ed"} Dec 03 14:40:41 crc kubenswrapper[4751]: I1203 14:40:41.743553 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.957841423 podStartE2EDuration="54.743529906s" podCreationTimestamp="2025-12-03 14:39:47 +0000 UTC" firstStartedPulling="2025-12-03 14:39:47.924765233 +0000 UTC m=+1594.913120450" lastFinishedPulling="2025-12-03 14:40:40.710453716 +0000 UTC m=+1647.698808933" observedRunningTime="2025-12-03 14:40:41.7348373 +0000 UTC m=+1648.723192517" watchObservedRunningTime="2025-12-03 14:40:41.743529906 +0000 UTC m=+1648.731885123" Dec 03 14:40:41 crc kubenswrapper[4751]: I1203 14:40:41.854485 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-wh7hc"] Dec 03 14:40:42 crc 
kubenswrapper[4751]: I1203 14:40:42.725104 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-wh7hc" event={"ID":"4c5f4b94-eb6c-4ad0-b12f-237f3d87396a","Type":"ContainerStarted","Data":"5862a570a9ea02901bac005c94807dcf0903e1f886130ea96b24dfe06cedf64f"} Dec 03 14:40:42 crc kubenswrapper[4751]: I1203 14:40:42.725471 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-wh7hc" event={"ID":"4c5f4b94-eb6c-4ad0-b12f-237f3d87396a","Type":"ContainerStarted","Data":"10e8af557ff3d5fe71b3d535cdff383fbbf3f28dec92045477a47807a188d648"} Dec 03 14:40:42 crc kubenswrapper[4751]: I1203 14:40:42.746858 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-wh7hc" podStartSLOduration=2.746834908 podStartE2EDuration="2.746834908s" podCreationTimestamp="2025-12-03 14:40:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:40:42.738860141 +0000 UTC m=+1649.727215358" watchObservedRunningTime="2025-12-03 14:40:42.746834908 +0000 UTC m=+1649.735190135" Dec 03 14:40:44 crc kubenswrapper[4751]: I1203 14:40:44.746625 4751 generic.go:334] "Generic (PLEG): container finished" podID="4c5f4b94-eb6c-4ad0-b12f-237f3d87396a" containerID="5862a570a9ea02901bac005c94807dcf0903e1f886130ea96b24dfe06cedf64f" exitCode=0 Dec 03 14:40:44 crc kubenswrapper[4751]: I1203 14:40:44.746702 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-wh7hc" event={"ID":"4c5f4b94-eb6c-4ad0-b12f-237f3d87396a","Type":"ContainerDied","Data":"5862a570a9ea02901bac005c94807dcf0903e1f886130ea96b24dfe06cedf64f"} Dec 03 14:40:46 crc kubenswrapper[4751]: I1203 14:40:46.321770 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-wh7hc" Dec 03 14:40:46 crc kubenswrapper[4751]: I1203 14:40:46.476904 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c5f4b94-eb6c-4ad0-b12f-237f3d87396a-scripts\") pod \"4c5f4b94-eb6c-4ad0-b12f-237f3d87396a\" (UID: \"4c5f4b94-eb6c-4ad0-b12f-237f3d87396a\") " Dec 03 14:40:46 crc kubenswrapper[4751]: I1203 14:40:46.477108 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v4kz\" (UniqueName: \"kubernetes.io/projected/4c5f4b94-eb6c-4ad0-b12f-237f3d87396a-kube-api-access-8v4kz\") pod \"4c5f4b94-eb6c-4ad0-b12f-237f3d87396a\" (UID: \"4c5f4b94-eb6c-4ad0-b12f-237f3d87396a\") " Dec 03 14:40:46 crc kubenswrapper[4751]: I1203 14:40:46.477452 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c5f4b94-eb6c-4ad0-b12f-237f3d87396a-combined-ca-bundle\") pod \"4c5f4b94-eb6c-4ad0-b12f-237f3d87396a\" (UID: \"4c5f4b94-eb6c-4ad0-b12f-237f3d87396a\") " Dec 03 14:40:46 crc kubenswrapper[4751]: I1203 14:40:46.477727 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/4c5f4b94-eb6c-4ad0-b12f-237f3d87396a-certs\") pod \"4c5f4b94-eb6c-4ad0-b12f-237f3d87396a\" (UID: \"4c5f4b94-eb6c-4ad0-b12f-237f3d87396a\") " Dec 03 14:40:46 crc kubenswrapper[4751]: I1203 14:40:46.477814 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c5f4b94-eb6c-4ad0-b12f-237f3d87396a-config-data\") pod \"4c5f4b94-eb6c-4ad0-b12f-237f3d87396a\" (UID: \"4c5f4b94-eb6c-4ad0-b12f-237f3d87396a\") " Dec 03 14:40:46 crc kubenswrapper[4751]: I1203 14:40:46.489564 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/4c5f4b94-eb6c-4ad0-b12f-237f3d87396a-kube-api-access-8v4kz" (OuterVolumeSpecName: "kube-api-access-8v4kz") pod "4c5f4b94-eb6c-4ad0-b12f-237f3d87396a" (UID: "4c5f4b94-eb6c-4ad0-b12f-237f3d87396a"). InnerVolumeSpecName "kube-api-access-8v4kz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:40:46 crc kubenswrapper[4751]: I1203 14:40:46.490277 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c5f4b94-eb6c-4ad0-b12f-237f3d87396a-certs" (OuterVolumeSpecName: "certs") pod "4c5f4b94-eb6c-4ad0-b12f-237f3d87396a" (UID: "4c5f4b94-eb6c-4ad0-b12f-237f3d87396a"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:40:46 crc kubenswrapper[4751]: I1203 14:40:46.490434 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c5f4b94-eb6c-4ad0-b12f-237f3d87396a-scripts" (OuterVolumeSpecName: "scripts") pod "4c5f4b94-eb6c-4ad0-b12f-237f3d87396a" (UID: "4c5f4b94-eb6c-4ad0-b12f-237f3d87396a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:40:46 crc kubenswrapper[4751]: I1203 14:40:46.512400 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c5f4b94-eb6c-4ad0-b12f-237f3d87396a-config-data" (OuterVolumeSpecName: "config-data") pod "4c5f4b94-eb6c-4ad0-b12f-237f3d87396a" (UID: "4c5f4b94-eb6c-4ad0-b12f-237f3d87396a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:40:46 crc kubenswrapper[4751]: I1203 14:40:46.517828 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c5f4b94-eb6c-4ad0-b12f-237f3d87396a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c5f4b94-eb6c-4ad0-b12f-237f3d87396a" (UID: "4c5f4b94-eb6c-4ad0-b12f-237f3d87396a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:40:46 crc kubenswrapper[4751]: I1203 14:40:46.581181 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v4kz\" (UniqueName: \"kubernetes.io/projected/4c5f4b94-eb6c-4ad0-b12f-237f3d87396a-kube-api-access-8v4kz\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:46 crc kubenswrapper[4751]: I1203 14:40:46.581230 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c5f4b94-eb6c-4ad0-b12f-237f3d87396a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:46 crc kubenswrapper[4751]: I1203 14:40:46.581241 4751 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/4c5f4b94-eb6c-4ad0-b12f-237f3d87396a-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:46 crc kubenswrapper[4751]: I1203 14:40:46.581251 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c5f4b94-eb6c-4ad0-b12f-237f3d87396a-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:46 crc kubenswrapper[4751]: I1203 14:40:46.581265 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c5f4b94-eb6c-4ad0-b12f-237f3d87396a-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:46 crc kubenswrapper[4751]: I1203 14:40:46.773870 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-wh7hc" event={"ID":"4c5f4b94-eb6c-4ad0-b12f-237f3d87396a","Type":"ContainerDied","Data":"10e8af557ff3d5fe71b3d535cdff383fbbf3f28dec92045477a47807a188d648"} Dec 03 14:40:46 crc kubenswrapper[4751]: I1203 14:40:46.773932 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10e8af557ff3d5fe71b3d535cdff383fbbf3f28dec92045477a47807a188d648" Dec 03 14:40:46 crc kubenswrapper[4751]: I1203 14:40:46.773928 4751 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-wh7hc" Dec 03 14:40:46 crc kubenswrapper[4751]: I1203 14:40:46.884465 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Dec 03 14:40:46 crc kubenswrapper[4751]: I1203 14:40:46.884704 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="14dda6a6-b4ca-4864-9e15-4eda9a8cb73d" containerName="cloudkitty-proc" containerID="cri-o://9902f8a9bdb54179cac523b0ab0c5b1136aa809f84825250647d3c33bdeeb0a7" gracePeriod=30 Dec 03 14:40:46 crc kubenswrapper[4751]: I1203 14:40:46.904805 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Dec 03 14:40:46 crc kubenswrapper[4751]: I1203 14:40:46.905050 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="52f84a98-0911-42a1-a4f3-7858eb75ea86" containerName="cloudkitty-api-log" containerID="cri-o://b74b9c4518fb2187117167de7950c84efc7879518ae211eaa67f28da8641e938" gracePeriod=30 Dec 03 14:40:46 crc kubenswrapper[4751]: I1203 14:40:46.905094 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="52f84a98-0911-42a1-a4f3-7858eb75ea86" containerName="cloudkitty-api" containerID="cri-o://8b5361f1714a7980ef279045418b5f4a79da87eb1b65a2d506a37c93cb5d7959" gracePeriod=30 Dec 03 14:40:47 crc kubenswrapper[4751]: I1203 14:40:47.786587 4751 generic.go:334] "Generic (PLEG): container finished" podID="14dda6a6-b4ca-4864-9e15-4eda9a8cb73d" containerID="9902f8a9bdb54179cac523b0ab0c5b1136aa809f84825250647d3c33bdeeb0a7" exitCode=0 Dec 03 14:40:47 crc kubenswrapper[4751]: I1203 14:40:47.787572 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" 
event={"ID":"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d","Type":"ContainerDied","Data":"9902f8a9bdb54179cac523b0ab0c5b1136aa809f84825250647d3c33bdeeb0a7"} Dec 03 14:40:47 crc kubenswrapper[4751]: I1203 14:40:47.791298 4751 generic.go:334] "Generic (PLEG): container finished" podID="52f84a98-0911-42a1-a4f3-7858eb75ea86" containerID="b74b9c4518fb2187117167de7950c84efc7879518ae211eaa67f28da8641e938" exitCode=143 Dec 03 14:40:47 crc kubenswrapper[4751]: I1203 14:40:47.791357 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"52f84a98-0911-42a1-a4f3-7858eb75ea86","Type":"ContainerDied","Data":"b74b9c4518fb2187117167de7950c84efc7879518ae211eaa67f28da8641e938"} Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.298148 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.435970 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-scripts\") pod \"52f84a98-0911-42a1-a4f3-7858eb75ea86\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.436032 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntzlv\" (UniqueName: \"kubernetes.io/projected/52f84a98-0911-42a1-a4f3-7858eb75ea86-kube-api-access-ntzlv\") pod \"52f84a98-0911-42a1-a4f3-7858eb75ea86\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.436074 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-combined-ca-bundle\") pod \"52f84a98-0911-42a1-a4f3-7858eb75ea86\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 
14:40:48.436153 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52f84a98-0911-42a1-a4f3-7858eb75ea86-logs\") pod \"52f84a98-0911-42a1-a4f3-7858eb75ea86\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.436211 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/52f84a98-0911-42a1-a4f3-7858eb75ea86-certs\") pod \"52f84a98-0911-42a1-a4f3-7858eb75ea86\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.436264 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-public-tls-certs\") pod \"52f84a98-0911-42a1-a4f3-7858eb75ea86\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.436287 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-config-data\") pod \"52f84a98-0911-42a1-a4f3-7858eb75ea86\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.436313 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-config-data-custom\") pod \"52f84a98-0911-42a1-a4f3-7858eb75ea86\" (UID: \"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.436566 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-internal-tls-certs\") pod \"52f84a98-0911-42a1-a4f3-7858eb75ea86\" (UID: 
\"52f84a98-0911-42a1-a4f3-7858eb75ea86\") " Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.436650 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52f84a98-0911-42a1-a4f3-7858eb75ea86-logs" (OuterVolumeSpecName: "logs") pod "52f84a98-0911-42a1-a4f3-7858eb75ea86" (UID: "52f84a98-0911-42a1-a4f3-7858eb75ea86"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.437093 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52f84a98-0911-42a1-a4f3-7858eb75ea86-logs\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.442571 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "52f84a98-0911-42a1-a4f3-7858eb75ea86" (UID: "52f84a98-0911-42a1-a4f3-7858eb75ea86"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.442874 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-scripts" (OuterVolumeSpecName: "scripts") pod "52f84a98-0911-42a1-a4f3-7858eb75ea86" (UID: "52f84a98-0911-42a1-a4f3-7858eb75ea86"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.442932 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52f84a98-0911-42a1-a4f3-7858eb75ea86-kube-api-access-ntzlv" (OuterVolumeSpecName: "kube-api-access-ntzlv") pod "52f84a98-0911-42a1-a4f3-7858eb75ea86" (UID: "52f84a98-0911-42a1-a4f3-7858eb75ea86"). InnerVolumeSpecName "kube-api-access-ntzlv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.444268 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52f84a98-0911-42a1-a4f3-7858eb75ea86-certs" (OuterVolumeSpecName: "certs") pod "52f84a98-0911-42a1-a4f3-7858eb75ea86" (UID: "52f84a98-0911-42a1-a4f3-7858eb75ea86"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.473461 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-config-data" (OuterVolumeSpecName: "config-data") pod "52f84a98-0911-42a1-a4f3-7858eb75ea86" (UID: "52f84a98-0911-42a1-a4f3-7858eb75ea86"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.476232 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52f84a98-0911-42a1-a4f3-7858eb75ea86" (UID: "52f84a98-0911-42a1-a4f3-7858eb75ea86"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.514222 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "52f84a98-0911-42a1-a4f3-7858eb75ea86" (UID: "52f84a98-0911-42a1-a4f3-7858eb75ea86"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.517503 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "52f84a98-0911-42a1-a4f3-7858eb75ea86" (UID: "52f84a98-0911-42a1-a4f3-7858eb75ea86"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.538684 4751 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.538724 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.538736 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntzlv\" (UniqueName: \"kubernetes.io/projected/52f84a98-0911-42a1-a4f3-7858eb75ea86-kube-api-access-ntzlv\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.538747 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.538756 4751 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/52f84a98-0911-42a1-a4f3-7858eb75ea86-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.538765 4751 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.538774 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.538783 4751 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52f84a98-0911-42a1-a4f3-7858eb75ea86-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.621106 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.742216 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-scripts\") pod \"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d\" (UID: \"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d\") " Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.742320 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-certs\") pod \"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d\" (UID: \"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d\") " Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.742442 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-config-data-custom\") pod \"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d\" (UID: \"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d\") " Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.742461 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-combined-ca-bundle\") pod \"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d\" (UID: \"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d\") " Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.742477 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-config-data\") pod \"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d\" (UID: \"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d\") " Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.742568 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cg7kj\" (UniqueName: \"kubernetes.io/projected/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-kube-api-access-cg7kj\") pod \"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d\" (UID: \"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d\") " Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.747221 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-kube-api-access-cg7kj" (OuterVolumeSpecName: "kube-api-access-cg7kj") pod "14dda6a6-b4ca-4864-9e15-4eda9a8cb73d" (UID: "14dda6a6-b4ca-4864-9e15-4eda9a8cb73d"). InnerVolumeSpecName "kube-api-access-cg7kj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.747498 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "14dda6a6-b4ca-4864-9e15-4eda9a8cb73d" (UID: "14dda6a6-b4ca-4864-9e15-4eda9a8cb73d"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.748001 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-scripts" (OuterVolumeSpecName: "scripts") pod "14dda6a6-b4ca-4864-9e15-4eda9a8cb73d" (UID: "14dda6a6-b4ca-4864-9e15-4eda9a8cb73d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.763976 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-certs" (OuterVolumeSpecName: "certs") pod "14dda6a6-b4ca-4864-9e15-4eda9a8cb73d" (UID: "14dda6a6-b4ca-4864-9e15-4eda9a8cb73d"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.780139 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14dda6a6-b4ca-4864-9e15-4eda9a8cb73d" (UID: "14dda6a6-b4ca-4864-9e15-4eda9a8cb73d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.782440 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-config-data" (OuterVolumeSpecName: "config-data") pod "14dda6a6-b4ca-4864-9e15-4eda9a8cb73d" (UID: "14dda6a6-b4ca-4864-9e15-4eda9a8cb73d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.817642 4751 generic.go:334] "Generic (PLEG): container finished" podID="52f84a98-0911-42a1-a4f3-7858eb75ea86" containerID="8b5361f1714a7980ef279045418b5f4a79da87eb1b65a2d506a37c93cb5d7959" exitCode=0 Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.817736 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.817743 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"52f84a98-0911-42a1-a4f3-7858eb75ea86","Type":"ContainerDied","Data":"8b5361f1714a7980ef279045418b5f4a79da87eb1b65a2d506a37c93cb5d7959"} Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.818278 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"52f84a98-0911-42a1-a4f3-7858eb75ea86","Type":"ContainerDied","Data":"43883af2bb3344d8f1940d8bc373e57dbf99d3f44fcfbebc547d522cc19bfb95"} Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.818306 4751 scope.go:117] "RemoveContainer" containerID="8b5361f1714a7980ef279045418b5f4a79da87eb1b65a2d506a37c93cb5d7959" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.823653 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"14dda6a6-b4ca-4864-9e15-4eda9a8cb73d","Type":"ContainerDied","Data":"95c6e6d638153f5d5ddedd3c60eee77f1daa3c1f711393ca7777ae2a2a238be6"} Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.823702 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.845618 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cg7kj\" (UniqueName: \"kubernetes.io/projected/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-kube-api-access-cg7kj\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.845654 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.845666 4751 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.845678 4751 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.845690 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.845700 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.889564 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.896895 4751 scope.go:117] "RemoveContainer" containerID="b74b9c4518fb2187117167de7950c84efc7879518ae211eaa67f28da8641e938" Dec 03 14:40:48 crc 
kubenswrapper[4751]: I1203 14:40:48.926835 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"] Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.947788 4751 scope.go:117] "RemoveContainer" containerID="8b5361f1714a7980ef279045418b5f4a79da87eb1b65a2d506a37c93cb5d7959" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.947910 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Dec 03 14:40:48 crc kubenswrapper[4751]: E1203 14:40:48.953998 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b5361f1714a7980ef279045418b5f4a79da87eb1b65a2d506a37c93cb5d7959\": container with ID starting with 8b5361f1714a7980ef279045418b5f4a79da87eb1b65a2d506a37c93cb5d7959 not found: ID does not exist" containerID="8b5361f1714a7980ef279045418b5f4a79da87eb1b65a2d506a37c93cb5d7959" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.954084 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b5361f1714a7980ef279045418b5f4a79da87eb1b65a2d506a37c93cb5d7959"} err="failed to get container status \"8b5361f1714a7980ef279045418b5f4a79da87eb1b65a2d506a37c93cb5d7959\": rpc error: code = NotFound desc = could not find container \"8b5361f1714a7980ef279045418b5f4a79da87eb1b65a2d506a37c93cb5d7959\": container with ID starting with 8b5361f1714a7980ef279045418b5f4a79da87eb1b65a2d506a37c93cb5d7959 not found: ID does not exist" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.954134 4751 scope.go:117] "RemoveContainer" containerID="b74b9c4518fb2187117167de7950c84efc7879518ae211eaa67f28da8641e938" Dec 03 14:40:48 crc kubenswrapper[4751]: E1203 14:40:48.958126 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b74b9c4518fb2187117167de7950c84efc7879518ae211eaa67f28da8641e938\": container with ID starting with 
b74b9c4518fb2187117167de7950c84efc7879518ae211eaa67f28da8641e938 not found: ID does not exist" containerID="b74b9c4518fb2187117167de7950c84efc7879518ae211eaa67f28da8641e938" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.958180 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b74b9c4518fb2187117167de7950c84efc7879518ae211eaa67f28da8641e938"} err="failed to get container status \"b74b9c4518fb2187117167de7950c84efc7879518ae211eaa67f28da8641e938\": rpc error: code = NotFound desc = could not find container \"b74b9c4518fb2187117167de7950c84efc7879518ae211eaa67f28da8641e938\": container with ID starting with b74b9c4518fb2187117167de7950c84efc7879518ae211eaa67f28da8641e938 not found: ID does not exist" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.958222 4751 scope.go:117] "RemoveContainer" containerID="9902f8a9bdb54179cac523b0ab0c5b1136aa809f84825250647d3c33bdeeb0a7" Dec 03 14:40:48 crc kubenswrapper[4751]: I1203 14:40:48.980251 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"] Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.014682 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Dec 03 14:40:49 crc kubenswrapper[4751]: E1203 14:40:49.016182 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c5f4b94-eb6c-4ad0-b12f-237f3d87396a" containerName="cloudkitty-storageinit" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.016214 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c5f4b94-eb6c-4ad0-b12f-237f3d87396a" containerName="cloudkitty-storageinit" Dec 03 14:40:49 crc kubenswrapper[4751]: E1203 14:40:49.016227 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14dda6a6-b4ca-4864-9e15-4eda9a8cb73d" containerName="cloudkitty-proc" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.016237 4751 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="14dda6a6-b4ca-4864-9e15-4eda9a8cb73d" containerName="cloudkitty-proc" Dec 03 14:40:49 crc kubenswrapper[4751]: E1203 14:40:49.017316 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52f84a98-0911-42a1-a4f3-7858eb75ea86" containerName="cloudkitty-api" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.017346 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="52f84a98-0911-42a1-a4f3-7858eb75ea86" containerName="cloudkitty-api" Dec 03 14:40:49 crc kubenswrapper[4751]: E1203 14:40:49.017368 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52f84a98-0911-42a1-a4f3-7858eb75ea86" containerName="cloudkitty-api-log" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.017376 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="52f84a98-0911-42a1-a4f3-7858eb75ea86" containerName="cloudkitty-api-log" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.019014 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="52f84a98-0911-42a1-a4f3-7858eb75ea86" containerName="cloudkitty-api" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.019050 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="14dda6a6-b4ca-4864-9e15-4eda9a8cb73d" containerName="cloudkitty-proc" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.019065 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="52f84a98-0911-42a1-a4f3-7858eb75ea86" containerName="cloudkitty-api-log" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.019089 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c5f4b94-eb6c-4ad0-b12f-237f3d87396a" containerName="cloudkitty-storageinit" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.022340 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.027349 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.027595 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-mdxtz" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.027695 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.030128 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.030132 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-internal-svc" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.031387 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.032038 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.045781 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.059224 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.071157 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.081664 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.125389 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.174803 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a-config-data\") pod \"cloudkitty-proc-0\" (UID: \"9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.174863 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5b84593-a4d7-4b1c-843a-feb9273afbf4-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"b5b84593-a4d7-4b1c-843a-feb9273afbf4\") " pod="openstack/cloudkitty-api-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.174894 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b5b84593-a4d7-4b1c-843a-feb9273afbf4-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"b5b84593-a4d7-4b1c-843a-feb9273afbf4\") " pod="openstack/cloudkitty-api-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.174921 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5b84593-a4d7-4b1c-843a-feb9273afbf4-combined-ca-bundle\") pod \"cloudkitty-api-0\" 
(UID: \"b5b84593-a4d7-4b1c-843a-feb9273afbf4\") " pod="openstack/cloudkitty-api-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.174941 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5b84593-a4d7-4b1c-843a-feb9273afbf4-config-data\") pod \"cloudkitty-api-0\" (UID: \"b5b84593-a4d7-4b1c-843a-feb9273afbf4\") " pod="openstack/cloudkitty-api-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.174974 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5b84593-a4d7-4b1c-843a-feb9273afbf4-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"b5b84593-a4d7-4b1c-843a-feb9273afbf4\") " pod="openstack/cloudkitty-api-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.174987 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.175050 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67g9n\" (UniqueName: \"kubernetes.io/projected/9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a-kube-api-access-67g9n\") pod \"cloudkitty-proc-0\" (UID: \"9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.175074 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5b84593-a4d7-4b1c-843a-feb9273afbf4-logs\") pod \"cloudkitty-api-0\" (UID: \"b5b84593-a4d7-4b1c-843a-feb9273afbf4\") " pod="openstack/cloudkitty-api-0" 
Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.175094 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/b5b84593-a4d7-4b1c-843a-feb9273afbf4-certs\") pod \"cloudkitty-api-0\" (UID: \"b5b84593-a4d7-4b1c-843a-feb9273afbf4\") " pod="openstack/cloudkitty-api-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.175113 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.175133 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5b84593-a4d7-4b1c-843a-feb9273afbf4-scripts\") pod \"cloudkitty-api-0\" (UID: \"b5b84593-a4d7-4b1c-843a-feb9273afbf4\") " pod="openstack/cloudkitty-api-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.175158 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a-scripts\") pod \"cloudkitty-proc-0\" (UID: \"9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.175177 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqt2k\" (UniqueName: \"kubernetes.io/projected/b5b84593-a4d7-4b1c-843a-feb9273afbf4-kube-api-access-hqt2k\") pod \"cloudkitty-api-0\" (UID: \"b5b84593-a4d7-4b1c-843a-feb9273afbf4\") " pod="openstack/cloudkitty-api-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.175213 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a-certs\") pod \"cloudkitty-proc-0\" (UID: \"9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.277172 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5b84593-a4d7-4b1c-843a-feb9273afbf4-scripts\") pod \"cloudkitty-api-0\" (UID: \"b5b84593-a4d7-4b1c-843a-feb9273afbf4\") " pod="openstack/cloudkitty-api-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.277545 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a-scripts\") pod \"cloudkitty-proc-0\" (UID: \"9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.277664 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqt2k\" (UniqueName: \"kubernetes.io/projected/b5b84593-a4d7-4b1c-843a-feb9273afbf4-kube-api-access-hqt2k\") pod \"cloudkitty-api-0\" (UID: \"b5b84593-a4d7-4b1c-843a-feb9273afbf4\") " pod="openstack/cloudkitty-api-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.277802 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a-certs\") pod \"cloudkitty-proc-0\" (UID: \"9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.277938 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a-config-data\") pod \"cloudkitty-proc-0\" (UID: 
\"9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.278103 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5b84593-a4d7-4b1c-843a-feb9273afbf4-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"b5b84593-a4d7-4b1c-843a-feb9273afbf4\") " pod="openstack/cloudkitty-api-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.278585 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b5b84593-a4d7-4b1c-843a-feb9273afbf4-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"b5b84593-a4d7-4b1c-843a-feb9273afbf4\") " pod="openstack/cloudkitty-api-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.278698 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5b84593-a4d7-4b1c-843a-feb9273afbf4-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"b5b84593-a4d7-4b1c-843a-feb9273afbf4\") " pod="openstack/cloudkitty-api-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.278778 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5b84593-a4d7-4b1c-843a-feb9273afbf4-config-data\") pod \"cloudkitty-api-0\" (UID: \"b5b84593-a4d7-4b1c-843a-feb9273afbf4\") " pod="openstack/cloudkitty-api-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.278917 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5b84593-a4d7-4b1c-843a-feb9273afbf4-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"b5b84593-a4d7-4b1c-843a-feb9273afbf4\") " pod="openstack/cloudkitty-api-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.279033 4751 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.279255 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67g9n\" (UniqueName: \"kubernetes.io/projected/9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a-kube-api-access-67g9n\") pod \"cloudkitty-proc-0\" (UID: \"9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.279404 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5b84593-a4d7-4b1c-843a-feb9273afbf4-logs\") pod \"cloudkitty-api-0\" (UID: \"b5b84593-a4d7-4b1c-843a-feb9273afbf4\") " pod="openstack/cloudkitty-api-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.279506 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/b5b84593-a4d7-4b1c-843a-feb9273afbf4-certs\") pod \"cloudkitty-api-0\" (UID: \"b5b84593-a4d7-4b1c-843a-feb9273afbf4\") " pod="openstack/cloudkitty-api-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.279602 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.281981 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5b84593-a4d7-4b1c-843a-feb9273afbf4-logs\") pod 
\"cloudkitty-api-0\" (UID: \"b5b84593-a4d7-4b1c-843a-feb9273afbf4\") " pod="openstack/cloudkitty-api-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.282681 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5b84593-a4d7-4b1c-843a-feb9273afbf4-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"b5b84593-a4d7-4b1c-843a-feb9273afbf4\") " pod="openstack/cloudkitty-api-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.283851 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b5b84593-a4d7-4b1c-843a-feb9273afbf4-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"b5b84593-a4d7-4b1c-843a-feb9273afbf4\") " pod="openstack/cloudkitty-api-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.284640 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a-scripts\") pod \"cloudkitty-proc-0\" (UID: \"9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.284686 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.284998 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5b84593-a4d7-4b1c-843a-feb9273afbf4-scripts\") pod \"cloudkitty-api-0\" (UID: \"b5b84593-a4d7-4b1c-843a-feb9273afbf4\") " pod="openstack/cloudkitty-api-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.285551 4751 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"certs\" (UniqueName: \"kubernetes.io/projected/9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a-certs\") pod \"cloudkitty-proc-0\" (UID: \"9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.286616 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5b84593-a4d7-4b1c-843a-feb9273afbf4-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"b5b84593-a4d7-4b1c-843a-feb9273afbf4\") " pod="openstack/cloudkitty-api-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.287720 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5b84593-a4d7-4b1c-843a-feb9273afbf4-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"b5b84593-a4d7-4b1c-843a-feb9273afbf4\") " pod="openstack/cloudkitty-api-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.288495 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/b5b84593-a4d7-4b1c-843a-feb9273afbf4-certs\") pod \"cloudkitty-api-0\" (UID: \"b5b84593-a4d7-4b1c-843a-feb9273afbf4\") " pod="openstack/cloudkitty-api-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.290872 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.293584 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a-config-data\") pod \"cloudkitty-proc-0\" (UID: \"9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a\") " pod="openstack/cloudkitty-proc-0" Dec 03 
14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.298943 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67g9n\" (UniqueName: \"kubernetes.io/projected/9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a-kube-api-access-67g9n\") pod \"cloudkitty-proc-0\" (UID: \"9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a\") " pod="openstack/cloudkitty-proc-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.301670 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqt2k\" (UniqueName: \"kubernetes.io/projected/b5b84593-a4d7-4b1c-843a-feb9273afbf4-kube-api-access-hqt2k\") pod \"cloudkitty-api-0\" (UID: \"b5b84593-a4d7-4b1c-843a-feb9273afbf4\") " pod="openstack/cloudkitty-api-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.308912 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5b84593-a4d7-4b1c-843a-feb9273afbf4-config-data\") pod \"cloudkitty-api-0\" (UID: \"b5b84593-a4d7-4b1c-843a-feb9273afbf4\") " pod="openstack/cloudkitty-api-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.332478 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14dda6a6-b4ca-4864-9e15-4eda9a8cb73d" path="/var/lib/kubelet/pods/14dda6a6-b4ca-4864-9e15-4eda9a8cb73d/volumes" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.333407 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52f84a98-0911-42a1-a4f3-7858eb75ea86" path="/var/lib/kubelet/pods/52f84a98-0911-42a1-a4f3-7858eb75ea86/volumes" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.373616 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.393873 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Dec 03 14:40:49 crc kubenswrapper[4751]: I1203 14:40:49.922002 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Dec 03 14:40:50 crc kubenswrapper[4751]: I1203 14:40:50.011823 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Dec 03 14:40:50 crc kubenswrapper[4751]: W1203 14:40:50.014880 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ae01ad1_cb7e_4bad_9505_57b4f85d7d3a.slice/crio-34b58fc516744fa3e96da64663a04c81cf7856936f9f51c00282b774963b9e9c WatchSource:0}: Error finding container 34b58fc516744fa3e96da64663a04c81cf7856936f9f51c00282b774963b9e9c: Status 404 returned error can't find the container with id 34b58fc516744fa3e96da64663a04c81cf7856936f9f51c00282b774963b9e9c Dec 03 14:40:50 crc kubenswrapper[4751]: I1203 14:40:50.901880 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"b5b84593-a4d7-4b1c-843a-feb9273afbf4","Type":"ContainerStarted","Data":"db06b880da9bab9e427b6808164135cc6d1b03a8f42261c0c606ceb3549dc97e"} Dec 03 14:40:50 crc kubenswrapper[4751]: I1203 14:40:50.901935 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"b5b84593-a4d7-4b1c-843a-feb9273afbf4","Type":"ContainerStarted","Data":"deac28ba4d0c31f4f7583a5e6d41b620675647bcef5846cd4e0dcd5c6d852b91"} Dec 03 14:40:50 crc kubenswrapper[4751]: I1203 14:40:50.901945 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"b5b84593-a4d7-4b1c-843a-feb9273afbf4","Type":"ContainerStarted","Data":"be89776ffabbf2045bff483580a05ae5f865b6facd30efbbd032431b0b524ded"} Dec 03 14:40:50 crc kubenswrapper[4751]: I1203 14:40:50.901985 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Dec 03 14:40:50 crc 
kubenswrapper[4751]: I1203 14:40:50.905978 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a","Type":"ContainerStarted","Data":"34b58fc516744fa3e96da64663a04c81cf7856936f9f51c00282b774963b9e9c"} Dec 03 14:40:50 crc kubenswrapper[4751]: I1203 14:40:50.931396 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=2.931374857 podStartE2EDuration="2.931374857s" podCreationTimestamp="2025-12-03 14:40:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:40:50.921753405 +0000 UTC m=+1657.910108622" watchObservedRunningTime="2025-12-03 14:40:50.931374857 +0000 UTC m=+1657.919730074" Dec 03 14:40:51 crc kubenswrapper[4751]: I1203 14:40:51.917947 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a","Type":"ContainerStarted","Data":"7198d8c21a22984a0f89e499fd573ee735c2a23c2fffe79946b1ac159647d70e"} Dec 03 14:40:51 crc kubenswrapper[4751]: I1203 14:40:51.943054 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.924267234 podStartE2EDuration="3.943035456s" podCreationTimestamp="2025-12-03 14:40:48 +0000 UTC" firstStartedPulling="2025-12-03 14:40:50.01810878 +0000 UTC m=+1657.006463997" lastFinishedPulling="2025-12-03 14:40:51.036877002 +0000 UTC m=+1658.025232219" observedRunningTime="2025-12-03 14:40:51.93472476 +0000 UTC m=+1658.923079977" watchObservedRunningTime="2025-12-03 14:40:51.943035456 +0000 UTC m=+1658.931390673" Dec 03 14:40:52 crc kubenswrapper[4751]: I1203 14:40:52.118053 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt"] Dec 03 14:40:52 crc kubenswrapper[4751]: I1203 
14:40:52.119854 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt" Dec 03 14:40:52 crc kubenswrapper[4751]: I1203 14:40:52.137707 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 14:40:52 crc kubenswrapper[4751]: I1203 14:40:52.137848 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xf9cf" Dec 03 14:40:52 crc kubenswrapper[4751]: I1203 14:40:52.142579 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 14:40:52 crc kubenswrapper[4751]: I1203 14:40:52.143922 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 14:40:52 crc kubenswrapper[4751]: I1203 14:40:52.155629 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt"] Dec 03 14:40:52 crc kubenswrapper[4751]: I1203 14:40:52.243658 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f38ae118-11a0-4c72-9d0b-750762779ee7-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt\" (UID: \"f38ae118-11a0-4c72-9d0b-750762779ee7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt" Dec 03 14:40:52 crc kubenswrapper[4751]: I1203 14:40:52.243716 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5wkw\" (UniqueName: \"kubernetes.io/projected/f38ae118-11a0-4c72-9d0b-750762779ee7-kube-api-access-s5wkw\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt\" (UID: \"f38ae118-11a0-4c72-9d0b-750762779ee7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt" Dec 03 14:40:52 
crc kubenswrapper[4751]: I1203 14:40:52.243739 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f38ae118-11a0-4c72-9d0b-750762779ee7-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt\" (UID: \"f38ae118-11a0-4c72-9d0b-750762779ee7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt" Dec 03 14:40:52 crc kubenswrapper[4751]: I1203 14:40:52.243855 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f38ae118-11a0-4c72-9d0b-750762779ee7-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt\" (UID: \"f38ae118-11a0-4c72-9d0b-750762779ee7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt" Dec 03 14:40:52 crc kubenswrapper[4751]: I1203 14:40:52.346206 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f38ae118-11a0-4c72-9d0b-750762779ee7-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt\" (UID: \"f38ae118-11a0-4c72-9d0b-750762779ee7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt" Dec 03 14:40:52 crc kubenswrapper[4751]: I1203 14:40:52.346388 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f38ae118-11a0-4c72-9d0b-750762779ee7-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt\" (UID: \"f38ae118-11a0-4c72-9d0b-750762779ee7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt" Dec 03 14:40:52 crc kubenswrapper[4751]: I1203 14:40:52.346422 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5wkw\" (UniqueName: 
\"kubernetes.io/projected/f38ae118-11a0-4c72-9d0b-750762779ee7-kube-api-access-s5wkw\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt\" (UID: \"f38ae118-11a0-4c72-9d0b-750762779ee7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt" Dec 03 14:40:52 crc kubenswrapper[4751]: I1203 14:40:52.346444 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f38ae118-11a0-4c72-9d0b-750762779ee7-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt\" (UID: \"f38ae118-11a0-4c72-9d0b-750762779ee7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt" Dec 03 14:40:52 crc kubenswrapper[4751]: I1203 14:40:52.354201 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f38ae118-11a0-4c72-9d0b-750762779ee7-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt\" (UID: \"f38ae118-11a0-4c72-9d0b-750762779ee7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt" Dec 03 14:40:52 crc kubenswrapper[4751]: I1203 14:40:52.354361 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f38ae118-11a0-4c72-9d0b-750762779ee7-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt\" (UID: \"f38ae118-11a0-4c72-9d0b-750762779ee7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt" Dec 03 14:40:52 crc kubenswrapper[4751]: I1203 14:40:52.371666 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f38ae118-11a0-4c72-9d0b-750762779ee7-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt\" (UID: \"f38ae118-11a0-4c72-9d0b-750762779ee7\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt" Dec 03 14:40:52 crc kubenswrapper[4751]: I1203 14:40:52.372294 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5wkw\" (UniqueName: \"kubernetes.io/projected/f38ae118-11a0-4c72-9d0b-750762779ee7-kube-api-access-s5wkw\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt\" (UID: \"f38ae118-11a0-4c72-9d0b-750762779ee7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt" Dec 03 14:40:52 crc kubenswrapper[4751]: I1203 14:40:52.442961 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt" Dec 03 14:40:54 crc kubenswrapper[4751]: I1203 14:40:54.192206 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt"] Dec 03 14:40:54 crc kubenswrapper[4751]: W1203 14:40:54.194753 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf38ae118_11a0_4c72_9d0b_750762779ee7.slice/crio-4a28a8ae6dfeeaad06ee85cbdf48d89513c1c281d92dca1e2ba77d08ecd81a82 WatchSource:0}: Error finding container 4a28a8ae6dfeeaad06ee85cbdf48d89513c1c281d92dca1e2ba77d08ecd81a82: Status 404 returned error can't find the container with id 4a28a8ae6dfeeaad06ee85cbdf48d89513c1c281d92dca1e2ba77d08ecd81a82 Dec 03 14:40:54 crc kubenswrapper[4751]: I1203 14:40:54.956010 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt" event={"ID":"f38ae118-11a0-4c72-9d0b-750762779ee7","Type":"ContainerStarted","Data":"4a28a8ae6dfeeaad06ee85cbdf48d89513c1c281d92dca1e2ba77d08ecd81a82"} Dec 03 14:40:54 crc kubenswrapper[4751]: I1203 14:40:54.958077 4751 generic.go:334] "Generic (PLEG): container finished" podID="d5fd5425-70e4-4a79-8ea7-3326cae3908d" 
containerID="3a3e35c139f0ebc5543a5bce979a4196e8bdae6d84fef27b6163ea481e57d247" exitCode=0 Dec 03 14:40:54 crc kubenswrapper[4751]: I1203 14:40:54.958157 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d5fd5425-70e4-4a79-8ea7-3326cae3908d","Type":"ContainerDied","Data":"3a3e35c139f0ebc5543a5bce979a4196e8bdae6d84fef27b6163ea481e57d247"} Dec 03 14:40:55 crc kubenswrapper[4751]: I1203 14:40:55.970212 4751 generic.go:334] "Generic (PLEG): container finished" podID="4760c776-9212-42af-8bf2-928c79417922" containerID="a9a61c8d66a8e3560cc611afcc95fc6409e45746f4e74ff2f502db7e3ccfd7e4" exitCode=0 Dec 03 14:40:55 crc kubenswrapper[4751]: I1203 14:40:55.970292 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4760c776-9212-42af-8bf2-928c79417922","Type":"ContainerDied","Data":"a9a61c8d66a8e3560cc611afcc95fc6409e45746f4e74ff2f502db7e3ccfd7e4"} Dec 03 14:40:55 crc kubenswrapper[4751]: I1203 14:40:55.987279 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d5fd5425-70e4-4a79-8ea7-3326cae3908d","Type":"ContainerStarted","Data":"bb4e333a79830f8579106e6e0757d89767ba950b74a11b51af4e4827156e7bba"} Dec 03 14:40:55 crc kubenswrapper[4751]: I1203 14:40:55.988318 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:40:56 crc kubenswrapper[4751]: I1203 14:40:56.055978 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.05595392 podStartE2EDuration="37.05595392s" podCreationTimestamp="2025-12-03 14:40:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:40:56.046665437 +0000 UTC m=+1663.035020654" watchObservedRunningTime="2025-12-03 14:40:56.05595392 +0000 UTC m=+1663.044309157" 
Dec 03 14:40:57 crc kubenswrapper[4751]: I1203 14:40:57.011018 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4760c776-9212-42af-8bf2-928c79417922","Type":"ContainerStarted","Data":"599d9b4434923df62030a5d393f5fd7d9cc21c3cbcbc7e0cb200f070eabddbe4"} Dec 03 14:40:57 crc kubenswrapper[4751]: I1203 14:40:57.011589 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 03 14:40:57 crc kubenswrapper[4751]: I1203 14:40:57.060503 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=43.060484345 podStartE2EDuration="43.060484345s" podCreationTimestamp="2025-12-03 14:40:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:40:57.057908165 +0000 UTC m=+1664.046263402" watchObservedRunningTime="2025-12-03 14:40:57.060484345 +0000 UTC m=+1664.048839552" Dec 03 14:41:05 crc kubenswrapper[4751]: I1203 14:41:05.819835 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:41:05 crc kubenswrapper[4751]: I1203 14:41:05.820605 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:41:06 crc kubenswrapper[4751]: I1203 14:41:06.114767 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="4760c776-9212-42af-8bf2-928c79417922" 
containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.238:5671: connect: connection refused" Dec 03 14:41:07 crc kubenswrapper[4751]: I1203 14:41:07.161578 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt" event={"ID":"f38ae118-11a0-4c72-9d0b-750762779ee7","Type":"ContainerStarted","Data":"9610ed1c5042c2ee46a1b856c7ffe5997852daf146bc4df18b247cffc4603155"} Dec 03 14:41:07 crc kubenswrapper[4751]: I1203 14:41:07.182837 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt" podStartSLOduration=3.032221383 podStartE2EDuration="15.182812896s" podCreationTimestamp="2025-12-03 14:40:52 +0000 UTC" firstStartedPulling="2025-12-03 14:40:54.197205043 +0000 UTC m=+1661.185560260" lastFinishedPulling="2025-12-03 14:41:06.347796556 +0000 UTC m=+1673.336151773" observedRunningTime="2025-12-03 14:41:07.17888887 +0000 UTC m=+1674.167244087" watchObservedRunningTime="2025-12-03 14:41:07.182812896 +0000 UTC m=+1674.171168123" Dec 03 14:41:10 crc kubenswrapper[4751]: I1203 14:41:10.131569 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:41:16 crc kubenswrapper[4751]: I1203 14:41:16.116504 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 03 14:41:18 crc kubenswrapper[4751]: I1203 14:41:18.290583 4751 generic.go:334] "Generic (PLEG): container finished" podID="f38ae118-11a0-4c72-9d0b-750762779ee7" containerID="9610ed1c5042c2ee46a1b856c7ffe5997852daf146bc4df18b247cffc4603155" exitCode=0 Dec 03 14:41:18 crc kubenswrapper[4751]: I1203 14:41:18.290637 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt" 
event={"ID":"f38ae118-11a0-4c72-9d0b-750762779ee7","Type":"ContainerDied","Data":"9610ed1c5042c2ee46a1b856c7ffe5997852daf146bc4df18b247cffc4603155"} Dec 03 14:41:19 crc kubenswrapper[4751]: I1203 14:41:19.846966 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt" Dec 03 14:41:19 crc kubenswrapper[4751]: I1203 14:41:19.988108 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f38ae118-11a0-4c72-9d0b-750762779ee7-ssh-key\") pod \"f38ae118-11a0-4c72-9d0b-750762779ee7\" (UID: \"f38ae118-11a0-4c72-9d0b-750762779ee7\") " Dec 03 14:41:19 crc kubenswrapper[4751]: I1203 14:41:19.988164 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5wkw\" (UniqueName: \"kubernetes.io/projected/f38ae118-11a0-4c72-9d0b-750762779ee7-kube-api-access-s5wkw\") pod \"f38ae118-11a0-4c72-9d0b-750762779ee7\" (UID: \"f38ae118-11a0-4c72-9d0b-750762779ee7\") " Dec 03 14:41:19 crc kubenswrapper[4751]: I1203 14:41:19.988346 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f38ae118-11a0-4c72-9d0b-750762779ee7-repo-setup-combined-ca-bundle\") pod \"f38ae118-11a0-4c72-9d0b-750762779ee7\" (UID: \"f38ae118-11a0-4c72-9d0b-750762779ee7\") " Dec 03 14:41:19 crc kubenswrapper[4751]: I1203 14:41:19.988384 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f38ae118-11a0-4c72-9d0b-750762779ee7-inventory\") pod \"f38ae118-11a0-4c72-9d0b-750762779ee7\" (UID: \"f38ae118-11a0-4c72-9d0b-750762779ee7\") " Dec 03 14:41:19 crc kubenswrapper[4751]: I1203 14:41:19.996038 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/f38ae118-11a0-4c72-9d0b-750762779ee7-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "f38ae118-11a0-4c72-9d0b-750762779ee7" (UID: "f38ae118-11a0-4c72-9d0b-750762779ee7"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:41:19 crc kubenswrapper[4751]: I1203 14:41:19.996097 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f38ae118-11a0-4c72-9d0b-750762779ee7-kube-api-access-s5wkw" (OuterVolumeSpecName: "kube-api-access-s5wkw") pod "f38ae118-11a0-4c72-9d0b-750762779ee7" (UID: "f38ae118-11a0-4c72-9d0b-750762779ee7"). InnerVolumeSpecName "kube-api-access-s5wkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:41:20 crc kubenswrapper[4751]: I1203 14:41:20.022966 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f38ae118-11a0-4c72-9d0b-750762779ee7-inventory" (OuterVolumeSpecName: "inventory") pod "f38ae118-11a0-4c72-9d0b-750762779ee7" (UID: "f38ae118-11a0-4c72-9d0b-750762779ee7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:41:20 crc kubenswrapper[4751]: I1203 14:41:20.027970 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f38ae118-11a0-4c72-9d0b-750762779ee7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f38ae118-11a0-4c72-9d0b-750762779ee7" (UID: "f38ae118-11a0-4c72-9d0b-750762779ee7"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:41:20 crc kubenswrapper[4751]: I1203 14:41:20.091154 4751 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f38ae118-11a0-4c72-9d0b-750762779ee7-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:41:20 crc kubenswrapper[4751]: I1203 14:41:20.091202 4751 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f38ae118-11a0-4c72-9d0b-750762779ee7-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 14:41:20 crc kubenswrapper[4751]: I1203 14:41:20.091215 4751 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f38ae118-11a0-4c72-9d0b-750762779ee7-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 14:41:20 crc kubenswrapper[4751]: I1203 14:41:20.091227 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5wkw\" (UniqueName: \"kubernetes.io/projected/f38ae118-11a0-4c72-9d0b-750762779ee7-kube-api-access-s5wkw\") on node \"crc\" DevicePath \"\"" Dec 03 14:41:20 crc kubenswrapper[4751]: I1203 14:41:20.291899 4751 scope.go:117] "RemoveContainer" containerID="249a102ad07502d29f0d959a39bdbf77b8cd09f8449eba408c5c841af432f912" Dec 03 14:41:20 crc kubenswrapper[4751]: I1203 14:41:20.313744 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt" event={"ID":"f38ae118-11a0-4c72-9d0b-750762779ee7","Type":"ContainerDied","Data":"4a28a8ae6dfeeaad06ee85cbdf48d89513c1c281d92dca1e2ba77d08ecd81a82"} Dec 03 14:41:20 crc kubenswrapper[4751]: I1203 14:41:20.313783 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a28a8ae6dfeeaad06ee85cbdf48d89513c1c281d92dca1e2ba77d08ecd81a82" Dec 03 14:41:20 crc kubenswrapper[4751]: I1203 14:41:20.313830 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt" Dec 03 14:41:20 crc kubenswrapper[4751]: I1203 14:41:20.330255 4751 scope.go:117] "RemoveContainer" containerID="fd4ad307c6770e8a8b0b6725bcb1ec36e7b9ae146d3b55050e11d998dcbff3ff" Dec 03 14:41:20 crc kubenswrapper[4751]: I1203 14:41:20.400868 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-w96tn"] Dec 03 14:41:20 crc kubenswrapper[4751]: E1203 14:41:20.401443 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f38ae118-11a0-4c72-9d0b-750762779ee7" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 03 14:41:20 crc kubenswrapper[4751]: I1203 14:41:20.401467 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f38ae118-11a0-4c72-9d0b-750762779ee7" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 03 14:41:20 crc kubenswrapper[4751]: I1203 14:41:20.401755 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="f38ae118-11a0-4c72-9d0b-750762779ee7" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 03 14:41:20 crc kubenswrapper[4751]: I1203 14:41:20.402622 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w96tn" Dec 03 14:41:20 crc kubenswrapper[4751]: I1203 14:41:20.405482 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 14:41:20 crc kubenswrapper[4751]: I1203 14:41:20.405845 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xf9cf" Dec 03 14:41:20 crc kubenswrapper[4751]: I1203 14:41:20.405969 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 14:41:20 crc kubenswrapper[4751]: I1203 14:41:20.406087 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 14:41:20 crc kubenswrapper[4751]: I1203 14:41:20.413120 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-w96tn"] Dec 03 14:41:20 crc kubenswrapper[4751]: I1203 14:41:20.499855 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/228ac9f7-0635-4a38-8d51-038e9a588a7d-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-w96tn\" (UID: \"228ac9f7-0635-4a38-8d51-038e9a588a7d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w96tn" Dec 03 14:41:20 crc kubenswrapper[4751]: I1203 14:41:20.500361 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/228ac9f7-0635-4a38-8d51-038e9a588a7d-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-w96tn\" (UID: \"228ac9f7-0635-4a38-8d51-038e9a588a7d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w96tn" Dec 03 14:41:20 crc kubenswrapper[4751]: I1203 14:41:20.500683 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp94m\" (UniqueName: \"kubernetes.io/projected/228ac9f7-0635-4a38-8d51-038e9a588a7d-kube-api-access-kp94m\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-w96tn\" (UID: \"228ac9f7-0635-4a38-8d51-038e9a588a7d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w96tn" Dec 03 14:41:20 crc kubenswrapper[4751]: I1203 14:41:20.602255 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp94m\" (UniqueName: \"kubernetes.io/projected/228ac9f7-0635-4a38-8d51-038e9a588a7d-kube-api-access-kp94m\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-w96tn\" (UID: \"228ac9f7-0635-4a38-8d51-038e9a588a7d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w96tn" Dec 03 14:41:20 crc kubenswrapper[4751]: I1203 14:41:20.602426 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/228ac9f7-0635-4a38-8d51-038e9a588a7d-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-w96tn\" (UID: \"228ac9f7-0635-4a38-8d51-038e9a588a7d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w96tn" Dec 03 14:41:20 crc kubenswrapper[4751]: I1203 14:41:20.603433 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/228ac9f7-0635-4a38-8d51-038e9a588a7d-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-w96tn\" (UID: \"228ac9f7-0635-4a38-8d51-038e9a588a7d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w96tn" Dec 03 14:41:20 crc kubenswrapper[4751]: I1203 14:41:20.608189 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/228ac9f7-0635-4a38-8d51-038e9a588a7d-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-w96tn\" (UID: \"228ac9f7-0635-4a38-8d51-038e9a588a7d\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w96tn" Dec 03 14:41:20 crc kubenswrapper[4751]: I1203 14:41:20.608311 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/228ac9f7-0635-4a38-8d51-038e9a588a7d-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-w96tn\" (UID: \"228ac9f7-0635-4a38-8d51-038e9a588a7d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w96tn" Dec 03 14:41:20 crc kubenswrapper[4751]: I1203 14:41:20.622422 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp94m\" (UniqueName: \"kubernetes.io/projected/228ac9f7-0635-4a38-8d51-038e9a588a7d-kube-api-access-kp94m\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-w96tn\" (UID: \"228ac9f7-0635-4a38-8d51-038e9a588a7d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w96tn" Dec 03 14:41:20 crc kubenswrapper[4751]: I1203 14:41:20.731911 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w96tn" Dec 03 14:41:21 crc kubenswrapper[4751]: I1203 14:41:21.285395 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-w96tn"] Dec 03 14:41:21 crc kubenswrapper[4751]: W1203 14:41:21.312906 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod228ac9f7_0635_4a38_8d51_038e9a588a7d.slice/crio-8e4519d18602dc47b473c6b7ac5c94a01e7bd91844a42b3810e519012df6b6f6 WatchSource:0}: Error finding container 8e4519d18602dc47b473c6b7ac5c94a01e7bd91844a42b3810e519012df6b6f6: Status 404 returned error can't find the container with id 8e4519d18602dc47b473c6b7ac5c94a01e7bd91844a42b3810e519012df6b6f6 Dec 03 14:41:21 crc kubenswrapper[4751]: I1203 14:41:21.323606 4751 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 14:41:21 crc kubenswrapper[4751]: I1203 14:41:21.343138 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w96tn" event={"ID":"228ac9f7-0635-4a38-8d51-038e9a588a7d","Type":"ContainerStarted","Data":"8e4519d18602dc47b473c6b7ac5c94a01e7bd91844a42b3810e519012df6b6f6"} Dec 03 14:41:22 crc kubenswrapper[4751]: I1203 14:41:22.355735 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w96tn" event={"ID":"228ac9f7-0635-4a38-8d51-038e9a588a7d","Type":"ContainerStarted","Data":"a0917a58796f477862ef20e200b96e41a0b058f6ca16e8e220efd67d9c45bb2c"} Dec 03 14:41:22 crc kubenswrapper[4751]: I1203 14:41:22.388679 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w96tn" podStartSLOduration=1.887420143 podStartE2EDuration="2.388651517s" podCreationTimestamp="2025-12-03 14:41:20 +0000 UTC" 
firstStartedPulling="2025-12-03 14:41:21.323349921 +0000 UTC m=+1688.311705148" lastFinishedPulling="2025-12-03 14:41:21.824581305 +0000 UTC m=+1688.812936522" observedRunningTime="2025-12-03 14:41:22.373738102 +0000 UTC m=+1689.362093339" watchObservedRunningTime="2025-12-03 14:41:22.388651517 +0000 UTC m=+1689.377006734" Dec 03 14:41:25 crc kubenswrapper[4751]: I1203 14:41:25.397066 4751 generic.go:334] "Generic (PLEG): container finished" podID="228ac9f7-0635-4a38-8d51-038e9a588a7d" containerID="a0917a58796f477862ef20e200b96e41a0b058f6ca16e8e220efd67d9c45bb2c" exitCode=0 Dec 03 14:41:25 crc kubenswrapper[4751]: I1203 14:41:25.397165 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w96tn" event={"ID":"228ac9f7-0635-4a38-8d51-038e9a588a7d","Type":"ContainerDied","Data":"a0917a58796f477862ef20e200b96e41a0b058f6ca16e8e220efd67d9c45bb2c"} Dec 03 14:41:26 crc kubenswrapper[4751]: I1203 14:41:26.473804 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0" Dec 03 14:41:26 crc kubenswrapper[4751]: I1203 14:41:26.994447 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w96tn" Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.142821 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/228ac9f7-0635-4a38-8d51-038e9a588a7d-inventory\") pod \"228ac9f7-0635-4a38-8d51-038e9a588a7d\" (UID: \"228ac9f7-0635-4a38-8d51-038e9a588a7d\") " Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.142923 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/228ac9f7-0635-4a38-8d51-038e9a588a7d-ssh-key\") pod \"228ac9f7-0635-4a38-8d51-038e9a588a7d\" (UID: \"228ac9f7-0635-4a38-8d51-038e9a588a7d\") " Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.142991 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp94m\" (UniqueName: \"kubernetes.io/projected/228ac9f7-0635-4a38-8d51-038e9a588a7d-kube-api-access-kp94m\") pod \"228ac9f7-0635-4a38-8d51-038e9a588a7d\" (UID: \"228ac9f7-0635-4a38-8d51-038e9a588a7d\") " Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.148515 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/228ac9f7-0635-4a38-8d51-038e9a588a7d-kube-api-access-kp94m" (OuterVolumeSpecName: "kube-api-access-kp94m") pod "228ac9f7-0635-4a38-8d51-038e9a588a7d" (UID: "228ac9f7-0635-4a38-8d51-038e9a588a7d"). InnerVolumeSpecName "kube-api-access-kp94m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.174183 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/228ac9f7-0635-4a38-8d51-038e9a588a7d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "228ac9f7-0635-4a38-8d51-038e9a588a7d" (UID: "228ac9f7-0635-4a38-8d51-038e9a588a7d"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.175892 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/228ac9f7-0635-4a38-8d51-038e9a588a7d-inventory" (OuterVolumeSpecName: "inventory") pod "228ac9f7-0635-4a38-8d51-038e9a588a7d" (UID: "228ac9f7-0635-4a38-8d51-038e9a588a7d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.245645 4751 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/228ac9f7-0635-4a38-8d51-038e9a588a7d-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.245687 4751 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/228ac9f7-0635-4a38-8d51-038e9a588a7d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.245700 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp94m\" (UniqueName: \"kubernetes.io/projected/228ac9f7-0635-4a38-8d51-038e9a588a7d-kube-api-access-kp94m\") on node \"crc\" DevicePath \"\"" Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.422958 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w96tn" event={"ID":"228ac9f7-0635-4a38-8d51-038e9a588a7d","Type":"ContainerDied","Data":"8e4519d18602dc47b473c6b7ac5c94a01e7bd91844a42b3810e519012df6b6f6"} Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.422994 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e4519d18602dc47b473c6b7ac5c94a01e7bd91844a42b3810e519012df6b6f6" Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.423019 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w96tn" Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.522400 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k"] Dec 03 14:41:27 crc kubenswrapper[4751]: E1203 14:41:27.523175 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="228ac9f7-0635-4a38-8d51-038e9a588a7d" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.523191 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="228ac9f7-0635-4a38-8d51-038e9a588a7d" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.523431 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="228ac9f7-0635-4a38-8d51-038e9a588a7d" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.524166 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k" Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.527163 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.527614 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.527809 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.528018 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xf9cf" Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.535270 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k"] Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.656995 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd7a02af-abd1-4669-88f3-7e1d1117d8e9-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k\" (UID: \"cd7a02af-abd1-4669-88f3-7e1d1117d8e9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k" Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.657214 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n78b4\" (UniqueName: \"kubernetes.io/projected/cd7a02af-abd1-4669-88f3-7e1d1117d8e9-kube-api-access-n78b4\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k\" (UID: \"cd7a02af-abd1-4669-88f3-7e1d1117d8e9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k" Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.657276 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd7a02af-abd1-4669-88f3-7e1d1117d8e9-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k\" (UID: \"cd7a02af-abd1-4669-88f3-7e1d1117d8e9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k" Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.657313 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd7a02af-abd1-4669-88f3-7e1d1117d8e9-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k\" (UID: \"cd7a02af-abd1-4669-88f3-7e1d1117d8e9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k" Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.759529 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n78b4\" (UniqueName: \"kubernetes.io/projected/cd7a02af-abd1-4669-88f3-7e1d1117d8e9-kube-api-access-n78b4\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k\" (UID: \"cd7a02af-abd1-4669-88f3-7e1d1117d8e9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k" Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.759627 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd7a02af-abd1-4669-88f3-7e1d1117d8e9-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k\" (UID: \"cd7a02af-abd1-4669-88f3-7e1d1117d8e9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k" Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.759657 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd7a02af-abd1-4669-88f3-7e1d1117d8e9-inventory\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k\" (UID: \"cd7a02af-abd1-4669-88f3-7e1d1117d8e9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k" Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.759740 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd7a02af-abd1-4669-88f3-7e1d1117d8e9-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k\" (UID: \"cd7a02af-abd1-4669-88f3-7e1d1117d8e9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k" Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.764203 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd7a02af-abd1-4669-88f3-7e1d1117d8e9-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k\" (UID: \"cd7a02af-abd1-4669-88f3-7e1d1117d8e9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k" Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.764311 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd7a02af-abd1-4669-88f3-7e1d1117d8e9-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k\" (UID: \"cd7a02af-abd1-4669-88f3-7e1d1117d8e9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k" Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.766229 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd7a02af-abd1-4669-88f3-7e1d1117d8e9-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k\" (UID: \"cd7a02af-abd1-4669-88f3-7e1d1117d8e9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k" Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.782510 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-n78b4\" (UniqueName: \"kubernetes.io/projected/cd7a02af-abd1-4669-88f3-7e1d1117d8e9-kube-api-access-n78b4\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k\" (UID: \"cd7a02af-abd1-4669-88f3-7e1d1117d8e9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k" Dec 03 14:41:27 crc kubenswrapper[4751]: I1203 14:41:27.859314 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k" Dec 03 14:41:28 crc kubenswrapper[4751]: I1203 14:41:28.413947 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k"] Dec 03 14:41:28 crc kubenswrapper[4751]: I1203 14:41:28.434651 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k" event={"ID":"cd7a02af-abd1-4669-88f3-7e1d1117d8e9","Type":"ContainerStarted","Data":"73d91f85e74dc43880a40f1b66bf418db23ba4b7c6d077db6e9d4963af4a313a"} Dec 03 14:41:29 crc kubenswrapper[4751]: I1203 14:41:29.446390 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k" event={"ID":"cd7a02af-abd1-4669-88f3-7e1d1117d8e9","Type":"ContainerStarted","Data":"8b39d38e44caf8f0b98b1d7b2cd1064376393b14e42d9d419ace6352a2e87bea"} Dec 03 14:41:29 crc kubenswrapper[4751]: I1203 14:41:29.465966 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k" podStartSLOduration=1.759999104 podStartE2EDuration="2.46594561s" podCreationTimestamp="2025-12-03 14:41:27 +0000 UTC" firstStartedPulling="2025-12-03 14:41:28.404594241 +0000 UTC m=+1695.392949458" lastFinishedPulling="2025-12-03 14:41:29.110540747 +0000 UTC m=+1696.098895964" observedRunningTime="2025-12-03 14:41:29.464581183 +0000 UTC m=+1696.452936400" watchObservedRunningTime="2025-12-03 14:41:29.46594561 
+0000 UTC m=+1696.454300827" Dec 03 14:41:35 crc kubenswrapper[4751]: I1203 14:41:35.820317 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:41:35 crc kubenswrapper[4751]: I1203 14:41:35.820910 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:41:35 crc kubenswrapper[4751]: I1203 14:41:35.820987 4751 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-djf67" Dec 03 14:41:35 crc kubenswrapper[4751]: I1203 14:41:35.822011 4751 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7d4fc1d83b8c3695ce824cacf39e20506d412e0d667bfed74287b81a6f8f3e45"} pod="openshift-machine-config-operator/machine-config-daemon-djf67" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 14:41:35 crc kubenswrapper[4751]: I1203 14:41:35.822086 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" containerID="cri-o://7d4fc1d83b8c3695ce824cacf39e20506d412e0d667bfed74287b81a6f8f3e45" gracePeriod=600 Dec 03 14:41:36 crc kubenswrapper[4751]: E1203 14:41:36.449146 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:41:36 crc kubenswrapper[4751]: I1203 14:41:36.537626 4751 generic.go:334] "Generic (PLEG): container finished" podID="385620eb-744d-423e-b02b-1274f3075689" containerID="7d4fc1d83b8c3695ce824cacf39e20506d412e0d667bfed74287b81a6f8f3e45" exitCode=0 Dec 03 14:41:36 crc kubenswrapper[4751]: I1203 14:41:36.537675 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerDied","Data":"7d4fc1d83b8c3695ce824cacf39e20506d412e0d667bfed74287b81a6f8f3e45"} Dec 03 14:41:36 crc kubenswrapper[4751]: I1203 14:41:36.537707 4751 scope.go:117] "RemoveContainer" containerID="8513ef227e39ef06a8d05cad17c9635fc3ec8cf5ec5acd20288a621754b77ca6" Dec 03 14:41:36 crc kubenswrapper[4751]: I1203 14:41:36.538365 4751 scope.go:117] "RemoveContainer" containerID="7d4fc1d83b8c3695ce824cacf39e20506d412e0d667bfed74287b81a6f8f3e45" Dec 03 14:41:36 crc kubenswrapper[4751]: E1203 14:41:36.538611 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:41:52 crc kubenswrapper[4751]: I1203 14:41:52.314437 4751 scope.go:117] "RemoveContainer" containerID="7d4fc1d83b8c3695ce824cacf39e20506d412e0d667bfed74287b81a6f8f3e45" Dec 03 14:41:52 crc kubenswrapper[4751]: E1203 14:41:52.315271 4751 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:42:04 crc kubenswrapper[4751]: I1203 14:42:04.314102 4751 scope.go:117] "RemoveContainer" containerID="7d4fc1d83b8c3695ce824cacf39e20506d412e0d667bfed74287b81a6f8f3e45" Dec 03 14:42:04 crc kubenswrapper[4751]: E1203 14:42:04.314828 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:42:18 crc kubenswrapper[4751]: I1203 14:42:18.313829 4751 scope.go:117] "RemoveContainer" containerID="7d4fc1d83b8c3695ce824cacf39e20506d412e0d667bfed74287b81a6f8f3e45" Dec 03 14:42:18 crc kubenswrapper[4751]: E1203 14:42:18.314722 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:42:20 crc kubenswrapper[4751]: I1203 14:42:20.716135 4751 scope.go:117] "RemoveContainer" containerID="f74eebf05afb03acb40dab92e5dd6d14e9c474658599c36667cf2b6599a9e351" Dec 03 14:42:29 crc kubenswrapper[4751]: I1203 
14:42:29.314142 4751 scope.go:117] "RemoveContainer" containerID="7d4fc1d83b8c3695ce824cacf39e20506d412e0d667bfed74287b81a6f8f3e45" Dec 03 14:42:29 crc kubenswrapper[4751]: E1203 14:42:29.314908 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:42:43 crc kubenswrapper[4751]: I1203 14:42:43.321780 4751 scope.go:117] "RemoveContainer" containerID="7d4fc1d83b8c3695ce824cacf39e20506d412e0d667bfed74287b81a6f8f3e45" Dec 03 14:42:43 crc kubenswrapper[4751]: E1203 14:42:43.323757 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:42:56 crc kubenswrapper[4751]: I1203 14:42:56.315044 4751 scope.go:117] "RemoveContainer" containerID="7d4fc1d83b8c3695ce824cacf39e20506d412e0d667bfed74287b81a6f8f3e45" Dec 03 14:42:56 crc kubenswrapper[4751]: E1203 14:42:56.315745 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:43:10 crc 
kubenswrapper[4751]: I1203 14:43:10.313946 4751 scope.go:117] "RemoveContainer" containerID="7d4fc1d83b8c3695ce824cacf39e20506d412e0d667bfed74287b81a6f8f3e45" Dec 03 14:43:10 crc kubenswrapper[4751]: E1203 14:43:10.314694 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:43:20 crc kubenswrapper[4751]: I1203 14:43:20.779222 4751 scope.go:117] "RemoveContainer" containerID="30534927c508640ffb69a006bdfb4947051d7419544a169322d10dcceef1d7f0" Dec 03 14:43:20 crc kubenswrapper[4751]: I1203 14:43:20.802995 4751 scope.go:117] "RemoveContainer" containerID="36f089dd2c917db47de913ff3ada61cc4274bfa92ef031abbb92e8fe6ccd8daa" Dec 03 14:43:20 crc kubenswrapper[4751]: I1203 14:43:20.829063 4751 scope.go:117] "RemoveContainer" containerID="140d4f247f184cab56abb6c9a63b668e98b54b136f9c22a120bc01761693764f" Dec 03 14:43:20 crc kubenswrapper[4751]: I1203 14:43:20.849427 4751 scope.go:117] "RemoveContainer" containerID="05d5253453438458652023650260f255514ba6d4730c8a11872caedcf77835bb" Dec 03 14:43:20 crc kubenswrapper[4751]: I1203 14:43:20.875463 4751 scope.go:117] "RemoveContainer" containerID="d68df586dcf7a42d897b5e6a35592b0dc3b7f1bcdbf2920ccf45275ac032d959" Dec 03 14:43:20 crc kubenswrapper[4751]: I1203 14:43:20.902550 4751 scope.go:117] "RemoveContainer" containerID="24003421e57923c6199a8e1dfb412c6d702c4649da37c29574a520e9fa6f3964" Dec 03 14:43:20 crc kubenswrapper[4751]: I1203 14:43:20.928756 4751 scope.go:117] "RemoveContainer" containerID="474b632ca34764eba6a79153a2f8678f814f73937e091cbc22f1cda425c0974e" Dec 03 14:43:20 crc kubenswrapper[4751]: I1203 14:43:20.951196 4751 scope.go:117] 
"RemoveContainer" containerID="9975ca1e958cc4bc981e0f5df51a22e3e4cbb50b6083f294d73b7dcda5e275aa" Dec 03 14:43:20 crc kubenswrapper[4751]: I1203 14:43:20.976589 4751 scope.go:117] "RemoveContainer" containerID="401be6a3772e336862f5c1b68804af0800810cfd0fa47538da48aad7379ec0b4" Dec 03 14:43:24 crc kubenswrapper[4751]: I1203 14:43:24.313904 4751 scope.go:117] "RemoveContainer" containerID="7d4fc1d83b8c3695ce824cacf39e20506d412e0d667bfed74287b81a6f8f3e45" Dec 03 14:43:24 crc kubenswrapper[4751]: E1203 14:43:24.314828 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:43:35 crc kubenswrapper[4751]: I1203 14:43:35.314302 4751 scope.go:117] "RemoveContainer" containerID="7d4fc1d83b8c3695ce824cacf39e20506d412e0d667bfed74287b81a6f8f3e45" Dec 03 14:43:35 crc kubenswrapper[4751]: E1203 14:43:35.315107 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:43:48 crc kubenswrapper[4751]: I1203 14:43:48.314787 4751 scope.go:117] "RemoveContainer" containerID="7d4fc1d83b8c3695ce824cacf39e20506d412e0d667bfed74287b81a6f8f3e45" Dec 03 14:43:48 crc kubenswrapper[4751]: E1203 14:43:48.315501 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:43:59 crc kubenswrapper[4751]: I1203 14:43:59.314708 4751 scope.go:117] "RemoveContainer" containerID="7d4fc1d83b8c3695ce824cacf39e20506d412e0d667bfed74287b81a6f8f3e45" Dec 03 14:43:59 crc kubenswrapper[4751]: E1203 14:43:59.315681 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:44:12 crc kubenswrapper[4751]: I1203 14:44:12.314955 4751 scope.go:117] "RemoveContainer" containerID="7d4fc1d83b8c3695ce824cacf39e20506d412e0d667bfed74287b81a6f8f3e45" Dec 03 14:44:12 crc kubenswrapper[4751]: E1203 14:44:12.315740 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:44:21 crc kubenswrapper[4751]: I1203 14:44:21.109447 4751 scope.go:117] "RemoveContainer" containerID="1d1cec24e3ad2c6a8d6216ff4797836ae68fa9b1a682e26ea519c0e7e65b3652" Dec 03 14:44:21 crc kubenswrapper[4751]: I1203 14:44:21.139954 4751 scope.go:117] "RemoveContainer" 
containerID="5111d494711d3fc952353cff34682b84c22433e9c6b8f15b6005aa2dbc34034f" Dec 03 14:44:21 crc kubenswrapper[4751]: I1203 14:44:21.170076 4751 scope.go:117] "RemoveContainer" containerID="a884400999616e264f0d5bd2debc788723a0602f4baed37efb42d9f33c14d9f7" Dec 03 14:44:21 crc kubenswrapper[4751]: I1203 14:44:21.204813 4751 scope.go:117] "RemoveContainer" containerID="1ba7c83cc6fcae2a6de327eb53ffa20bbea6d19c1be4e78d949dc8a5d4d21393" Dec 03 14:44:22 crc kubenswrapper[4751]: I1203 14:44:22.056006 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-e7ba-account-create-update-nnclp"] Dec 03 14:44:22 crc kubenswrapper[4751]: I1203 14:44:22.070964 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-78w4f"] Dec 03 14:44:22 crc kubenswrapper[4751]: I1203 14:44:22.081066 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-70be-account-create-update-rq9wc"] Dec 03 14:44:22 crc kubenswrapper[4751]: I1203 14:44:22.091224 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-e7ba-account-create-update-nnclp"] Dec 03 14:44:22 crc kubenswrapper[4751]: I1203 14:44:22.103388 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-78w4f"] Dec 03 14:44:22 crc kubenswrapper[4751]: I1203 14:44:22.110967 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-70be-account-create-update-rq9wc"] Dec 03 14:44:22 crc kubenswrapper[4751]: I1203 14:44:22.120055 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-rrw72"] Dec 03 14:44:22 crc kubenswrapper[4751]: I1203 14:44:22.129142 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-rrw72"] Dec 03 14:44:23 crc kubenswrapper[4751]: I1203 14:44:23.051420 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-bddc-account-create-update-zv8f2"] Dec 03 14:44:23 crc 
kubenswrapper[4751]: I1203 14:44:23.063553 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-bddc-account-create-update-zv8f2"] Dec 03 14:44:23 crc kubenswrapper[4751]: I1203 14:44:23.320635 4751 scope.go:117] "RemoveContainer" containerID="7d4fc1d83b8c3695ce824cacf39e20506d412e0d667bfed74287b81a6f8f3e45" Dec 03 14:44:23 crc kubenswrapper[4751]: E1203 14:44:23.321041 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:44:23 crc kubenswrapper[4751]: I1203 14:44:23.324631 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24b70371-2bd9-44dc-a70a-c522ffb2125a" path="/var/lib/kubelet/pods/24b70371-2bd9-44dc-a70a-c522ffb2125a/volumes" Dec 03 14:44:23 crc kubenswrapper[4751]: I1203 14:44:23.325368 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27b5d97a-cd82-45be-ae36-bd97f293b7cd" path="/var/lib/kubelet/pods/27b5d97a-cd82-45be-ae36-bd97f293b7cd/volumes" Dec 03 14:44:23 crc kubenswrapper[4751]: I1203 14:44:23.325981 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="638fec0e-031b-4c73-828b-95157a9dd522" path="/var/lib/kubelet/pods/638fec0e-031b-4c73-828b-95157a9dd522/volumes" Dec 03 14:44:23 crc kubenswrapper[4751]: I1203 14:44:23.326863 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5413bb7-fc5f-47b1-b2c1-03b66cab3b92" path="/var/lib/kubelet/pods/d5413bb7-fc5f-47b1-b2c1-03b66cab3b92/volumes" Dec 03 14:44:23 crc kubenswrapper[4751]: I1203 14:44:23.327998 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d58931ec-41c3-417e-b1b6-23d8855a0dbd" path="/var/lib/kubelet/pods/d58931ec-41c3-417e-b1b6-23d8855a0dbd/volumes" Dec 03 14:44:24 crc kubenswrapper[4751]: I1203 14:44:24.027471 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-7q6w6"] Dec 03 14:44:24 crc kubenswrapper[4751]: I1203 14:44:24.039552 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-7q6w6"] Dec 03 14:44:25 crc kubenswrapper[4751]: I1203 14:44:25.325725 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1389c328-4d38-4b2a-b8ca-54e69fa59035" path="/var/lib/kubelet/pods/1389c328-4d38-4b2a-b8ca-54e69fa59035/volumes" Dec 03 14:44:38 crc kubenswrapper[4751]: I1203 14:44:38.315075 4751 scope.go:117] "RemoveContainer" containerID="7d4fc1d83b8c3695ce824cacf39e20506d412e0d667bfed74287b81a6f8f3e45" Dec 03 14:44:38 crc kubenswrapper[4751]: E1203 14:44:38.316075 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:44:40 crc kubenswrapper[4751]: I1203 14:44:40.606493 4751 generic.go:334] "Generic (PLEG): container finished" podID="cd7a02af-abd1-4669-88f3-7e1d1117d8e9" containerID="8b39d38e44caf8f0b98b1d7b2cd1064376393b14e42d9d419ace6352a2e87bea" exitCode=0 Dec 03 14:44:40 crc kubenswrapper[4751]: I1203 14:44:40.606581 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k" event={"ID":"cd7a02af-abd1-4669-88f3-7e1d1117d8e9","Type":"ContainerDied","Data":"8b39d38e44caf8f0b98b1d7b2cd1064376393b14e42d9d419ace6352a2e87bea"} Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 
14:44:42.127379 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k" Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 14:44:42.226360 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd7a02af-abd1-4669-88f3-7e1d1117d8e9-ssh-key\") pod \"cd7a02af-abd1-4669-88f3-7e1d1117d8e9\" (UID: \"cd7a02af-abd1-4669-88f3-7e1d1117d8e9\") " Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 14:44:42.226494 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n78b4\" (UniqueName: \"kubernetes.io/projected/cd7a02af-abd1-4669-88f3-7e1d1117d8e9-kube-api-access-n78b4\") pod \"cd7a02af-abd1-4669-88f3-7e1d1117d8e9\" (UID: \"cd7a02af-abd1-4669-88f3-7e1d1117d8e9\") " Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 14:44:42.226567 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd7a02af-abd1-4669-88f3-7e1d1117d8e9-inventory\") pod \"cd7a02af-abd1-4669-88f3-7e1d1117d8e9\" (UID: \"cd7a02af-abd1-4669-88f3-7e1d1117d8e9\") " Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 14:44:42.226617 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd7a02af-abd1-4669-88f3-7e1d1117d8e9-bootstrap-combined-ca-bundle\") pod \"cd7a02af-abd1-4669-88f3-7e1d1117d8e9\" (UID: \"cd7a02af-abd1-4669-88f3-7e1d1117d8e9\") " Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 14:44:42.238597 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd7a02af-abd1-4669-88f3-7e1d1117d8e9-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "cd7a02af-abd1-4669-88f3-7e1d1117d8e9" (UID: "cd7a02af-abd1-4669-88f3-7e1d1117d8e9"). 
InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 14:44:42.238692 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd7a02af-abd1-4669-88f3-7e1d1117d8e9-kube-api-access-n78b4" (OuterVolumeSpecName: "kube-api-access-n78b4") pod "cd7a02af-abd1-4669-88f3-7e1d1117d8e9" (UID: "cd7a02af-abd1-4669-88f3-7e1d1117d8e9"). InnerVolumeSpecName "kube-api-access-n78b4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 14:44:42.255347 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd7a02af-abd1-4669-88f3-7e1d1117d8e9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cd7a02af-abd1-4669-88f3-7e1d1117d8e9" (UID: "cd7a02af-abd1-4669-88f3-7e1d1117d8e9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 14:44:42.256933 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd7a02af-abd1-4669-88f3-7e1d1117d8e9-inventory" (OuterVolumeSpecName: "inventory") pod "cd7a02af-abd1-4669-88f3-7e1d1117d8e9" (UID: "cd7a02af-abd1-4669-88f3-7e1d1117d8e9"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 14:44:42.329351 4751 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd7a02af-abd1-4669-88f3-7e1d1117d8e9-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 14:44:42.329435 4751 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd7a02af-abd1-4669-88f3-7e1d1117d8e9-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 14:44:42.329448 4751 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd7a02af-abd1-4669-88f3-7e1d1117d8e9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 14:44:42.329457 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n78b4\" (UniqueName: \"kubernetes.io/projected/cd7a02af-abd1-4669-88f3-7e1d1117d8e9-kube-api-access-n78b4\") on node \"crc\" DevicePath \"\"" Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 14:44:42.629709 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k" event={"ID":"cd7a02af-abd1-4669-88f3-7e1d1117d8e9","Type":"ContainerDied","Data":"73d91f85e74dc43880a40f1b66bf418db23ba4b7c6d077db6e9d4963af4a313a"} Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 14:44:42.629737 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k" Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 14:44:42.629751 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73d91f85e74dc43880a40f1b66bf418db23ba4b7c6d077db6e9d4963af4a313a" Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 14:44:42.707134 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rt2j2"] Dec 03 14:44:42 crc kubenswrapper[4751]: E1203 14:44:42.707661 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd7a02af-abd1-4669-88f3-7e1d1117d8e9" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 14:44:42.707688 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd7a02af-abd1-4669-88f3-7e1d1117d8e9" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 14:44:42.707937 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd7a02af-abd1-4669-88f3-7e1d1117d8e9" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 14:44:42.708937 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rt2j2" Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 14:44:42.711869 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 14:44:42.711870 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 14:44:42.712205 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 14:44:42.713890 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xf9cf" Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 14:44:42.718937 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rt2j2"] Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 14:44:42.736836 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f6373bc-f6a3-478f-92f5-8e311a5fd86c-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rt2j2\" (UID: \"4f6373bc-f6a3-478f-92f5-8e311a5fd86c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rt2j2" Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 14:44:42.736994 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f6373bc-f6a3-478f-92f5-8e311a5fd86c-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rt2j2\" (UID: \"4f6373bc-f6a3-478f-92f5-8e311a5fd86c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rt2j2" Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 14:44:42.737028 4751 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qq4s\" (UniqueName: \"kubernetes.io/projected/4f6373bc-f6a3-478f-92f5-8e311a5fd86c-kube-api-access-7qq4s\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rt2j2\" (UID: \"4f6373bc-f6a3-478f-92f5-8e311a5fd86c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rt2j2" Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 14:44:42.839057 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f6373bc-f6a3-478f-92f5-8e311a5fd86c-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rt2j2\" (UID: \"4f6373bc-f6a3-478f-92f5-8e311a5fd86c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rt2j2" Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 14:44:42.839117 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qq4s\" (UniqueName: \"kubernetes.io/projected/4f6373bc-f6a3-478f-92f5-8e311a5fd86c-kube-api-access-7qq4s\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rt2j2\" (UID: \"4f6373bc-f6a3-478f-92f5-8e311a5fd86c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rt2j2" Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 14:44:42.839297 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f6373bc-f6a3-478f-92f5-8e311a5fd86c-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rt2j2\" (UID: \"4f6373bc-f6a3-478f-92f5-8e311a5fd86c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rt2j2" Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 14:44:42.845451 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f6373bc-f6a3-478f-92f5-8e311a5fd86c-ssh-key\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-rt2j2\" (UID: \"4f6373bc-f6a3-478f-92f5-8e311a5fd86c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rt2j2" Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 14:44:42.850245 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f6373bc-f6a3-478f-92f5-8e311a5fd86c-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rt2j2\" (UID: \"4f6373bc-f6a3-478f-92f5-8e311a5fd86c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rt2j2" Dec 03 14:44:42 crc kubenswrapper[4751]: I1203 14:44:42.858715 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qq4s\" (UniqueName: \"kubernetes.io/projected/4f6373bc-f6a3-478f-92f5-8e311a5fd86c-kube-api-access-7qq4s\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rt2j2\" (UID: \"4f6373bc-f6a3-478f-92f5-8e311a5fd86c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rt2j2" Dec 03 14:44:43 crc kubenswrapper[4751]: I1203 14:44:43.026317 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rt2j2" Dec 03 14:44:43 crc kubenswrapper[4751]: I1203 14:44:43.546594 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rt2j2"] Dec 03 14:44:43 crc kubenswrapper[4751]: I1203 14:44:43.641168 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rt2j2" event={"ID":"4f6373bc-f6a3-478f-92f5-8e311a5fd86c","Type":"ContainerStarted","Data":"5600fded04271968ecff2596eb78ad0dc779a5aa832cd70e04c310117fa9386c"} Dec 03 14:44:44 crc kubenswrapper[4751]: I1203 14:44:44.654853 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rt2j2" event={"ID":"4f6373bc-f6a3-478f-92f5-8e311a5fd86c","Type":"ContainerStarted","Data":"6d9abfc81623affaee19bb54ee6754c41a9cd329a64c93f8af8240ecf97ced25"} Dec 03 14:44:44 crc kubenswrapper[4751]: I1203 14:44:44.671306 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rt2j2" podStartSLOduration=2.223016964 podStartE2EDuration="2.671285812s" podCreationTimestamp="2025-12-03 14:44:42 +0000 UTC" firstStartedPulling="2025-12-03 14:44:43.54866512 +0000 UTC m=+1890.537020337" lastFinishedPulling="2025-12-03 14:44:43.996933968 +0000 UTC m=+1890.985289185" observedRunningTime="2025-12-03 14:44:44.669471324 +0000 UTC m=+1891.657826541" watchObservedRunningTime="2025-12-03 14:44:44.671285812 +0000 UTC m=+1891.659641029" Dec 03 14:44:50 crc kubenswrapper[4751]: I1203 14:44:50.913418 4751 patch_prober.go:28] interesting pod/route-controller-manager-598f7f77f5-t6xgq container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection 
(Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 14:44:50 crc kubenswrapper[4751]: I1203 14:44:50.914040 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-598f7f77f5-t6xgq" podUID="5ec17178-9313-4102-9f0e-4697b5403499" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 14:44:53 crc kubenswrapper[4751]: I1203 14:44:53.323177 4751 scope.go:117] "RemoveContainer" containerID="7d4fc1d83b8c3695ce824cacf39e20506d412e0d667bfed74287b81a6f8f3e45" Dec 03 14:44:53 crc kubenswrapper[4751]: E1203 14:44:53.323632 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:44:54 crc kubenswrapper[4751]: I1203 14:44:54.449556 4751 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-sq9k5 container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.63:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 14:44:54 crc kubenswrapper[4751]: I1203 14:44:54.449703 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-sq9k5" podUID="224b9e4a-5a71-4559-84b6-9599c2dfd321" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.63:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 
14:45:04 crc kubenswrapper[4751]: I1203 14:45:04.313586 4751 scope.go:117] "RemoveContainer" containerID="7d4fc1d83b8c3695ce824cacf39e20506d412e0d667bfed74287b81a6f8f3e45" Dec 03 14:45:04 crc kubenswrapper[4751]: E1203 14:45:04.315025 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:45:06 crc kubenswrapper[4751]: I1203 14:45:06.250699 4751 patch_prober.go:28] interesting pod/image-registry-66df7c8f76-b5n6q container/registry namespace/openshift-image-registry: Liveness probe status=failure output="Get \"https://10.217.0.62:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 14:45:06 crc kubenswrapper[4751]: I1203 14:45:06.251063 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-66df7c8f76-b5n6q" podUID="9968046d-e1f9-4644-811b-45e9638d2ed4" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.62:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 14:45:10 crc kubenswrapper[4751]: I1203 14:45:10.954531 4751 patch_prober.go:28] interesting pod/route-controller-manager-598f7f77f5-t6xgq container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 14:45:10 crc kubenswrapper[4751]: I1203 14:45:10.955162 4751 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-598f7f77f5-t6xgq" podUID="5ec17178-9313-4102-9f0e-4697b5403499" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 14:45:19 crc kubenswrapper[4751]: I1203 14:45:19.084352 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="b40b7285-42c6-4278-8d86-69847e549907" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.143:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 14:45:19 crc kubenswrapper[4751]: I1203 14:45:19.084374 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="b40b7285-42c6-4278-8d86-69847e549907" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.143:9090/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 14:45:19 crc kubenswrapper[4751]: I1203 14:45:19.315208 4751 scope.go:117] "RemoveContainer" containerID="7d4fc1d83b8c3695ce824cacf39e20506d412e0d667bfed74287b81a6f8f3e45" Dec 03 14:45:19 crc kubenswrapper[4751]: E1203 14:45:19.315799 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:45:21 crc kubenswrapper[4751]: I1203 14:45:21.270663 4751 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8hhdk" podUID="b5d6b394-fe97-4e70-9916-9c6791379931" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.85:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 14:45:21 crc kubenswrapper[4751]: I1203 14:45:21.281314 4751 scope.go:117] "RemoveContainer" containerID="9b767235631dce9cdc3c316440d216c56ec79940e4fcf58d759f732d0d5f5ddc" Dec 03 14:45:21 crc kubenswrapper[4751]: I1203 14:45:21.308383 4751 scope.go:117] "RemoveContainer" containerID="9111a215363d6793c77b2eae0f23c40c552f7da495991347f3e64e2098908928" Dec 03 14:45:24 crc kubenswrapper[4751]: I1203 14:45:24.085226 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="b40b7285-42c6-4278-8d86-69847e549907" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.143:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 14:45:24 crc kubenswrapper[4751]: I1203 14:45:24.085420 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="b40b7285-42c6-4278-8d86-69847e549907" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.143:9090/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 14:45:24 crc kubenswrapper[4751]: I1203 14:45:24.159709 4751 patch_prober.go:28] interesting pod/console-6fd456cc64-295n5 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 14:45:24 crc kubenswrapper[4751]: I1203 14:45:24.159800 4751 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/console-6fd456cc64-295n5" podUID="1fd4536f-efa6-40a7-b329-cebf60907eb2" containerName="console" probeResult="failure" output="Get \"https://10.217.0.69:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 14:45:27 crc kubenswrapper[4751]: I1203 14:45:27.500847 4751 patch_prober.go:28] interesting pod/oauth-openshift-cb86fb758-675vf container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.56:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 14:45:27 crc kubenswrapper[4751]: I1203 14:45:27.501426 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-cb86fb758-675vf" podUID="11bdb7c7-a551-482b-b0c4-d1dc9daaa5e1" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.56:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 14:45:29 crc kubenswrapper[4751]: I1203 14:45:29.084243 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="b40b7285-42c6-4278-8d86-69847e549907" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.143:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 14:45:29 crc kubenswrapper[4751]: I1203 14:45:29.084397 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="b40b7285-42c6-4278-8d86-69847e549907" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.143:9090/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 14:45:29 crc 
kubenswrapper[4751]: I1203 14:45:29.084601 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 03 14:45:31 crc kubenswrapper[4751]: I1203 14:45:31.313510 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8hhdk" podUID="b5d6b394-fe97-4e70-9916-9c6791379931" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.85:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 14:45:31 crc kubenswrapper[4751]: I1203 14:45:31.313866 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8hhdk" podUID="b5d6b394-fe97-4e70-9916-9c6791379931" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.85:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 14:45:31 crc kubenswrapper[4751]: I1203 14:45:31.674463 4751 patch_prober.go:28] interesting pod/controller-manager-76444f977-mgk4d container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 14:45:31 crc kubenswrapper[4751]: I1203 14:45:31.674537 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-76444f977-mgk4d" podUID="44f65b20-49c0-49cf-8b14-fea827c5a3d9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 14:45:31 crc kubenswrapper[4751]: I1203 14:45:31.674589 4751 patch_prober.go:28] interesting pod/controller-manager-76444f977-mgk4d 
container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 14:45:31 crc kubenswrapper[4751]: I1203 14:45:31.674642 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-76444f977-mgk4d" podUID="44f65b20-49c0-49cf-8b14-fea827c5a3d9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 14:45:32 crc kubenswrapper[4751]: I1203 14:45:32.086005 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="b40b7285-42c6-4278-8d86-69847e549907" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.143:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 14:45:32 crc kubenswrapper[4751]: I1203 14:45:32.314579 4751 scope.go:117] "RemoveContainer" containerID="7d4fc1d83b8c3695ce824cacf39e20506d412e0d667bfed74287b81a6f8f3e45" Dec 03 14:45:32 crc kubenswrapper[4751]: E1203 14:45:32.314898 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:45:33 crc kubenswrapper[4751]: I1203 14:45:33.142260 4751 patch_prober.go:28] interesting pod/console-6fd456cc64-295n5 container/console namespace/openshift-console: Liveness probe 
status=failure output="Get \"https://10.217.0.69:8443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 14:45:33 crc kubenswrapper[4751]: I1203 14:45:33.142561 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/console-6fd456cc64-295n5" podUID="1fd4536f-efa6-40a7-b329-cebf60907eb2" containerName="console" probeResult="failure" output="Get \"https://10.217.0.69:8443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 14:45:33 crc kubenswrapper[4751]: I1203 14:45:33.142617 4751 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:45:33 crc kubenswrapper[4751]: I1203 14:45:33.143435 4751 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="console" containerStatusID={"Type":"cri-o","ID":"f4e60a608c6adb657404cd721cbce353e208b2fdb9c45005ffea4a911b9ac844"} pod="openshift-console/console-6fd456cc64-295n5" containerMessage="Container console failed liveness probe, will be restarted" Dec 03 14:45:34 crc kubenswrapper[4751]: I1203 14:45:34.084664 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="b40b7285-42c6-4278-8d86-69847e549907" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.143:9090/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 14:45:34 crc kubenswrapper[4751]: I1203 14:45:34.143062 4751 patch_prober.go:28] interesting pod/console-6fd456cc64-295n5 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 14:45:34 crc kubenswrapper[4751]: I1203 14:45:34.143181 4751 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openshift-console/console-6fd456cc64-295n5" podUID="1fd4536f-efa6-40a7-b329-cebf60907eb2" containerName="console" probeResult="failure" output="Get \"https://10.217.0.69:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 14:45:35 crc kubenswrapper[4751]: I1203 14:45:35.087133 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="b40b7285-42c6-4278-8d86-69847e549907" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.143:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 14:45:39 crc kubenswrapper[4751]: I1203 14:45:39.084925 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="b40b7285-42c6-4278-8d86-69847e549907" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.143:9090/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 14:45:39 crc kubenswrapper[4751]: I1203 14:45:39.084971 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="b40b7285-42c6-4278-8d86-69847e549907" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.143:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 14:45:41 crc kubenswrapper[4751]: I1203 14:45:41.271543 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8hhdk" podUID="b5d6b394-fe97-4e70-9916-9c6791379931" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.85:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 
14:45:41 crc kubenswrapper[4751]: I1203 14:45:41.272129 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8hhdk" Dec 03 14:45:41 crc kubenswrapper[4751]: I1203 14:45:41.674642 4751 patch_prober.go:28] interesting pod/controller-manager-76444f977-mgk4d container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 14:45:41 crc kubenswrapper[4751]: I1203 14:45:41.674721 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-76444f977-mgk4d" podUID="44f65b20-49c0-49cf-8b14-fea827c5a3d9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 14:45:41 crc kubenswrapper[4751]: I1203 14:45:41.674736 4751 patch_prober.go:28] interesting pod/controller-manager-76444f977-mgk4d container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 14:45:41 crc kubenswrapper[4751]: I1203 14:45:41.674826 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-76444f977-mgk4d" podUID="44f65b20-49c0-49cf-8b14-fea827c5a3d9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 14:45:42 crc kubenswrapper[4751]: I1203 14:45:42.313522 4751 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8hhdk" podUID="b5d6b394-fe97-4e70-9916-9c6791379931" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.85:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 14:45:44 crc kubenswrapper[4751]: I1203 14:45:44.085671 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="b40b7285-42c6-4278-8d86-69847e549907" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.143:9090/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 14:45:44 crc kubenswrapper[4751]: I1203 14:45:44.086636 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="b40b7285-42c6-4278-8d86-69847e549907" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.143:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 14:45:44 crc kubenswrapper[4751]: I1203 14:45:44.086691 4751 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 03 14:45:44 crc kubenswrapper[4751]: I1203 14:45:44.088774 4751 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="prometheus" containerStatusID={"Type":"cri-o","ID":"f38f78270fe29b277531b28b4ba8d29de794b6b97f2810a733e3e23cc9177ec2"} pod="openstack/prometheus-metric-storage-0" containerMessage="Container prometheus failed liveness probe, will be restarted" Dec 03 14:45:44 crc kubenswrapper[4751]: I1203 14:45:44.089025 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="b40b7285-42c6-4278-8d86-69847e549907" 
containerName="prometheus" containerID="cri-o://f38f78270fe29b277531b28b4ba8d29de794b6b97f2810a733e3e23cc9177ec2" gracePeriod=600 Dec 03 14:45:44 crc kubenswrapper[4751]: I1203 14:45:44.142305 4751 patch_prober.go:28] interesting pod/console-6fd456cc64-295n5 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 14:45:44 crc kubenswrapper[4751]: I1203 14:45:44.142508 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-6fd456cc64-295n5" podUID="1fd4536f-efa6-40a7-b329-cebf60907eb2" containerName="console" probeResult="failure" output="Get \"https://10.217.0.69:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 14:45:44 crc kubenswrapper[4751]: I1203 14:45:44.142670 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:45:44 crc kubenswrapper[4751]: I1203 14:45:44.314208 4751 scope.go:117] "RemoveContainer" containerID="7d4fc1d83b8c3695ce824cacf39e20506d412e0d667bfed74287b81a6f8f3e45" Dec 03 14:45:44 crc kubenswrapper[4751]: E1203 14:45:44.314528 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:45:47 crc kubenswrapper[4751]: I1203 14:45:47.088368 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="b40b7285-42c6-4278-8d86-69847e549907" 
containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.143:9090/-/ready\": context deadline exceeded" Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.089649 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="b40b7285-42c6-4278-8d86-69847e549907" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.143:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.501595 4751 scope.go:117] "RemoveContainer" containerID="0a516c77bbdaaccc84431fc46ea6dc19618d923a60179baa3614c3f3c8a2d408" Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.505757 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8hhdk" Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.616384 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-96fc-account-create-update-4nd9x"] Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.633602 4751 scope.go:117] "RemoveContainer" containerID="d68cad0cf59293845aa503e209787d751c7f3e42a159b872c3be012db60052e6" Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.643382 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-96fc-account-create-update-4nd9x"] Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.666079 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-create-6n49f"] Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.718246 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-create-6n49f"] Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.740982 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412885-mhqpt"] Dec 03 
14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.742556 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-mhqpt" Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.744966 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.745240 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.759254 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-xtvzh"] Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.768712 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgb8k\" (UniqueName: \"kubernetes.io/projected/c4277c05-1792-41f8-af0f-3403799bb1e5-kube-api-access-kgb8k\") pod \"collect-profiles-29412885-mhqpt\" (UID: \"c4277c05-1792-41f8-af0f-3403799bb1e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-mhqpt" Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.768885 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c4277c05-1792-41f8-af0f-3403799bb1e5-config-volume\") pod \"collect-profiles-29412885-mhqpt\" (UID: \"c4277c05-1792-41f8-af0f-3403799bb1e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-mhqpt" Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.769009 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c4277c05-1792-41f8-af0f-3403799bb1e5-secret-volume\") pod \"collect-profiles-29412885-mhqpt\" (UID: 
\"c4277c05-1792-41f8-af0f-3403799bb1e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-mhqpt" Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.784506 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-wbt2f"] Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.790967 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-xtvzh"] Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.802267 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-wbt2f"] Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.811119 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-bd1d-account-create-update-b5krp"] Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.821581 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412885-mhqpt"] Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.830431 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-ea6b-account-create-update-rslqt"] Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.839645 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-5j7m7"] Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.848393 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-bd1d-account-create-update-b5krp"] Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.857945 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-f6af-account-create-update-zqlks"] Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.866071 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-5j7m7"] Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.871233 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/c4277c05-1792-41f8-af0f-3403799bb1e5-config-volume\") pod \"collect-profiles-29412885-mhqpt\" (UID: \"c4277c05-1792-41f8-af0f-3403799bb1e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-mhqpt" Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.872892 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgb8k\" (UniqueName: \"kubernetes.io/projected/c4277c05-1792-41f8-af0f-3403799bb1e5-kube-api-access-kgb8k\") pod \"collect-profiles-29412885-mhqpt\" (UID: \"c4277c05-1792-41f8-af0f-3403799bb1e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-mhqpt" Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.873317 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c4277c05-1792-41f8-af0f-3403799bb1e5-secret-volume\") pod \"collect-profiles-29412885-mhqpt\" (UID: \"c4277c05-1792-41f8-af0f-3403799bb1e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-mhqpt" Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.872622 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c4277c05-1792-41f8-af0f-3403799bb1e5-config-volume\") pod \"collect-profiles-29412885-mhqpt\" (UID: \"c4277c05-1792-41f8-af0f-3403799bb1e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-mhqpt" Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.875549 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-f6af-account-create-update-zqlks"] Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.880355 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c4277c05-1792-41f8-af0f-3403799bb1e5-secret-volume\") pod \"collect-profiles-29412885-mhqpt\" (UID: 
\"c4277c05-1792-41f8-af0f-3403799bb1e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-mhqpt" Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.887491 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-g4glx"] Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.889997 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgb8k\" (UniqueName: \"kubernetes.io/projected/c4277c05-1792-41f8-af0f-3403799bb1e5-kube-api-access-kgb8k\") pod \"collect-profiles-29412885-mhqpt\" (UID: \"c4277c05-1792-41f8-af0f-3403799bb1e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-mhqpt" Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.896933 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-cgd8h"] Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.910299 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-ea6b-account-create-update-rslqt"] Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.922086 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-cgd8h"] Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.933494 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-hmq8c"] Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.943948 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-g4glx"] Dec 03 14:45:50 crc kubenswrapper[4751]: I1203 14:45:50.956583 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-hmq8c"] Dec 03 14:45:51 crc kubenswrapper[4751]: I1203 14:45:51.068154 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-mhqpt" Dec 03 14:45:51 crc kubenswrapper[4751]: I1203 14:45:51.090493 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="b40b7285-42c6-4278-8d86-69847e549907" containerName="prometheus" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 03 14:45:51 crc kubenswrapper[4751]: I1203 14:45:51.343356 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ce7be68-ea3a-4d2f-b875-9db739a50a8e" path="/var/lib/kubelet/pods/0ce7be68-ea3a-4d2f-b875-9db739a50a8e/volumes" Dec 03 14:45:51 crc kubenswrapper[4751]: I1203 14:45:51.352441 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="124c2c16-935c-4bfc-9b53-11ffa90ed441" path="/var/lib/kubelet/pods/124c2c16-935c-4bfc-9b53-11ffa90ed441/volumes" Dec 03 14:45:51 crc kubenswrapper[4751]: I1203 14:45:51.354174 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44d1508a-da0e-46be-9f1d-583e1be8d864" path="/var/lib/kubelet/pods/44d1508a-da0e-46be-9f1d-583e1be8d864/volumes" Dec 03 14:45:51 crc kubenswrapper[4751]: I1203 14:45:51.355142 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ea81b69-95de-4772-b7bf-d48f52c298b1" path="/var/lib/kubelet/pods/5ea81b69-95de-4772-b7bf-d48f52c298b1/volumes" Dec 03 14:45:51 crc kubenswrapper[4751]: I1203 14:45:51.356054 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2fa3dba-8033-4b02-b75b-6052face2364" path="/var/lib/kubelet/pods/a2fa3dba-8033-4b02-b75b-6052face2364/volumes" Dec 03 14:45:52 crc kubenswrapper[4751]: I1203 14:45:52.064462 4751 patch_prober.go:28] interesting pod/router-default-5444994796-qjtlv container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" 
start-of-body= Dec 03 14:45:52 crc kubenswrapper[4751]: I1203 14:45:52.064538 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-qjtlv" podUID="c4d1b134-55b3-4b2e-92da-b8c5416c13a5" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 14:45:52 crc kubenswrapper[4751]: I1203 14:45:52.755002 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Dec 03 14:45:56 crc kubenswrapper[4751]: I1203 14:45:56.091038 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="b40b7285-42c6-4278-8d86-69847e549907" containerName="prometheus" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 03 14:45:57 crc kubenswrapper[4751]: I1203 14:45:57.751655 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Dec 03 14:45:58 crc kubenswrapper[4751]: I1203 14:45:58.191968 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-6fd456cc64-295n5" podUID="1fd4536f-efa6-40a7-b329-cebf60907eb2" containerName="console" containerID="cri-o://f4e60a608c6adb657404cd721cbce353e208b2fdb9c45005ffea4a911b9ac844" gracePeriod=15 Dec 03 14:46:00 crc kubenswrapper[4751]: I1203 14:46:00.230511 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8hhdk" podUID="b5d6b394-fe97-4e70-9916-9c6791379931" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.85:8081/readyz\": dial tcp 
10.217.0.85:8081: connect: connection refused" Dec 03 14:46:01 crc kubenswrapper[4751]: I1203 14:46:01.090398 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="b40b7285-42c6-4278-8d86-69847e549907" containerName="prometheus" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 03 14:46:01 crc kubenswrapper[4751]: I1203 14:46:01.993612 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/kube-state-metrics-0" podUID="b1ba2fcd-b1a4-42c8-a3a1-f84f3e198de2" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.0.224:8080/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 14:46:02 crc kubenswrapper[4751]: I1203 14:46:02.065599 4751 patch_prober.go:28] interesting pod/router-default-5444994796-qjtlv container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 14:46:02 crc kubenswrapper[4751]: I1203 14:46:02.065666 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-qjtlv" podUID="c4d1b134-55b3-4b2e-92da-b8c5416c13a5" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 14:46:02 crc kubenswrapper[4751]: I1203 14:46:02.751445 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Dec 03 14:46:03 crc kubenswrapper[4751]: I1203 14:46:03.075927 4751 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness 
probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 14:46:03 crc kubenswrapper[4751]: I1203 14:46:03.075998 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 14:46:05 crc kubenswrapper[4751]: I1203 14:46:05.534388 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0" path="/var/lib/kubelet/pods/a3a39d7c-ae64-4afa-8426-5cc5d9fc78e0/volumes" Dec 03 14:46:05 crc kubenswrapper[4751]: I1203 14:46:05.535021 4751 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 21.34498514s: [/var/lib/containers/storage/overlay/289f9565d1d2ddcfc2d8c4e89424e2bbb489001d83d59be9ca30999122db6f81/diff /var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-rt2j2_4f6373bc-f6a3-478f-92f5-8e311a5fd86c/download-cache-edpm-deployment-openstack-edpm-ipam/0.log]; will not log again for this container unless duration exceeds 2s Dec 03 14:46:05 crc kubenswrapper[4751]: I1203 14:46:05.535537 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a65bb3b2-f470-47b3-9031-35a6f9dbc930" path="/var/lib/kubelet/pods/a65bb3b2-f470-47b3-9031-35a6f9dbc930/volumes" Dec 03 14:46:05 crc kubenswrapper[4751]: I1203 14:46:05.536466 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c838d53c-d56e-4cfc-a15a-fb91dccb5dbd" path="/var/lib/kubelet/pods/c838d53c-d56e-4cfc-a15a-fb91dccb5dbd/volumes" Dec 03 14:46:05 crc kubenswrapper[4751]: I1203 14:46:05.557942 4751 scope.go:117] 
"RemoveContainer" containerID="78157396ad3252c0c758b997916424c7bd61d4dfa3fd1d188ba326ec5a4965ec" Dec 03 14:46:05 crc kubenswrapper[4751]: I1203 14:46:05.559766 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cea54721-c31e-48c1-93fd-ba0b9efb5189" path="/var/lib/kubelet/pods/cea54721-c31e-48c1-93fd-ba0b9efb5189/volumes" Dec 03 14:46:05 crc kubenswrapper[4751]: I1203 14:46:05.560408 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc533900-0d24-4473-83fe-7653f335a1a9" path="/var/lib/kubelet/pods/dc533900-0d24-4473-83fe-7653f335a1a9/volumes" Dec 03 14:46:05 crc kubenswrapper[4751]: I1203 14:46:05.560939 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efcae2e5-8f91-40b7-842a-990dd6b13c66" path="/var/lib/kubelet/pods/efcae2e5-8f91-40b7-842a-990dd6b13c66/volumes" Dec 03 14:46:05 crc kubenswrapper[4751]: E1203 14:46:05.561564 4751 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="14.248s" Dec 03 14:46:05 crc kubenswrapper[4751]: I1203 14:46:05.561592 4751 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ceilometer-0" Dec 03 14:46:05 crc kubenswrapper[4751]: I1203 14:46:05.561646 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:46:05 crc kubenswrapper[4751]: I1203 14:46:05.563220 4751 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="ceilometer-central-agent" containerStatusID={"Type":"cri-o","ID":"2d91066c11cbed6564b693a73e7f5a2d9d6cc97d9fe2616162ad8832a58832ed"} pod="openstack/ceilometer-0" containerMessage="Container ceilometer-central-agent failed liveness probe, will be restarted" Dec 03 14:46:05 crc kubenswrapper[4751]: I1203 14:46:05.563349 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991" containerName="ceilometer-central-agent" containerID="cri-o://2d91066c11cbed6564b693a73e7f5a2d9d6cc97d9fe2616162ad8832a58832ed" gracePeriod=30 Dec 03 14:46:05 crc kubenswrapper[4751]: I1203 14:46:05.707422 4751 scope.go:117] "RemoveContainer" containerID="7d4fc1d83b8c3695ce824cacf39e20506d412e0d667bfed74287b81a6f8f3e45" Dec 03 14:46:05 crc kubenswrapper[4751]: E1203 14:46:05.710454 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:46:06 crc kubenswrapper[4751]: I1203 14:46:06.090094 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="b40b7285-42c6-4278-8d86-69847e549907" containerName="prometheus" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 03 14:46:10 crc kubenswrapper[4751]: I1203 14:46:10.230780 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8hhdk" podUID="b5d6b394-fe97-4e70-9916-9c6791379931" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.85:8081/healthz\": dial tcp 10.217.0.85:8081: connect: connection refused" Dec 03 14:46:10 crc kubenswrapper[4751]: I1203 14:46:10.230801 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8hhdk" podUID="b5d6b394-fe97-4e70-9916-9c6791379931" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.85:8081/readyz\": dial tcp 10.217.0.85:8081: connect: connection refused" Dec 03 14:46:11 crc kubenswrapper[4751]: 
I1203 14:46:11.092130 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="b40b7285-42c6-4278-8d86-69847e549907" containerName="prometheus" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 03 14:46:12 crc kubenswrapper[4751]: I1203 14:46:12.065753 4751 patch_prober.go:28] interesting pod/router-default-5444994796-qjtlv container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 14:46:12 crc kubenswrapper[4751]: I1203 14:46:12.065885 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-qjtlv" podUID="c4d1b134-55b3-4b2e-92da-b8c5416c13a5" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 14:46:12 crc kubenswrapper[4751]: I1203 14:46:12.066035 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-qjtlv" Dec 03 14:46:13 crc kubenswrapper[4751]: I1203 14:46:13.107619 4751 patch_prober.go:28] interesting pod/router-default-5444994796-qjtlv container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 14:46:13 crc kubenswrapper[4751]: I1203 14:46:13.108097 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-qjtlv" podUID="c4d1b134-55b3-4b2e-92da-b8c5416c13a5" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 14:46:13 crc 
kubenswrapper[4751]: I1203 14:46:13.142755 4751 patch_prober.go:28] interesting pod/console-6fd456cc64-295n5 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/health\": dial tcp 10.217.0.69:8443: connect: connection refused" start-of-body= Dec 03 14:46:13 crc kubenswrapper[4751]: I1203 14:46:13.142817 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-6fd456cc64-295n5" podUID="1fd4536f-efa6-40a7-b329-cebf60907eb2" containerName="console" probeResult="failure" output="Get \"https://10.217.0.69:8443/health\": dial tcp 10.217.0.69:8443: connect: connection refused" Dec 03 14:46:13 crc kubenswrapper[4751]: I1203 14:46:13.248654 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-75cdb5998d-hbntt" podUID="14960a87-3612-433e-bd1e-b548b0118a2c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.71:8080/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 14:46:16 crc kubenswrapper[4751]: I1203 14:46:16.090850 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="b40b7285-42c6-4278-8d86-69847e549907" containerName="prometheus" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 03 14:46:17 crc kubenswrapper[4751]: I1203 14:46:17.314562 4751 scope.go:117] "RemoveContainer" containerID="7d4fc1d83b8c3695ce824cacf39e20506d412e0d667bfed74287b81a6f8f3e45" Dec 03 14:46:17 crc kubenswrapper[4751]: E1203 14:46:17.314930 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:46:18 crc kubenswrapper[4751]: I1203 14:46:18.345974 4751 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 14:46:18 crc kubenswrapper[4751]: I1203 14:46:18.346478 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 14:46:20 crc kubenswrapper[4751]: I1203 14:46:20.231930 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8hhdk" podUID="b5d6b394-fe97-4e70-9916-9c6791379931" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.85:8081/readyz\": dial tcp 10.217.0.85:8081: connect: connection refused" Dec 03 14:46:20 crc kubenswrapper[4751]: I1203 14:46:20.232270 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8hhdk" Dec 03 14:46:20 crc kubenswrapper[4751]: I1203 14:46:20.233049 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8hhdk" podUID="b5d6b394-fe97-4e70-9916-9c6791379931" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.85:8081/readyz\": dial tcp 10.217.0.85:8081: connect: connection refused" Dec 03 14:46:20 crc kubenswrapper[4751]: I1203 14:46:20.293185 4751 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-pxjk5" podUID="2f31e262-8f03-4689-bc29-5d9d8b33a2cc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.87:8081/readyz\": dial tcp 10.217.0.87:8081: connect: connection refused" Dec 03 14:46:20 crc kubenswrapper[4751]: I1203 14:46:20.703688 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-998648c74-8h62v" podUID="5c3add92-6cee-4980-903f-692cfd4cf87c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.94:8081/readyz\": dial tcp 10.217.0.94:8081: connect: connection refused" Dec 03 14:46:20 crc kubenswrapper[4751]: I1203 14:46:20.981195 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lgjvh" podUID="adff5e75-192d-4a27-a477-aa74dab8dd95" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.100:8081/readyz\": dial tcp 10.217.0.100:8081: connect: connection refused" Dec 03 14:46:21 crc kubenswrapper[4751]: I1203 14:46:21.091216 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="b40b7285-42c6-4278-8d86-69847e549907" containerName="prometheus" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 03 14:46:21 crc kubenswrapper[4751]: I1203 14:46:21.993905 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/kube-state-metrics-0" podUID="b1ba2fcd-b1a4-42c8-a3a1-f84f3e198de2" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.0.224:8080/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 14:46:22 crc kubenswrapper[4751]: I1203 14:46:22.067508 4751 patch_prober.go:28] interesting pod/router-default-5444994796-qjtlv container/router namespace/openshift-ingress: Readiness 
probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 14:46:22 crc kubenswrapper[4751]: I1203 14:46:22.067801 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-qjtlv" podUID="c4d1b134-55b3-4b2e-92da-b8c5416c13a5" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 14:46:22 crc kubenswrapper[4751]: I1203 14:46:22.751375 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991" containerName="ceilometer-notification-agent" probeResult="failure" output="command timed out" Dec 03 14:46:23 crc kubenswrapper[4751]: I1203 14:46:23.142315 4751 patch_prober.go:28] interesting pod/console-6fd456cc64-295n5 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/health\": dial tcp 10.217.0.69:8443: connect: connection refused" start-of-body= Dec 03 14:46:23 crc kubenswrapper[4751]: I1203 14:46:23.142449 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-6fd456cc64-295n5" podUID="1fd4536f-efa6-40a7-b329-cebf60907eb2" containerName="console" probeResult="failure" output="Get \"https://10.217.0.69:8443/health\": dial tcp 10.217.0.69:8443: connect: connection refused" Dec 03 14:46:23 crc kubenswrapper[4751]: I1203 14:46:23.247578 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-75cdb5998d-hbntt" podUID="14960a87-3612-433e-bd1e-b548b0118a2c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.71:8080/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 14:46:23 crc kubenswrapper[4751]: I1203 
14:46:23.347740 4751 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:6443/livez?exclude=etcd\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 14:46:23 crc kubenswrapper[4751]: I1203 14:46:23.347860 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez?exclude=etcd\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 14:46:24 crc kubenswrapper[4751]: I1203 14:46:24.415307 4751 patch_prober.go:28] interesting pod/loki-operator-controller-manager-766794d8b8-zzghr container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.53:8081/readyz\": dial tcp 10.217.0.53:8081: connect: connection refused" start-of-body= Dec 03 14:46:24 crc kubenswrapper[4751]: I1203 14:46:24.415656 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-766794d8b8-zzghr" podUID="b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.53:8081/readyz\": dial tcp 10.217.0.53:8081: connect: connection refused" Dec 03 14:46:24 crc kubenswrapper[4751]: I1203 14:46:24.635045 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mwzp9"] Dec 03 14:46:24 crc kubenswrapper[4751]: I1203 14:46:24.653101 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mwzp9" Dec 03 14:46:24 crc kubenswrapper[4751]: I1203 14:46:24.734095 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mwzp9"] Dec 03 14:46:24 crc kubenswrapper[4751]: I1203 14:46:24.773503 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lfdqd"] Dec 03 14:46:24 crc kubenswrapper[4751]: I1203 14:46:24.776517 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lfdqd" Dec 03 14:46:24 crc kubenswrapper[4751]: I1203 14:46:24.784930 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lfdqd"] Dec 03 14:46:24 crc kubenswrapper[4751]: I1203 14:46:24.827183 4751 scope.go:117] "RemoveContainer" containerID="a611f162d339b901e11e0e03437effbbbe9ed2dce020b7a47ad35124ab7aa5b7" Dec 03 14:46:24 crc kubenswrapper[4751]: I1203 14:46:24.960237 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab718032-f3a1-4c0d-9b9f-d10858e77e05-catalog-content\") pod \"certified-operators-mwzp9\" (UID: \"ab718032-f3a1-4c0d-9b9f-d10858e77e05\") " pod="openshift-marketplace/certified-operators-mwzp9" Dec 03 14:46:24 crc kubenswrapper[4751]: I1203 14:46:24.962140 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttnms\" (UniqueName: \"kubernetes.io/projected/ab718032-f3a1-4c0d-9b9f-d10858e77e05-kube-api-access-ttnms\") pod \"certified-operators-mwzp9\" (UID: \"ab718032-f3a1-4c0d-9b9f-d10858e77e05\") " pod="openshift-marketplace/certified-operators-mwzp9" Dec 03 14:46:24 crc kubenswrapper[4751]: I1203 14:46:24.962242 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/08ff7918-db3f-4f2b-8186-8940f86ea3a1-catalog-content\") pod \"redhat-marketplace-lfdqd\" (UID: \"08ff7918-db3f-4f2b-8186-8940f86ea3a1\") " pod="openshift-marketplace/redhat-marketplace-lfdqd" Dec 03 14:46:24 crc kubenswrapper[4751]: I1203 14:46:24.962389 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pglbd\" (UniqueName: \"kubernetes.io/projected/08ff7918-db3f-4f2b-8186-8940f86ea3a1-kube-api-access-pglbd\") pod \"redhat-marketplace-lfdqd\" (UID: \"08ff7918-db3f-4f2b-8186-8940f86ea3a1\") " pod="openshift-marketplace/redhat-marketplace-lfdqd" Dec 03 14:46:24 crc kubenswrapper[4751]: I1203 14:46:24.962409 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab718032-f3a1-4c0d-9b9f-d10858e77e05-utilities\") pod \"certified-operators-mwzp9\" (UID: \"ab718032-f3a1-4c0d-9b9f-d10858e77e05\") " pod="openshift-marketplace/certified-operators-mwzp9" Dec 03 14:46:24 crc kubenswrapper[4751]: I1203 14:46:24.962441 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08ff7918-db3f-4f2b-8186-8940f86ea3a1-utilities\") pod \"redhat-marketplace-lfdqd\" (UID: \"08ff7918-db3f-4f2b-8186-8940f86ea3a1\") " pod="openshift-marketplace/redhat-marketplace-lfdqd" Dec 03 14:46:25 crc kubenswrapper[4751]: I1203 14:46:25.073791 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pglbd\" (UniqueName: \"kubernetes.io/projected/08ff7918-db3f-4f2b-8186-8940f86ea3a1-kube-api-access-pglbd\") pod \"redhat-marketplace-lfdqd\" (UID: \"08ff7918-db3f-4f2b-8186-8940f86ea3a1\") " pod="openshift-marketplace/redhat-marketplace-lfdqd" Dec 03 14:46:25 crc kubenswrapper[4751]: I1203 14:46:25.074217 4751 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab718032-f3a1-4c0d-9b9f-d10858e77e05-utilities\") pod \"certified-operators-mwzp9\" (UID: \"ab718032-f3a1-4c0d-9b9f-d10858e77e05\") " pod="openshift-marketplace/certified-operators-mwzp9" Dec 03 14:46:25 crc kubenswrapper[4751]: I1203 14:46:25.074316 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08ff7918-db3f-4f2b-8186-8940f86ea3a1-utilities\") pod \"redhat-marketplace-lfdqd\" (UID: \"08ff7918-db3f-4f2b-8186-8940f86ea3a1\") " pod="openshift-marketplace/redhat-marketplace-lfdqd" Dec 03 14:46:25 crc kubenswrapper[4751]: I1203 14:46:25.075027 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab718032-f3a1-4c0d-9b9f-d10858e77e05-catalog-content\") pod \"certified-operators-mwzp9\" (UID: \"ab718032-f3a1-4c0d-9b9f-d10858e77e05\") " pod="openshift-marketplace/certified-operators-mwzp9" Dec 03 14:46:25 crc kubenswrapper[4751]: I1203 14:46:25.075100 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttnms\" (UniqueName: \"kubernetes.io/projected/ab718032-f3a1-4c0d-9b9f-d10858e77e05-kube-api-access-ttnms\") pod \"certified-operators-mwzp9\" (UID: \"ab718032-f3a1-4c0d-9b9f-d10858e77e05\") " pod="openshift-marketplace/certified-operators-mwzp9" Dec 03 14:46:25 crc kubenswrapper[4751]: I1203 14:46:25.075228 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08ff7918-db3f-4f2b-8186-8940f86ea3a1-utilities\") pod \"redhat-marketplace-lfdqd\" (UID: \"08ff7918-db3f-4f2b-8186-8940f86ea3a1\") " pod="openshift-marketplace/redhat-marketplace-lfdqd" Dec 03 14:46:25 crc kubenswrapper[4751]: I1203 14:46:25.075239 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/08ff7918-db3f-4f2b-8186-8940f86ea3a1-catalog-content\") pod \"redhat-marketplace-lfdqd\" (UID: \"08ff7918-db3f-4f2b-8186-8940f86ea3a1\") " pod="openshift-marketplace/redhat-marketplace-lfdqd" Dec 03 14:46:25 crc kubenswrapper[4751]: I1203 14:46:25.075627 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08ff7918-db3f-4f2b-8186-8940f86ea3a1-catalog-content\") pod \"redhat-marketplace-lfdqd\" (UID: \"08ff7918-db3f-4f2b-8186-8940f86ea3a1\") " pod="openshift-marketplace/redhat-marketplace-lfdqd" Dec 03 14:46:25 crc kubenswrapper[4751]: I1203 14:46:25.076693 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab718032-f3a1-4c0d-9b9f-d10858e77e05-catalog-content\") pod \"certified-operators-mwzp9\" (UID: \"ab718032-f3a1-4c0d-9b9f-d10858e77e05\") " pod="openshift-marketplace/certified-operators-mwzp9" Dec 03 14:46:25 crc kubenswrapper[4751]: I1203 14:46:25.076762 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab718032-f3a1-4c0d-9b9f-d10858e77e05-utilities\") pod \"certified-operators-mwzp9\" (UID: \"ab718032-f3a1-4c0d-9b9f-d10858e77e05\") " pod="openshift-marketplace/certified-operators-mwzp9" Dec 03 14:46:25 crc kubenswrapper[4751]: I1203 14:46:25.115754 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttnms\" (UniqueName: \"kubernetes.io/projected/ab718032-f3a1-4c0d-9b9f-d10858e77e05-kube-api-access-ttnms\") pod \"certified-operators-mwzp9\" (UID: \"ab718032-f3a1-4c0d-9b9f-d10858e77e05\") " pod="openshift-marketplace/certified-operators-mwzp9" Dec 03 14:46:25 crc kubenswrapper[4751]: I1203 14:46:25.126959 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pglbd\" (UniqueName: 
\"kubernetes.io/projected/08ff7918-db3f-4f2b-8186-8940f86ea3a1-kube-api-access-pglbd\") pod \"redhat-marketplace-lfdqd\" (UID: \"08ff7918-db3f-4f2b-8186-8940f86ea3a1\") " pod="openshift-marketplace/redhat-marketplace-lfdqd" Dec 03 14:46:25 crc kubenswrapper[4751]: I1203 14:46:25.208524 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mwzp9" Dec 03 14:46:25 crc kubenswrapper[4751]: I1203 14:46:25.227368 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lfdqd" Dec 03 14:46:25 crc kubenswrapper[4751]: I1203 14:46:25.266688 4751 scope.go:117] "RemoveContainer" containerID="dcf86151c91f3377fdb3af738f8ad26c4ce523b3540c2fd36e1ac37462e19631" Dec 03 14:46:25 crc kubenswrapper[4751]: I1203 14:46:25.412495 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412885-mhqpt"] Dec 03 14:46:25 crc kubenswrapper[4751]: I1203 14:46:25.784909 4751 scope.go:117] "RemoveContainer" containerID="13360b1a7e8e5c7cf3f680c1b3d28fa68e7da42d8b3488ab7040dfc6ff71dffb" Dec 03 14:46:25 crc kubenswrapper[4751]: W1203 14:46:25.795123 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4277c05_1792_41f8_af0f_3403799bb1e5.slice/crio-c4cfb70e7adaed0d8a32fc9644fe5db65935d1f2f8efd8997b4f073fb1ff2cbb WatchSource:0}: Error finding container c4cfb70e7adaed0d8a32fc9644fe5db65935d1f2f8efd8997b4f073fb1ff2cbb: Status 404 returned error can't find the container with id c4cfb70e7adaed0d8a32fc9644fe5db65935d1f2f8efd8997b4f073fb1ff2cbb Dec 03 14:46:25 crc kubenswrapper[4751]: I1203 14:46:25.846569 4751 scope.go:117] "RemoveContainer" containerID="b49be2cd041e8658f56e42545f4ba8069c92543eb86398558c0acbcac2011e7b" Dec 03 14:46:25 crc kubenswrapper[4751]: I1203 14:46:25.911283 4751 scope.go:117] "RemoveContainer" 
containerID="280e5644e04e23f95863b6ce318429e83b3341651ad057c40c3a32e7d3dcb109" Dec 03 14:46:25 crc kubenswrapper[4751]: I1203 14:46:25.984927 4751 scope.go:117] "RemoveContainer" containerID="3b38dca0d16b04e991c81cc6650ca0295fc09d83725385feb137aaaebcaece60" Dec 03 14:46:26 crc kubenswrapper[4751]: I1203 14:46:26.085051 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="b40b7285-42c6-4278-8d86-69847e549907" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.143:9090/-/ready\": dial tcp 10.217.0.143:9090: connect: connection refused" Dec 03 14:46:26 crc kubenswrapper[4751]: I1203 14:46:26.149174 4751 scope.go:117] "RemoveContainer" containerID="f7bc41aba9ec034f2c6b6385ae9eeb823d3d608952ed3bab1df3abce42b7c97a" Dec 03 14:46:26 crc kubenswrapper[4751]: I1203 14:46:26.280028 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mwzp9"] Dec 03 14:46:26 crc kubenswrapper[4751]: W1203 14:46:26.304698 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab718032_f3a1_4c0d_9b9f_d10858e77e05.slice/crio-486ee417dca647c6a7e52234ef336eef5f479ff9633b4a5957ea54475264a10c WatchSource:0}: Error finding container 486ee417dca647c6a7e52234ef336eef5f479ff9633b4a5957ea54475264a10c: Status 404 returned error can't find the container with id 486ee417dca647c6a7e52234ef336eef5f479ff9633b4a5957ea54475264a10c Dec 03 14:46:26 crc kubenswrapper[4751]: I1203 14:46:26.394385 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b" podUID="17b09c23-21ca-4060-840d-acbf71e22d55" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.95:8081/readyz\": dial tcp 10.217.0.95:8081: connect: connection refused" Dec 03 14:46:26 crc kubenswrapper[4751]: I1203 14:46:26.394668 4751 
prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b" podUID="17b09c23-21ca-4060-840d-acbf71e22d55" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.95:8081/healthz\": dial tcp 10.217.0.95:8081: connect: connection refused" Dec 03 14:46:26 crc kubenswrapper[4751]: I1203 14:46:26.428700 4751 generic.go:334] "Generic (PLEG): container finished" podID="b5d6b394-fe97-4e70-9916-9c6791379931" containerID="0e5d451ed6f19cc61897d3f7bb18ffb4c079c4cf8f3f43c29e185da3fc567834" exitCode=1 Dec 03 14:46:26 crc kubenswrapper[4751]: I1203 14:46:26.428780 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8hhdk" event={"ID":"b5d6b394-fe97-4e70-9916-9c6791379931","Type":"ContainerDied","Data":"0e5d451ed6f19cc61897d3f7bb18ffb4c079c4cf8f3f43c29e185da3fc567834"} Dec 03 14:46:26 crc kubenswrapper[4751]: I1203 14:46:26.429773 4751 scope.go:117] "RemoveContainer" containerID="0e5d451ed6f19cc61897d3f7bb18ffb4c079c4cf8f3f43c29e185da3fc567834" Dec 03 14:46:26 crc kubenswrapper[4751]: I1203 14:46:26.433098 4751 generic.go:334] "Generic (PLEG): container finished" podID="14960a87-3612-433e-bd1e-b548b0118a2c" containerID="f251c927cf63330a65f1869f0502f8232b6018f84a656d04fe2d0ed9983cacff" exitCode=1 Dec 03 14:46:26 crc kubenswrapper[4751]: I1203 14:46:26.433189 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-75cdb5998d-hbntt" event={"ID":"14960a87-3612-433e-bd1e-b548b0118a2c","Type":"ContainerDied","Data":"f251c927cf63330a65f1869f0502f8232b6018f84a656d04fe2d0ed9983cacff"} Dec 03 14:46:26 crc kubenswrapper[4751]: I1203 14:46:26.434784 4751 scope.go:117] "RemoveContainer" containerID="f251c927cf63330a65f1869f0502f8232b6018f84a656d04fe2d0ed9983cacff" Dec 03 14:46:26 crc kubenswrapper[4751]: I1203 14:46:26.436393 4751 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-mwzp9" event={"ID":"ab718032-f3a1-4c0d-9b9f-d10858e77e05","Type":"ContainerStarted","Data":"486ee417dca647c6a7e52234ef336eef5f479ff9633b4a5957ea54475264a10c"} Dec 03 14:46:26 crc kubenswrapper[4751]: I1203 14:46:26.437992 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lfdqd"] Dec 03 14:46:26 crc kubenswrapper[4751]: I1203 14:46:26.441886 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-mhqpt" event={"ID":"c4277c05-1792-41f8-af0f-3403799bb1e5","Type":"ContainerStarted","Data":"c4cfb70e7adaed0d8a32fc9644fe5db65935d1f2f8efd8997b4f073fb1ff2cbb"} Dec 03 14:46:26 crc kubenswrapper[4751]: I1203 14:46:26.452569 4751 scope.go:117] "RemoveContainer" containerID="73ce6034792354e94f3dcd543255d0517330e1e13b4632ba3ce2b3f57fd1ca07" Dec 03 14:46:26 crc kubenswrapper[4751]: I1203 14:46:26.453899 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6fd456cc64-295n5_1fd4536f-efa6-40a7-b329-cebf60907eb2/console/0.log" Dec 03 14:46:26 crc kubenswrapper[4751]: I1203 14:46:26.453948 4751 generic.go:334] "Generic (PLEG): container finished" podID="1fd4536f-efa6-40a7-b329-cebf60907eb2" containerID="f4e60a608c6adb657404cd721cbce353e208b2fdb9c45005ffea4a911b9ac844" exitCode=2 Dec 03 14:46:26 crc kubenswrapper[4751]: I1203 14:46:26.454050 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fd456cc64-295n5" event={"ID":"1fd4536f-efa6-40a7-b329-cebf60907eb2","Type":"ContainerDied","Data":"f4e60a608c6adb657404cd721cbce353e208b2fdb9c45005ffea4a911b9ac844"} Dec 03 14:46:26 crc kubenswrapper[4751]: I1203 14:46:26.472098 4751 generic.go:334] "Generic (PLEG): container finished" podID="b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238" containerID="f6815262169e14760e296fcb3eaf51453f58179934c5146aaa334d4deb685928" exitCode=1 Dec 03 14:46:26 crc 
kubenswrapper[4751]: I1203 14:46:26.472146 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-766794d8b8-zzghr" event={"ID":"b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238","Type":"ContainerDied","Data":"f6815262169e14760e296fcb3eaf51453f58179934c5146aaa334d4deb685928"} Dec 03 14:46:26 crc kubenswrapper[4751]: I1203 14:46:26.472936 4751 scope.go:117] "RemoveContainer" containerID="f6815262169e14760e296fcb3eaf51453f58179934c5146aaa334d4deb685928" Dec 03 14:46:26 crc kubenswrapper[4751]: I1203 14:46:26.481179 4751 scope.go:117] "RemoveContainer" containerID="38ad4756d3448098303991c1251eca0f74cefb44f1e69fbf6ae5e87305ba54d1" Dec 03 14:46:26 crc kubenswrapper[4751]: I1203 14:46:26.518990 4751 scope.go:117] "RemoveContainer" containerID="71909259f7b71b1781f3d3ddb0389c5602b4634d4958fbae473cb1605e015204" Dec 03 14:46:26 crc kubenswrapper[4751]: I1203 14:46:26.585039 4751 scope.go:117] "RemoveContainer" containerID="426def791113e7f82f15c9cbf047c7732046965a4b51fe192603a2521bfc9503" Dec 03 14:46:27 crc kubenswrapper[4751]: I1203 14:46:27.494936 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-mhqpt" event={"ID":"c4277c05-1792-41f8-af0f-3403799bb1e5","Type":"ContainerStarted","Data":"d020baba10dfc88815a4f4ead45aec684e30d56934bac92135269776acd1226f"} Dec 03 14:46:27 crc kubenswrapper[4751]: I1203 14:46:27.543650 4751 generic.go:334] "Generic (PLEG): container finished" podID="2f31e262-8f03-4689-bc29-5d9d8b33a2cc" containerID="6ca05e111868ddd162fb73ecf1ab34dde71e530e27363ba2c3c20ffa9e1d8aa6" exitCode=1 Dec 03 14:46:27 crc kubenswrapper[4751]: I1203 14:46:27.543780 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-pxjk5" event={"ID":"2f31e262-8f03-4689-bc29-5d9d8b33a2cc","Type":"ContainerDied","Data":"6ca05e111868ddd162fb73ecf1ab34dde71e530e27363ba2c3c20ffa9e1d8aa6"} Dec 03 
14:46:27 crc kubenswrapper[4751]: I1203 14:46:27.544770 4751 scope.go:117] "RemoveContainer" containerID="6ca05e111868ddd162fb73ecf1ab34dde71e530e27363ba2c3c20ffa9e1d8aa6" Dec 03 14:46:27 crc kubenswrapper[4751]: I1203 14:46:27.579509 4751 generic.go:334] "Generic (PLEG): container finished" podID="5c3add92-6cee-4980-903f-692cfd4cf87c" containerID="a45d50d561a39de70071005f45b1c58232f7854f0905bcb7901b8bf2b3552b02" exitCode=1 Dec 03 14:46:27 crc kubenswrapper[4751]: I1203 14:46:27.579599 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-8h62v" event={"ID":"5c3add92-6cee-4980-903f-692cfd4cf87c","Type":"ContainerDied","Data":"a45d50d561a39de70071005f45b1c58232f7854f0905bcb7901b8bf2b3552b02"} Dec 03 14:46:27 crc kubenswrapper[4751]: I1203 14:46:27.580245 4751 scope.go:117] "RemoveContainer" containerID="a45d50d561a39de70071005f45b1c58232f7854f0905bcb7901b8bf2b3552b02" Dec 03 14:46:27 crc kubenswrapper[4751]: I1203 14:46:27.605115 4751 generic.go:334] "Generic (PLEG): container finished" podID="adff5e75-192d-4a27-a477-aa74dab8dd95" containerID="fe8b7049ab456064b1692afa14ec9956d6d38613968c83901c087ceeff25407e" exitCode=1 Dec 03 14:46:27 crc kubenswrapper[4751]: I1203 14:46:27.605210 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lgjvh" event={"ID":"adff5e75-192d-4a27-a477-aa74dab8dd95","Type":"ContainerDied","Data":"fe8b7049ab456064b1692afa14ec9956d6d38613968c83901c087ceeff25407e"} Dec 03 14:46:27 crc kubenswrapper[4751]: I1203 14:46:27.605946 4751 scope.go:117] "RemoveContainer" containerID="fe8b7049ab456064b1692afa14ec9956d6d38613968c83901c087ceeff25407e" Dec 03 14:46:27 crc kubenswrapper[4751]: I1203 14:46:27.653728 4751 generic.go:334] "Generic (PLEG): container finished" podID="b40b7285-42c6-4278-8d86-69847e549907" containerID="f38f78270fe29b277531b28b4ba8d29de794b6b97f2810a733e3e23cc9177ec2" exitCode=0 Dec 03 
14:46:27 crc kubenswrapper[4751]: I1203 14:46:27.653852 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b40b7285-42c6-4278-8d86-69847e549907","Type":"ContainerDied","Data":"f38f78270fe29b277531b28b4ba8d29de794b6b97f2810a733e3e23cc9177ec2"} Dec 03 14:46:27 crc kubenswrapper[4751]: I1203 14:46:27.686589 4751 generic.go:334] "Generic (PLEG): container finished" podID="17b09c23-21ca-4060-840d-acbf71e22d55" containerID="d1e026c62840b64d58003f2b576df3b6902ecbfa981988e00f836f33f3f4dd1f" exitCode=1 Dec 03 14:46:27 crc kubenswrapper[4751]: I1203 14:46:27.686709 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b" event={"ID":"17b09c23-21ca-4060-840d-acbf71e22d55","Type":"ContainerDied","Data":"d1e026c62840b64d58003f2b576df3b6902ecbfa981988e00f836f33f3f4dd1f"} Dec 03 14:46:27 crc kubenswrapper[4751]: I1203 14:46:27.687497 4751 scope.go:117] "RemoveContainer" containerID="d1e026c62840b64d58003f2b576df3b6902ecbfa981988e00f836f33f3f4dd1f" Dec 03 14:46:27 crc kubenswrapper[4751]: I1203 14:46:27.690693 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lfdqd" event={"ID":"08ff7918-db3f-4f2b-8186-8940f86ea3a1","Type":"ContainerStarted","Data":"94322d839dd041fc16e7fb3b9f5b6adf870fe04c0d6fab0683f7a2e36686594b"} Dec 03 14:46:27 crc kubenswrapper[4751]: I1203 14:46:27.701945 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/2.log" Dec 03 14:46:27 crc kubenswrapper[4751]: I1203 14:46:27.713809 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 03 14:46:27 crc kubenswrapper[4751]: I1203 14:46:27.727619 4751 
generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="77d49b793a0bbcc88ec0d68b879c40f84c85f3443b31de826bffb957126e3419" exitCode=1 Dec 03 14:46:27 crc kubenswrapper[4751]: I1203 14:46:27.727667 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"77d49b793a0bbcc88ec0d68b879c40f84c85f3443b31de826bffb957126e3419"} Dec 03 14:46:27 crc kubenswrapper[4751]: I1203 14:46:27.727699 4751 scope.go:117] "RemoveContainer" containerID="b2c72f448233567212d225809e4e369559a32cc82f42a0597f815c654c956fc6" Dec 03 14:46:27 crc kubenswrapper[4751]: I1203 14:46:27.728404 4751 scope.go:117] "RemoveContainer" containerID="77d49b793a0bbcc88ec0d68b879c40f84c85f3443b31de826bffb957126e3419" Dec 03 14:46:28 crc kubenswrapper[4751]: I1203 14:46:28.742031 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-75cdb5998d-hbntt" event={"ID":"14960a87-3612-433e-bd1e-b548b0118a2c","Type":"ContainerStarted","Data":"71f3d17fa91a3afec878633f3c8d7551085988bbbe0d0d7cdccdad35b242826f"} Dec 03 14:46:28 crc kubenswrapper[4751]: I1203 14:46:28.743002 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-75cdb5998d-hbntt" Dec 03 14:46:28 crc kubenswrapper[4751]: I1203 14:46:28.747230 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b" event={"ID":"17b09c23-21ca-4060-840d-acbf71e22d55","Type":"ContainerStarted","Data":"29d72e2531b5bb4845657592787e8581e04fd226dbb8b1f01e7100e89045e21a"} Dec 03 14:46:28 crc kubenswrapper[4751]: I1203 14:46:28.748250 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b" Dec 03 14:46:28 crc kubenswrapper[4751]: I1203 14:46:28.751930 4751 generic.go:334] "Generic (PLEG): container finished" podID="c4277c05-1792-41f8-af0f-3403799bb1e5" containerID="d020baba10dfc88815a4f4ead45aec684e30d56934bac92135269776acd1226f" exitCode=0 Dec 03 14:46:28 crc kubenswrapper[4751]: I1203 14:46:28.752110 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-mhqpt" event={"ID":"c4277c05-1792-41f8-af0f-3403799bb1e5","Type":"ContainerDied","Data":"d020baba10dfc88815a4f4ead45aec684e30d56934bac92135269776acd1226f"} Dec 03 14:46:28 crc kubenswrapper[4751]: I1203 14:46:28.754131 4751 generic.go:334] "Generic (PLEG): container finished" podID="08ff7918-db3f-4f2b-8186-8940f86ea3a1" containerID="07acb99af9b13a91535c132272f9d134e6869bf725624b89c30b1fb347ddde81" exitCode=0 Dec 03 14:46:28 crc kubenswrapper[4751]: I1203 14:46:28.754181 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lfdqd" event={"ID":"08ff7918-db3f-4f2b-8186-8940f86ea3a1","Type":"ContainerDied","Data":"07acb99af9b13a91535c132272f9d134e6869bf725624b89c30b1fb347ddde81"} Dec 03 14:46:28 crc kubenswrapper[4751]: I1203 14:46:28.756411 4751 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 14:46:28 crc kubenswrapper[4751]: I1203 14:46:28.765447 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b40b7285-42c6-4278-8d86-69847e549907","Type":"ContainerStarted","Data":"9ec8bc8c0f3a69e32286a9846a37def09c8051160f88468f18283248aaab2ef3"} Dec 03 14:46:28 crc kubenswrapper[4751]: I1203 14:46:28.769544 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6fd456cc64-295n5_1fd4536f-efa6-40a7-b329-cebf60907eb2/console/0.log" Dec 03 14:46:28 crc 
kubenswrapper[4751]: I1203 14:46:28.769614 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fd456cc64-295n5" event={"ID":"1fd4536f-efa6-40a7-b329-cebf60907eb2","Type":"ContainerStarted","Data":"c05a02cfa91a83728edad14e00729055560756f95fc0b209a492bf42b3e39fd5"} Dec 03 14:46:28 crc kubenswrapper[4751]: I1203 14:46:28.778130 4751 generic.go:334] "Generic (PLEG): container finished" podID="ab718032-f3a1-4c0d-9b9f-d10858e77e05" containerID="768d30658d28279686c9bf91c22e3cde48b2036dcd8afcd1cb5338d4b8bc594e" exitCode=0 Dec 03 14:46:28 crc kubenswrapper[4751]: I1203 14:46:28.778242 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwzp9" event={"ID":"ab718032-f3a1-4c0d-9b9f-d10858e77e05","Type":"ContainerDied","Data":"768d30658d28279686c9bf91c22e3cde48b2036dcd8afcd1cb5338d4b8bc594e"} Dec 03 14:46:28 crc kubenswrapper[4751]: I1203 14:46:28.798034 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-8h62v" event={"ID":"5c3add92-6cee-4980-903f-692cfd4cf87c","Type":"ContainerStarted","Data":"946785a81dd6e3a4a945325fc416ff37269a726ca2d248ba62cd36a5511f8786"} Dec 03 14:46:28 crc kubenswrapper[4751]: I1203 14:46:28.799199 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-8h62v" Dec 03 14:46:28 crc kubenswrapper[4751]: I1203 14:46:28.808412 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lgjvh" event={"ID":"adff5e75-192d-4a27-a477-aa74dab8dd95","Type":"ContainerStarted","Data":"feae7826a8d4d797d9ebeed8f14ff8ef51dc253977770968ccecb59a4548e7fc"} Dec 03 14:46:28 crc kubenswrapper[4751]: I1203 14:46:28.809510 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lgjvh" Dec 03 
14:46:28 crc kubenswrapper[4751]: I1203 14:46:28.816669 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-766794d8b8-zzghr" event={"ID":"b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238","Type":"ContainerStarted","Data":"472ff657b6ab1237a8a23d148dbd89b056194eb1602b6e1ab2258262a2cb7c7b"} Dec 03 14:46:28 crc kubenswrapper[4751]: I1203 14:46:28.817766 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-766794d8b8-zzghr" Dec 03 14:46:28 crc kubenswrapper[4751]: I1203 14:46:28.822122 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8hhdk" event={"ID":"b5d6b394-fe97-4e70-9916-9c6791379931","Type":"ContainerStarted","Data":"de5ef23e196ca4d2e86f53c7b1d7c7cef3d184fcc3ddb7d9ce0ee7dd2d21d247"} Dec 03 14:46:28 crc kubenswrapper[4751]: I1203 14:46:28.823834 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8hhdk" Dec 03 14:46:28 crc kubenswrapper[4751]: I1203 14:46:28.838210 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/2.log" Dec 03 14:46:28 crc kubenswrapper[4751]: I1203 14:46:28.844370 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"27e77578a35a949843ebb86b1643632220877c1df820291017042e9bcb5cc516"} Dec 03 14:46:28 crc kubenswrapper[4751]: I1203 14:46:28.847645 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-pxjk5" 
event={"ID":"2f31e262-8f03-4689-bc29-5d9d8b33a2cc","Type":"ContainerStarted","Data":"8c47ad00e7afa00cbce3e726a457e2902ed27eeddf136f67894707189ece5f09"} Dec 03 14:46:28 crc kubenswrapper[4751]: I1203 14:46:28.848514 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-pxjk5" Dec 03 14:46:29 crc kubenswrapper[4751]: I1203 14:46:29.646320 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:46:29 crc kubenswrapper[4751]: I1203 14:46:29.883884 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwzp9" event={"ID":"ab718032-f3a1-4c0d-9b9f-d10858e77e05","Type":"ContainerStarted","Data":"16193709942b32c3234ac7000fe024ff9e53db577f3532da5fea04742ac34ac3"} Dec 03 14:46:30 crc kubenswrapper[4751]: I1203 14:46:30.396927 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-mhqpt" Dec 03 14:46:30 crc kubenswrapper[4751]: I1203 14:46:30.546151 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c4277c05-1792-41f8-af0f-3403799bb1e5-secret-volume\") pod \"c4277c05-1792-41f8-af0f-3403799bb1e5\" (UID: \"c4277c05-1792-41f8-af0f-3403799bb1e5\") " Dec 03 14:46:30 crc kubenswrapper[4751]: I1203 14:46:30.546241 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgb8k\" (UniqueName: \"kubernetes.io/projected/c4277c05-1792-41f8-af0f-3403799bb1e5-kube-api-access-kgb8k\") pod \"c4277c05-1792-41f8-af0f-3403799bb1e5\" (UID: \"c4277c05-1792-41f8-af0f-3403799bb1e5\") " Dec 03 14:46:30 crc kubenswrapper[4751]: I1203 14:46:30.546299 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c4277c05-1792-41f8-af0f-3403799bb1e5-config-volume\") pod \"c4277c05-1792-41f8-af0f-3403799bb1e5\" (UID: \"c4277c05-1792-41f8-af0f-3403799bb1e5\") " Dec 03 14:46:30 crc kubenswrapper[4751]: I1203 14:46:30.547067 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4277c05-1792-41f8-af0f-3403799bb1e5-config-volume" (OuterVolumeSpecName: "config-volume") pod "c4277c05-1792-41f8-af0f-3403799bb1e5" (UID: "c4277c05-1792-41f8-af0f-3403799bb1e5"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:46:30 crc kubenswrapper[4751]: I1203 14:46:30.547616 4751 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c4277c05-1792-41f8-af0f-3403799bb1e5-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 14:46:30 crc kubenswrapper[4751]: I1203 14:46:30.560594 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4277c05-1792-41f8-af0f-3403799bb1e5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c4277c05-1792-41f8-af0f-3403799bb1e5" (UID: "c4277c05-1792-41f8-af0f-3403799bb1e5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:46:30 crc kubenswrapper[4751]: I1203 14:46:30.560644 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4277c05-1792-41f8-af0f-3403799bb1e5-kube-api-access-kgb8k" (OuterVolumeSpecName: "kube-api-access-kgb8k") pod "c4277c05-1792-41f8-af0f-3403799bb1e5" (UID: "c4277c05-1792-41f8-af0f-3403799bb1e5"). InnerVolumeSpecName "kube-api-access-kgb8k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:46:30 crc kubenswrapper[4751]: I1203 14:46:30.649248 4751 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c4277c05-1792-41f8-af0f-3403799bb1e5-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 14:46:30 crc kubenswrapper[4751]: I1203 14:46:30.649304 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgb8k\" (UniqueName: \"kubernetes.io/projected/c4277c05-1792-41f8-af0f-3403799bb1e5-kube-api-access-kgb8k\") on node \"crc\" DevicePath \"\"" Dec 03 14:46:30 crc kubenswrapper[4751]: I1203 14:46:30.897501 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-mhqpt" event={"ID":"c4277c05-1792-41f8-af0f-3403799bb1e5","Type":"ContainerDied","Data":"c4cfb70e7adaed0d8a32fc9644fe5db65935d1f2f8efd8997b4f073fb1ff2cbb"} Dec 03 14:46:30 crc kubenswrapper[4751]: I1203 14:46:30.897555 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4cfb70e7adaed0d8a32fc9644fe5db65935d1f2f8efd8997b4f073fb1ff2cbb" Dec 03 14:46:30 crc kubenswrapper[4751]: I1203 14:46:30.897520 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-mhqpt" Dec 03 14:46:30 crc kubenswrapper[4751]: I1203 14:46:30.902659 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lfdqd" event={"ID":"08ff7918-db3f-4f2b-8186-8940f86ea3a1","Type":"ContainerStarted","Data":"e736bfcf377083d314fe5011e4b9d49f9aa5ee8486db220b9c23495ae4bc390f"} Dec 03 14:46:31 crc kubenswrapper[4751]: I1203 14:46:31.031950 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-qjtlv" Dec 03 14:46:31 crc kubenswrapper[4751]: I1203 14:46:31.083961 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 03 14:46:31 crc kubenswrapper[4751]: I1203 14:46:31.084008 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 03 14:46:31 crc kubenswrapper[4751]: I1203 14:46:31.091448 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 03 14:46:31 crc kubenswrapper[4751]: I1203 14:46:31.314359 4751 scope.go:117] "RemoveContainer" containerID="7d4fc1d83b8c3695ce824cacf39e20506d412e0d667bfed74287b81a6f8f3e45" Dec 03 14:46:31 crc kubenswrapper[4751]: E1203 14:46:31.315048 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:46:31 crc kubenswrapper[4751]: I1203 14:46:31.917450 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/prometheus-metric-storage-0" Dec 03 14:46:32 crc kubenswrapper[4751]: I1203 14:46:32.926875 4751 generic.go:334] "Generic (PLEG): container finished" podID="ab718032-f3a1-4c0d-9b9f-d10858e77e05" containerID="16193709942b32c3234ac7000fe024ff9e53db577f3532da5fea04742ac34ac3" exitCode=0 Dec 03 14:46:32 crc kubenswrapper[4751]: I1203 14:46:32.926949 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwzp9" event={"ID":"ab718032-f3a1-4c0d-9b9f-d10858e77e05","Type":"ContainerDied","Data":"16193709942b32c3234ac7000fe024ff9e53db577f3532da5fea04742ac34ac3"} Dec 03 14:46:32 crc kubenswrapper[4751]: I1203 14:46:32.930038 4751 generic.go:334] "Generic (PLEG): container finished" podID="ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991" containerID="2d91066c11cbed6564b693a73e7f5a2d9d6cc97d9fe2616162ad8832a58832ed" exitCode=0 Dec 03 14:46:32 crc kubenswrapper[4751]: I1203 14:46:32.930095 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991","Type":"ContainerDied","Data":"2d91066c11cbed6564b693a73e7f5a2d9d6cc97d9fe2616162ad8832a58832ed"} Dec 03 14:46:32 crc kubenswrapper[4751]: I1203 14:46:32.932940 4751 generic.go:334] "Generic (PLEG): container finished" podID="08ff7918-db3f-4f2b-8186-8940f86ea3a1" containerID="e736bfcf377083d314fe5011e4b9d49f9aa5ee8486db220b9c23495ae4bc390f" exitCode=0 Dec 03 14:46:32 crc kubenswrapper[4751]: I1203 14:46:32.933209 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lfdqd" event={"ID":"08ff7918-db3f-4f2b-8186-8940f86ea3a1","Type":"ContainerDied","Data":"e736bfcf377083d314fe5011e4b9d49f9aa5ee8486db220b9c23495ae4bc390f"} Dec 03 14:46:33 crc kubenswrapper[4751]: I1203 14:46:33.141238 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:46:33 crc kubenswrapper[4751]: I1203 14:46:33.141583 4751 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:46:33 crc kubenswrapper[4751]: I1203 14:46:33.144736 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:46:33 crc kubenswrapper[4751]: I1203 14:46:33.452416 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-766794d8b8-zzghr" Dec 03 14:46:33 crc kubenswrapper[4751]: I1203 14:46:33.954763 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6fd456cc64-295n5" Dec 03 14:46:34 crc kubenswrapper[4751]: I1203 14:46:34.963278 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwzp9" event={"ID":"ab718032-f3a1-4c0d-9b9f-d10858e77e05","Type":"ContainerStarted","Data":"9a3f8e051728cff61da9e0a6b0a5b2b2ccc570f307086ca777aa11719c4aa57c"} Dec 03 14:46:34 crc kubenswrapper[4751]: I1203 14:46:34.968559 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991","Type":"ContainerStarted","Data":"36bc6b57b58a8794795c3a4e4871f5dd701a0d2a674e71c37e4ebb63aa474050"} Dec 03 14:46:34 crc kubenswrapper[4751]: I1203 14:46:34.973808 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lfdqd" event={"ID":"08ff7918-db3f-4f2b-8186-8940f86ea3a1","Type":"ContainerStarted","Data":"ff496b8d495a90ac8f1b9439bf83dbd2ccfddcfa4747f68c9f77a0d441c548a2"} Dec 03 14:46:34 crc kubenswrapper[4751]: I1203 14:46:34.998450 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mwzp9" podStartSLOduration=6.235330714 podStartE2EDuration="10.998423878s" podCreationTimestamp="2025-12-03 14:46:24 +0000 UTC" 
firstStartedPulling="2025-12-03 14:46:28.780256634 +0000 UTC m=+1995.768611851" lastFinishedPulling="2025-12-03 14:46:33.543349798 +0000 UTC m=+2000.531705015" observedRunningTime="2025-12-03 14:46:34.984704775 +0000 UTC m=+2001.973060012" watchObservedRunningTime="2025-12-03 14:46:34.998423878 +0000 UTC m=+2001.986779095"
Dec 03 14:46:35 crc kubenswrapper[4751]: I1203 14:46:35.034825 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lfdqd" podStartSLOduration=6.145726186 podStartE2EDuration="11.034801409s" podCreationTimestamp="2025-12-03 14:46:24 +0000 UTC" firstStartedPulling="2025-12-03 14:46:28.756012854 +0000 UTC m=+1995.744368071" lastFinishedPulling="2025-12-03 14:46:33.645088077 +0000 UTC m=+2000.633443294" observedRunningTime="2025-12-03 14:46:35.030364112 +0000 UTC m=+2002.018719329" watchObservedRunningTime="2025-12-03 14:46:35.034801409 +0000 UTC m=+2002.023156626"
Dec 03 14:46:35 crc kubenswrapper[4751]: I1203 14:46:35.210178 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mwzp9"
Dec 03 14:46:35 crc kubenswrapper[4751]: I1203 14:46:35.210499 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mwzp9"
Dec 03 14:46:35 crc kubenswrapper[4751]: I1203 14:46:35.228345 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lfdqd"
Dec 03 14:46:35 crc kubenswrapper[4751]: I1203 14:46:35.228528 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lfdqd"
Dec 03 14:46:36 crc kubenswrapper[4751]: I1203 14:46:36.126609 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 14:46:36 crc kubenswrapper[4751]: I1203 14:46:36.132634 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 14:46:36 crc kubenswrapper[4751]: I1203 14:46:36.267522 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-mwzp9" podUID="ab718032-f3a1-4c0d-9b9f-d10858e77e05" containerName="registry-server" probeResult="failure" output=<
Dec 03 14:46:36 crc kubenswrapper[4751]: timeout: failed to connect service ":50051" within 1s
Dec 03 14:46:36 crc kubenswrapper[4751]: >
Dec 03 14:46:36 crc kubenswrapper[4751]: I1203 14:46:36.294227 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-lfdqd" podUID="08ff7918-db3f-4f2b-8186-8940f86ea3a1" containerName="registry-server" probeResult="failure" output=<
Dec 03 14:46:36 crc kubenswrapper[4751]: timeout: failed to connect service ":50051" within 1s
Dec 03 14:46:36 crc kubenswrapper[4751]: >
Dec 03 14:46:36 crc kubenswrapper[4751]: I1203 14:46:36.402986 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b"
Dec 03 14:46:39 crc kubenswrapper[4751]: I1203 14:46:39.650821 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 14:46:40 crc kubenswrapper[4751]: I1203 14:46:40.232787 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8hhdk"
Dec 03 14:46:40 crc kubenswrapper[4751]: I1203 14:46:40.295635 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-pxjk5"
Dec 03 14:46:40 crc kubenswrapper[4751]: I1203 14:46:40.707153 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-8h62v"
Dec 03 14:46:40 crc kubenswrapper[4751]: I1203 14:46:40.984936 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lgjvh"
Dec 03 14:46:44 crc kubenswrapper[4751]: I1203 14:46:44.314570 4751 scope.go:117] "RemoveContainer" containerID="7d4fc1d83b8c3695ce824cacf39e20506d412e0d667bfed74287b81a6f8f3e45"
Dec 03 14:46:45 crc kubenswrapper[4751]: I1203 14:46:45.080043 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerStarted","Data":"9a6c556859f08944ddc255eca28ac397f02ede386d6011e2ad67e3baa1641a38"}
Dec 03 14:46:45 crc kubenswrapper[4751]: I1203 14:46:45.281976 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mwzp9"
Dec 03 14:46:45 crc kubenswrapper[4751]: I1203 14:46:45.302552 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lfdqd"
Dec 03 14:46:45 crc kubenswrapper[4751]: I1203 14:46:45.385889 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mwzp9"
Dec 03 14:46:45 crc kubenswrapper[4751]: I1203 14:46:45.423374 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lfdqd"
Dec 03 14:46:47 crc kubenswrapper[4751]: I1203 14:46:47.547371 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mwzp9"]
Dec 03 14:46:47 crc kubenswrapper[4751]: I1203 14:46:47.548121 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mwzp9" podUID="ab718032-f3a1-4c0d-9b9f-d10858e77e05" containerName="registry-server" containerID="cri-o://9a3f8e051728cff61da9e0a6b0a5b2b2ccc570f307086ca777aa11719c4aa57c" gracePeriod=2
Dec 03 14:46:47 crc kubenswrapper[4751]: I1203 14:46:47.748500 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lfdqd"]
Dec 03 14:46:47 crc kubenswrapper[4751]: I1203 14:46:47.748790 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lfdqd" podUID="08ff7918-db3f-4f2b-8186-8940f86ea3a1" containerName="registry-server" containerID="cri-o://ff496b8d495a90ac8f1b9439bf83dbd2ccfddcfa4747f68c9f77a0d441c548a2" gracePeriod=2
Dec 03 14:46:47 crc kubenswrapper[4751]: I1203 14:46:47.814678 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991" containerName="ceilometer-notification-agent" probeResult="failure" output=<
Dec 03 14:46:47 crc kubenswrapper[4751]: Unkown error: Expecting value: line 1 column 1 (char 0)
Dec 03 14:46:47 crc kubenswrapper[4751]: >
Dec 03 14:46:48 crc kubenswrapper[4751]: I1203 14:46:48.126412 4751 generic.go:334] "Generic (PLEG): container finished" podID="08ff7918-db3f-4f2b-8186-8940f86ea3a1" containerID="ff496b8d495a90ac8f1b9439bf83dbd2ccfddcfa4747f68c9f77a0d441c548a2" exitCode=0
Dec 03 14:46:48 crc kubenswrapper[4751]: I1203 14:46:48.126465 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lfdqd" event={"ID":"08ff7918-db3f-4f2b-8186-8940f86ea3a1","Type":"ContainerDied","Data":"ff496b8d495a90ac8f1b9439bf83dbd2ccfddcfa4747f68c9f77a0d441c548a2"}
Dec 03 14:46:48 crc kubenswrapper[4751]: I1203 14:46:48.170763 4751 generic.go:334] "Generic (PLEG): container finished" podID="ab718032-f3a1-4c0d-9b9f-d10858e77e05" containerID="9a3f8e051728cff61da9e0a6b0a5b2b2ccc570f307086ca777aa11719c4aa57c" exitCode=0
Dec 03 14:46:48 crc kubenswrapper[4751]: I1203 14:46:48.170818 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwzp9" event={"ID":"ab718032-f3a1-4c0d-9b9f-d10858e77e05","Type":"ContainerDied","Data":"9a3f8e051728cff61da9e0a6b0a5b2b2ccc570f307086ca777aa11719c4aa57c"}
Dec 03 14:46:48 crc kubenswrapper[4751]: I1203 14:46:48.413799 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mwzp9"
Dec 03 14:46:48 crc kubenswrapper[4751]: I1203 14:46:48.594787 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab718032-f3a1-4c0d-9b9f-d10858e77e05-catalog-content\") pod \"ab718032-f3a1-4c0d-9b9f-d10858e77e05\" (UID: \"ab718032-f3a1-4c0d-9b9f-d10858e77e05\") "
Dec 03 14:46:48 crc kubenswrapper[4751]: I1203 14:46:48.595161 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab718032-f3a1-4c0d-9b9f-d10858e77e05-utilities\") pod \"ab718032-f3a1-4c0d-9b9f-d10858e77e05\" (UID: \"ab718032-f3a1-4c0d-9b9f-d10858e77e05\") "
Dec 03 14:46:48 crc kubenswrapper[4751]: I1203 14:46:48.595302 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttnms\" (UniqueName: \"kubernetes.io/projected/ab718032-f3a1-4c0d-9b9f-d10858e77e05-kube-api-access-ttnms\") pod \"ab718032-f3a1-4c0d-9b9f-d10858e77e05\" (UID: \"ab718032-f3a1-4c0d-9b9f-d10858e77e05\") "
Dec 03 14:46:48 crc kubenswrapper[4751]: I1203 14:46:48.596069 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab718032-f3a1-4c0d-9b9f-d10858e77e05-utilities" (OuterVolumeSpecName: "utilities") pod "ab718032-f3a1-4c0d-9b9f-d10858e77e05" (UID: "ab718032-f3a1-4c0d-9b9f-d10858e77e05"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 14:46:48 crc kubenswrapper[4751]: I1203 14:46:48.603436 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab718032-f3a1-4c0d-9b9f-d10858e77e05-kube-api-access-ttnms" (OuterVolumeSpecName: "kube-api-access-ttnms") pod "ab718032-f3a1-4c0d-9b9f-d10858e77e05" (UID: "ab718032-f3a1-4c0d-9b9f-d10858e77e05"). InnerVolumeSpecName "kube-api-access-ttnms". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 14:46:48 crc kubenswrapper[4751]: I1203 14:46:48.609801 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lfdqd"
Dec 03 14:46:48 crc kubenswrapper[4751]: I1203 14:46:48.670724 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab718032-f3a1-4c0d-9b9f-d10858e77e05-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ab718032-f3a1-4c0d-9b9f-d10858e77e05" (UID: "ab718032-f3a1-4c0d-9b9f-d10858e77e05"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 14:46:48 crc kubenswrapper[4751]: I1203 14:46:48.698205 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08ff7918-db3f-4f2b-8186-8940f86ea3a1-catalog-content\") pod \"08ff7918-db3f-4f2b-8186-8940f86ea3a1\" (UID: \"08ff7918-db3f-4f2b-8186-8940f86ea3a1\") "
Dec 03 14:46:48 crc kubenswrapper[4751]: I1203 14:46:48.698580 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pglbd\" (UniqueName: \"kubernetes.io/projected/08ff7918-db3f-4f2b-8186-8940f86ea3a1-kube-api-access-pglbd\") pod \"08ff7918-db3f-4f2b-8186-8940f86ea3a1\" (UID: \"08ff7918-db3f-4f2b-8186-8940f86ea3a1\") "
Dec 03 14:46:48 crc kubenswrapper[4751]: I1203 14:46:48.698612 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08ff7918-db3f-4f2b-8186-8940f86ea3a1-utilities\") pod \"08ff7918-db3f-4f2b-8186-8940f86ea3a1\" (UID: \"08ff7918-db3f-4f2b-8186-8940f86ea3a1\") "
Dec 03 14:46:48 crc kubenswrapper[4751]: I1203 14:46:48.699446 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab718032-f3a1-4c0d-9b9f-d10858e77e05-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 14:46:48 crc kubenswrapper[4751]: I1203 14:46:48.699463 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab718032-f3a1-4c0d-9b9f-d10858e77e05-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 14:46:48 crc kubenswrapper[4751]: I1203 14:46:48.699476 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttnms\" (UniqueName: \"kubernetes.io/projected/ab718032-f3a1-4c0d-9b9f-d10858e77e05-kube-api-access-ttnms\") on node \"crc\" DevicePath \"\""
Dec 03 14:46:48 crc kubenswrapper[4751]: I1203 14:46:48.699656 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08ff7918-db3f-4f2b-8186-8940f86ea3a1-utilities" (OuterVolumeSpecName: "utilities") pod "08ff7918-db3f-4f2b-8186-8940f86ea3a1" (UID: "08ff7918-db3f-4f2b-8186-8940f86ea3a1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 14:46:48 crc kubenswrapper[4751]: I1203 14:46:48.707598 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08ff7918-db3f-4f2b-8186-8940f86ea3a1-kube-api-access-pglbd" (OuterVolumeSpecName: "kube-api-access-pglbd") pod "08ff7918-db3f-4f2b-8186-8940f86ea3a1" (UID: "08ff7918-db3f-4f2b-8186-8940f86ea3a1"). InnerVolumeSpecName "kube-api-access-pglbd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 14:46:48 crc kubenswrapper[4751]: I1203 14:46:48.736400 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08ff7918-db3f-4f2b-8186-8940f86ea3a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "08ff7918-db3f-4f2b-8186-8940f86ea3a1" (UID: "08ff7918-db3f-4f2b-8186-8940f86ea3a1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 14:46:48 crc kubenswrapper[4751]: I1203 14:46:48.801486 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08ff7918-db3f-4f2b-8186-8940f86ea3a1-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 14:46:48 crc kubenswrapper[4751]: I1203 14:46:48.801547 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pglbd\" (UniqueName: \"kubernetes.io/projected/08ff7918-db3f-4f2b-8186-8940f86ea3a1-kube-api-access-pglbd\") on node \"crc\" DevicePath \"\""
Dec 03 14:46:48 crc kubenswrapper[4751]: I1203 14:46:48.801566 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08ff7918-db3f-4f2b-8186-8940f86ea3a1-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 14:46:49 crc kubenswrapper[4751]: I1203 14:46:49.182195 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lfdqd"
Dec 03 14:46:49 crc kubenswrapper[4751]: I1203 14:46:49.182218 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lfdqd" event={"ID":"08ff7918-db3f-4f2b-8186-8940f86ea3a1","Type":"ContainerDied","Data":"94322d839dd041fc16e7fb3b9f5b6adf870fe04c0d6fab0683f7a2e36686594b"}
Dec 03 14:46:49 crc kubenswrapper[4751]: I1203 14:46:49.182272 4751 scope.go:117] "RemoveContainer" containerID="ff496b8d495a90ac8f1b9439bf83dbd2ccfddcfa4747f68c9f77a0d441c548a2"
Dec 03 14:46:49 crc kubenswrapper[4751]: I1203 14:46:49.185378 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwzp9" event={"ID":"ab718032-f3a1-4c0d-9b9f-d10858e77e05","Type":"ContainerDied","Data":"486ee417dca647c6a7e52234ef336eef5f479ff9633b4a5957ea54475264a10c"}
Dec 03 14:46:49 crc kubenswrapper[4751]: I1203 14:46:49.185468 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mwzp9"
Dec 03 14:46:49 crc kubenswrapper[4751]: I1203 14:46:49.226961 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lfdqd"]
Dec 03 14:46:49 crc kubenswrapper[4751]: I1203 14:46:49.229022 4751 scope.go:117] "RemoveContainer" containerID="e736bfcf377083d314fe5011e4b9d49f9aa5ee8486db220b9c23495ae4bc390f"
Dec 03 14:46:49 crc kubenswrapper[4751]: I1203 14:46:49.236791 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lfdqd"]
Dec 03 14:46:49 crc kubenswrapper[4751]: I1203 14:46:49.249269 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mwzp9"]
Dec 03 14:46:49 crc kubenswrapper[4751]: I1203 14:46:49.258571 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mwzp9"]
Dec 03 14:46:49 crc kubenswrapper[4751]: I1203 14:46:49.272166 4751 scope.go:117] "RemoveContainer" containerID="07acb99af9b13a91535c132272f9d134e6869bf725624b89c30b1fb347ddde81"
Dec 03 14:46:49 crc kubenswrapper[4751]: I1203 14:46:49.305643 4751 scope.go:117] "RemoveContainer" containerID="9a3f8e051728cff61da9e0a6b0a5b2b2ccc570f307086ca777aa11719c4aa57c"
Dec 03 14:46:49 crc kubenswrapper[4751]: I1203 14:46:49.325842 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08ff7918-db3f-4f2b-8186-8940f86ea3a1" path="/var/lib/kubelet/pods/08ff7918-db3f-4f2b-8186-8940f86ea3a1/volumes"
Dec 03 14:46:49 crc kubenswrapper[4751]: I1203 14:46:49.326792 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab718032-f3a1-4c0d-9b9f-d10858e77e05" path="/var/lib/kubelet/pods/ab718032-f3a1-4c0d-9b9f-d10858e77e05/volumes"
Dec 03 14:46:49 crc kubenswrapper[4751]: I1203 14:46:49.359160 4751 scope.go:117] "RemoveContainer" containerID="16193709942b32c3234ac7000fe024ff9e53db577f3532da5fea04742ac34ac3"
Dec 03 14:46:49 crc kubenswrapper[4751]: I1203 14:46:49.392791 4751 scope.go:117] "RemoveContainer" containerID="768d30658d28279686c9bf91c22e3cde48b2036dcd8afcd1cb5338d4b8bc594e"
Dec 03 14:46:50 crc kubenswrapper[4751]: I1203 14:46:50.700493 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-wbwn4"]
Dec 03 14:46:50 crc kubenswrapper[4751]: I1203 14:46:50.720480 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-wbwn4"]
Dec 03 14:46:50 crc kubenswrapper[4751]: I1203 14:46:50.742395 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-l4676"]
Dec 03 14:46:50 crc kubenswrapper[4751]: I1203 14:46:50.754219 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-l4676"]
Dec 03 14:46:50 crc kubenswrapper[4751]: I1203 14:46:50.774382 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-5qnpr"]
Dec 03 14:46:50 crc kubenswrapper[4751]: I1203 14:46:50.783936 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-7kzxp"]
Dec 03 14:46:50 crc kubenswrapper[4751]: I1203 14:46:50.799387 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-5qnpr"]
Dec 03 14:46:50 crc kubenswrapper[4751]: I1203 14:46:50.809291 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-7kzxp"]
Dec 03 14:46:51 crc kubenswrapper[4751]: I1203 14:46:51.358358 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4784bf8d-4315-4097-b729-1f21940a17bc" path="/var/lib/kubelet/pods/4784bf8d-4315-4097-b729-1f21940a17bc/volumes"
Dec 03 14:46:51 crc kubenswrapper[4751]: I1203 14:46:51.368429 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b52c852-92ed-47f2-8e47-a9ac1e378698" path="/var/lib/kubelet/pods/6b52c852-92ed-47f2-8e47-a9ac1e378698/volumes"
Dec 03 14:46:51 crc kubenswrapper[4751]: I1203 14:46:51.368993 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbe09550-cc72-4fe9-af45-b39fbcac540d" path="/var/lib/kubelet/pods/cbe09550-cc72-4fe9-af45-b39fbcac540d/volumes"
Dec 03 14:46:51 crc kubenswrapper[4751]: I1203 14:46:51.369551 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de5612b5-9ee0-4da5-84b0-402ce8b1a163" path="/var/lib/kubelet/pods/de5612b5-9ee0-4da5-84b0-402ce8b1a163/volumes"
Dec 03 14:47:02 crc kubenswrapper[4751]: I1203 14:47:02.210656 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-75cdb5998d-hbntt"
Dec 03 14:47:09 crc kubenswrapper[4751]: I1203 14:47:09.035709 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-rjchq"]
Dec 03 14:47:09 crc kubenswrapper[4751]: I1203 14:47:09.058902 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-kg74g"]
Dec 03 14:47:09 crc kubenswrapper[4751]: I1203 14:47:09.076279 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-rjchq"]
Dec 03 14:47:09 crc kubenswrapper[4751]: I1203 14:47:09.083837 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-d629-account-create-update-jjx45"]
Dec 03 14:47:09 crc kubenswrapper[4751]: I1203 14:47:09.093628 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-kg74g"]
Dec 03 14:47:09 crc kubenswrapper[4751]: I1203 14:47:09.101317 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-d629-account-create-update-jjx45"]
Dec 03 14:47:09 crc kubenswrapper[4751]: I1203 14:47:09.339213 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14804484-7da3-4876-b9f5-a3f996fdca5c" path="/var/lib/kubelet/pods/14804484-7da3-4876-b9f5-a3f996fdca5c/volumes"
Dec 03 14:47:09 crc kubenswrapper[4751]: I1203 14:47:09.340613 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c1be9b7-88c6-4c30-8b86-6b3194a6a225" path="/var/lib/kubelet/pods/5c1be9b7-88c6-4c30-8b86-6b3194a6a225/volumes"
Dec 03 14:47:09 crc kubenswrapper[4751]: I1203 14:47:09.341582 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca" path="/var/lib/kubelet/pods/8c061b2d-f85e-46e5-b0c2-b5d900cfc6ca/volumes"
Dec 03 14:47:10 crc kubenswrapper[4751]: I1203 14:47:10.032670 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-b6f3-account-create-update-k2rkr"]
Dec 03 14:47:10 crc kubenswrapper[4751]: I1203 14:47:10.045922 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-wb89v"]
Dec 03 14:47:10 crc kubenswrapper[4751]: I1203 14:47:10.057140 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-wb89v"]
Dec 03 14:47:10 crc kubenswrapper[4751]: I1203 14:47:10.068177 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-1fbd-account-create-update-wgwqm"]
Dec 03 14:47:10 crc kubenswrapper[4751]: I1203 14:47:10.079922 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-b6f3-account-create-update-k2rkr"]
Dec 03 14:47:10 crc kubenswrapper[4751]: I1203 14:47:10.089997 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-1fbd-account-create-update-wgwqm"]
Dec 03 14:47:11 crc kubenswrapper[4751]: I1203 14:47:11.327191 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d206c76-a64f-42dc-8bc0-554d21e5ebd0" path="/var/lib/kubelet/pods/0d206c76-a64f-42dc-8bc0-554d21e5ebd0/volumes"
Dec 03 14:47:11 crc kubenswrapper[4751]: I1203 14:47:11.328127 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ac653c7-0be2-4943-ba54-d1e1fed5bfdc" path="/var/lib/kubelet/pods/8ac653c7-0be2-4943-ba54-d1e1fed5bfdc/volumes"
Dec 03 14:47:11 crc kubenswrapper[4751]: I1203 14:47:11.329133 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b90d17b-c6be-47a0-a72f-669ad0c75e18" path="/var/lib/kubelet/pods/8b90d17b-c6be-47a0-a72f-669ad0c75e18/volumes"
Dec 03 14:47:17 crc kubenswrapper[4751]: I1203 14:47:17.522406 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991" containerName="ceilometer-notification-agent" probeResult="failure" output=<
Dec 03 14:47:17 crc kubenswrapper[4751]: Unkown error: Expecting value: line 1 column 1 (char 0)
Dec 03 14:47:17 crc kubenswrapper[4751]: >
Dec 03 14:47:17 crc kubenswrapper[4751]: I1203 14:47:17.523157 4751 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ceilometer-0"
Dec 03 14:47:17 crc kubenswrapper[4751]: I1203 14:47:17.524055 4751 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="ceilometer-notification-agent" containerStatusID={"Type":"cri-o","ID":"513f69a8a3af975c03de90408fdf4a4a794083da84aca38a3196c9b31a655521"} pod="openstack/ceilometer-0" containerMessage="Container ceilometer-notification-agent failed liveness probe, will be restarted"
Dec 03 14:47:17 crc kubenswrapper[4751]: I1203 14:47:17.524107 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991" containerName="ceilometer-notification-agent" containerID="cri-o://513f69a8a3af975c03de90408fdf4a4a794083da84aca38a3196c9b31a655521" gracePeriod=30
Dec 03 14:47:23 crc kubenswrapper[4751]: I1203 14:47:23.578503 4751 generic.go:334] "Generic (PLEG): container finished" podID="ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991" containerID="513f69a8a3af975c03de90408fdf4a4a794083da84aca38a3196c9b31a655521" exitCode=0
Dec 03 14:47:23 crc kubenswrapper[4751]: I1203 14:47:23.578554 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991","Type":"ContainerDied","Data":"513f69a8a3af975c03de90408fdf4a4a794083da84aca38a3196c9b31a655521"}
Dec 03 14:47:24 crc kubenswrapper[4751]: I1203 14:47:24.606963 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991","Type":"ContainerStarted","Data":"9fb41d5fcd75547e3eaa1513479c148d04d17b3374eb6e960b0fdde48a08a7b3"}
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.114109 4751 scope.go:117] "RemoveContainer" containerID="943207f8bee76a544dfcefd1215a3a958b2fcf485c82c9fc006f4e2c19b307f3"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.145114 4751 scope.go:117] "RemoveContainer" containerID="e5f89e275c7378fd8bc629e5f986d230a3cb73a1eb06035c5a68369abfcfa9f0"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.214958 4751 scope.go:117] "RemoveContainer" containerID="e1746dd96f8c08d0da13592803ced15a8ff13a0de0f9381bab3eb3d05be54272"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.263629 4751 scope.go:117] "RemoveContainer" containerID="8a59515ad8337978d1a0f9f51673dd0b11ee05971d42cc6e483edb9ffd7b6709"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.293221 4751 scope.go:117] "RemoveContainer" containerID="a21e3be16085a36b9d730c18dbbb117c4a4510686de75a90f1c97f8617afea25"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.352812 4751 scope.go:117] "RemoveContainer" containerID="c200e104b10c75481a28bbf12e9c5a1364b8594e64931b14d113348be698099c"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.406295 4751 scope.go:117] "RemoveContainer" containerID="d51d21a867de6344470db63eb2c384e4e73323da2b6cdf183a8d6f807808ea80"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.428402 4751 scope.go:117] "RemoveContainer" containerID="c78f2a2a892dfa058d90620129ac2102adcab0af7f9a041586dd6dd61088d9cf"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.449879 4751 scope.go:117] "RemoveContainer" containerID="dda574fe47d3ecdf3017ff88709fdadda5d6e3c421d008a8019d5ea8feac7bb9"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.474549 4751 scope.go:117] "RemoveContainer" containerID="dcf778ddadc5f54d3034973dcacf17fd0a1ff3587c86cf599022d0cd3691425f"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.503463 4751 scope.go:117] "RemoveContainer" containerID="5438a700d64967e58768961ff7f29dad0258aa33d36bfba8d852666d1d3b1078"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.529932 4751 scope.go:117] "RemoveContainer" containerID="1c63976e3db62586993d795217316bd30f339b8441e6b64054a5db80e8519f5c"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.582862 4751 scope.go:117] "RemoveContainer" containerID="99ed7eb161b12788b367b86c391e8674f70f3493789083d8cd49b477a968c1cc"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.618918 4751 scope.go:117] "RemoveContainer" containerID="3e4ed144a1aee447944ebfd6b9128868ad08ae8df64f5ba6dcb3cc573652eeeb"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.655995 4751 scope.go:117] "RemoveContainer" containerID="11f039596c97f40659e89fd38dc13bbc8e5349c3783d348023ee6a1fe9a639df"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.707432 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l2d5b"]
Dec 03 14:47:27 crc kubenswrapper[4751]: E1203 14:47:27.708031 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4277c05-1792-41f8-af0f-3403799bb1e5" containerName="collect-profiles"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.708070 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4277c05-1792-41f8-af0f-3403799bb1e5" containerName="collect-profiles"
Dec 03 14:47:27 crc kubenswrapper[4751]: E1203 14:47:27.708098 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab718032-f3a1-4c0d-9b9f-d10858e77e05" containerName="extract-content"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.708109 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab718032-f3a1-4c0d-9b9f-d10858e77e05" containerName="extract-content"
Dec 03 14:47:27 crc kubenswrapper[4751]: E1203 14:47:27.708141 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08ff7918-db3f-4f2b-8186-8940f86ea3a1" containerName="extract-content"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.708150 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="08ff7918-db3f-4f2b-8186-8940f86ea3a1" containerName="extract-content"
Dec 03 14:47:27 crc kubenswrapper[4751]: E1203 14:47:27.708168 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab718032-f3a1-4c0d-9b9f-d10858e77e05" containerName="extract-utilities"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.708179 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab718032-f3a1-4c0d-9b9f-d10858e77e05" containerName="extract-utilities"
Dec 03 14:47:27 crc kubenswrapper[4751]: E1203 14:47:27.708192 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08ff7918-db3f-4f2b-8186-8940f86ea3a1" containerName="extract-utilities"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.708200 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="08ff7918-db3f-4f2b-8186-8940f86ea3a1" containerName="extract-utilities"
Dec 03 14:47:27 crc kubenswrapper[4751]: E1203 14:47:27.708234 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08ff7918-db3f-4f2b-8186-8940f86ea3a1" containerName="registry-server"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.708243 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="08ff7918-db3f-4f2b-8186-8940f86ea3a1" containerName="registry-server"
Dec 03 14:47:27 crc kubenswrapper[4751]: E1203 14:47:27.708260 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab718032-f3a1-4c0d-9b9f-d10858e77e05" containerName="registry-server"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.708270 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab718032-f3a1-4c0d-9b9f-d10858e77e05" containerName="registry-server"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.708608 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4277c05-1792-41f8-af0f-3403799bb1e5" containerName="collect-profiles"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.708630 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab718032-f3a1-4c0d-9b9f-d10858e77e05" containerName="registry-server"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.708647 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="08ff7918-db3f-4f2b-8186-8940f86ea3a1" containerName="registry-server"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.710729 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l2d5b"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.719910 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l2d5b"]
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.876351 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/536fad3b-a4e4-4a58-b9ae-f7937c2bcec7-catalog-content\") pod \"community-operators-l2d5b\" (UID: \"536fad3b-a4e4-4a58-b9ae-f7937c2bcec7\") " pod="openshift-marketplace/community-operators-l2d5b"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.876969 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/536fad3b-a4e4-4a58-b9ae-f7937c2bcec7-utilities\") pod \"community-operators-l2d5b\" (UID: \"536fad3b-a4e4-4a58-b9ae-f7937c2bcec7\") " pod="openshift-marketplace/community-operators-l2d5b"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.877040 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhs6b\" (UniqueName: \"kubernetes.io/projected/536fad3b-a4e4-4a58-b9ae-f7937c2bcec7-kube-api-access-mhs6b\") pod \"community-operators-l2d5b\" (UID: \"536fad3b-a4e4-4a58-b9ae-f7937c2bcec7\") " pod="openshift-marketplace/community-operators-l2d5b"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.978838 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/536fad3b-a4e4-4a58-b9ae-f7937c2bcec7-catalog-content\") pod \"community-operators-l2d5b\" (UID: \"536fad3b-a4e4-4a58-b9ae-f7937c2bcec7\") " pod="openshift-marketplace/community-operators-l2d5b"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.978976 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/536fad3b-a4e4-4a58-b9ae-f7937c2bcec7-utilities\") pod \"community-operators-l2d5b\" (UID: \"536fad3b-a4e4-4a58-b9ae-f7937c2bcec7\") " pod="openshift-marketplace/community-operators-l2d5b"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.979013 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhs6b\" (UniqueName: \"kubernetes.io/projected/536fad3b-a4e4-4a58-b9ae-f7937c2bcec7-kube-api-access-mhs6b\") pod \"community-operators-l2d5b\" (UID: \"536fad3b-a4e4-4a58-b9ae-f7937c2bcec7\") " pod="openshift-marketplace/community-operators-l2d5b"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.979435 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/536fad3b-a4e4-4a58-b9ae-f7937c2bcec7-catalog-content\") pod \"community-operators-l2d5b\" (UID: \"536fad3b-a4e4-4a58-b9ae-f7937c2bcec7\") " pod="openshift-marketplace/community-operators-l2d5b"
Dec 03 14:47:27 crc kubenswrapper[4751]: I1203 14:47:27.979811 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/536fad3b-a4e4-4a58-b9ae-f7937c2bcec7-utilities\") pod \"community-operators-l2d5b\" (UID: \"536fad3b-a4e4-4a58-b9ae-f7937c2bcec7\") " pod="openshift-marketplace/community-operators-l2d5b"
Dec 03 14:47:28 crc kubenswrapper[4751]: I1203 14:47:28.010853 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhs6b\" (UniqueName: \"kubernetes.io/projected/536fad3b-a4e4-4a58-b9ae-f7937c2bcec7-kube-api-access-mhs6b\") pod \"community-operators-l2d5b\" (UID: \"536fad3b-a4e4-4a58-b9ae-f7937c2bcec7\") " pod="openshift-marketplace/community-operators-l2d5b"
Dec 03 14:47:28 crc kubenswrapper[4751]: I1203 14:47:28.032973 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l2d5b"
Dec 03 14:47:28 crc kubenswrapper[4751]: I1203 14:47:28.714085 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l2d5b"]
Dec 03 14:47:28 crc kubenswrapper[4751]: W1203 14:47:28.715375 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod536fad3b_a4e4_4a58_b9ae_f7937c2bcec7.slice/crio-8eb4429380d08d38d6e0510e7ef07ebff1dd22bb5a8130d4dd078d649ad4afed WatchSource:0}: Error finding container 8eb4429380d08d38d6e0510e7ef07ebff1dd22bb5a8130d4dd078d649ad4afed: Status 404 returned error can't find the container with id 8eb4429380d08d38d6e0510e7ef07ebff1dd22bb5a8130d4dd078d649ad4afed
Dec 03 14:47:29 crc kubenswrapper[4751]: I1203 14:47:29.678955 4751 generic.go:334] "Generic (PLEG): container finished" podID="536fad3b-a4e4-4a58-b9ae-f7937c2bcec7" containerID="1ae8ca355d6e05e547023358b2b9698211700d0122f3decfe5ae6f6cbcd6aa08" exitCode=0
Dec 03 14:47:29 crc kubenswrapper[4751]: I1203 14:47:29.679046 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2d5b" event={"ID":"536fad3b-a4e4-4a58-b9ae-f7937c2bcec7","Type":"ContainerDied","Data":"1ae8ca355d6e05e547023358b2b9698211700d0122f3decfe5ae6f6cbcd6aa08"}
Dec 03 14:47:29 crc kubenswrapper[4751]: I1203 14:47:29.679277 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2d5b" event={"ID":"536fad3b-a4e4-4a58-b9ae-f7937c2bcec7","Type":"ContainerStarted","Data":"8eb4429380d08d38d6e0510e7ef07ebff1dd22bb5a8130d4dd078d649ad4afed"}
Dec 03 14:47:30 crc kubenswrapper[4751]: I1203 14:47:30.690073 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2d5b" event={"ID":"536fad3b-a4e4-4a58-b9ae-f7937c2bcec7","Type":"ContainerStarted","Data":"4f14cae0b05b924c4cbfbd568b104b6602b1f19cc3f1f17cc2450aa0b3420fab"}
Dec 03 14:47:31 crc kubenswrapper[4751]: I1203 14:47:31.706729 4751 generic.go:334] "Generic (PLEG): container finished" podID="536fad3b-a4e4-4a58-b9ae-f7937c2bcec7" containerID="4f14cae0b05b924c4cbfbd568b104b6602b1f19cc3f1f17cc2450aa0b3420fab" exitCode=0
Dec 03 14:47:31 crc kubenswrapper[4751]: I1203 14:47:31.706867 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2d5b" event={"ID":"536fad3b-a4e4-4a58-b9ae-f7937c2bcec7","Type":"ContainerDied","Data":"4f14cae0b05b924c4cbfbd568b104b6602b1f19cc3f1f17cc2450aa0b3420fab"}
Dec 03 14:47:33 crc kubenswrapper[4751]: I1203 14:47:33.742891 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2d5b" event={"ID":"536fad3b-a4e4-4a58-b9ae-f7937c2bcec7","Type":"ContainerStarted","Data":"eb368313a2c70e6c9c3484f1fedd2b194cbac165bbeb832337a864262519be47"}
Dec 03 14:47:33 crc kubenswrapper[4751]: I1203 14:47:33.775067 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l2d5b" podStartSLOduration=3.553489919
podStartE2EDuration="6.775041809s" podCreationTimestamp="2025-12-03 14:47:27 +0000 UTC" firstStartedPulling="2025-12-03 14:47:29.680924237 +0000 UTC m=+2056.669279454" lastFinishedPulling="2025-12-03 14:47:32.902476127 +0000 UTC m=+2059.890831344" observedRunningTime="2025-12-03 14:47:33.764595733 +0000 UTC m=+2060.752950970" watchObservedRunningTime="2025-12-03 14:47:33.775041809 +0000 UTC m=+2060.763397026" Dec 03 14:47:38 crc kubenswrapper[4751]: I1203 14:47:38.033999 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l2d5b" Dec 03 14:47:38 crc kubenswrapper[4751]: I1203 14:47:38.035665 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l2d5b" Dec 03 14:47:38 crc kubenswrapper[4751]: I1203 14:47:38.086625 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l2d5b" Dec 03 14:47:38 crc kubenswrapper[4751]: I1203 14:47:38.831901 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l2d5b" Dec 03 14:47:38 crc kubenswrapper[4751]: I1203 14:47:38.877803 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l2d5b"] Dec 03 14:47:40 crc kubenswrapper[4751]: I1203 14:47:40.802961 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l2d5b" podUID="536fad3b-a4e4-4a58-b9ae-f7937c2bcec7" containerName="registry-server" containerID="cri-o://eb368313a2c70e6c9c3484f1fedd2b194cbac165bbeb832337a864262519be47" gracePeriod=2 Dec 03 14:47:41 crc kubenswrapper[4751]: I1203 14:47:41.333497 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l2d5b" Dec 03 14:47:41 crc kubenswrapper[4751]: I1203 14:47:41.400625 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/536fad3b-a4e4-4a58-b9ae-f7937c2bcec7-catalog-content\") pod \"536fad3b-a4e4-4a58-b9ae-f7937c2bcec7\" (UID: \"536fad3b-a4e4-4a58-b9ae-f7937c2bcec7\") " Dec 03 14:47:41 crc kubenswrapper[4751]: I1203 14:47:41.401000 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhs6b\" (UniqueName: \"kubernetes.io/projected/536fad3b-a4e4-4a58-b9ae-f7937c2bcec7-kube-api-access-mhs6b\") pod \"536fad3b-a4e4-4a58-b9ae-f7937c2bcec7\" (UID: \"536fad3b-a4e4-4a58-b9ae-f7937c2bcec7\") " Dec 03 14:47:41 crc kubenswrapper[4751]: I1203 14:47:41.401032 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/536fad3b-a4e4-4a58-b9ae-f7937c2bcec7-utilities\") pod \"536fad3b-a4e4-4a58-b9ae-f7937c2bcec7\" (UID: \"536fad3b-a4e4-4a58-b9ae-f7937c2bcec7\") " Dec 03 14:47:41 crc kubenswrapper[4751]: I1203 14:47:41.402058 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/536fad3b-a4e4-4a58-b9ae-f7937c2bcec7-utilities" (OuterVolumeSpecName: "utilities") pod "536fad3b-a4e4-4a58-b9ae-f7937c2bcec7" (UID: "536fad3b-a4e4-4a58-b9ae-f7937c2bcec7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:47:41 crc kubenswrapper[4751]: I1203 14:47:41.407583 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/536fad3b-a4e4-4a58-b9ae-f7937c2bcec7-kube-api-access-mhs6b" (OuterVolumeSpecName: "kube-api-access-mhs6b") pod "536fad3b-a4e4-4a58-b9ae-f7937c2bcec7" (UID: "536fad3b-a4e4-4a58-b9ae-f7937c2bcec7"). InnerVolumeSpecName "kube-api-access-mhs6b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:47:41 crc kubenswrapper[4751]: I1203 14:47:41.454294 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/536fad3b-a4e4-4a58-b9ae-f7937c2bcec7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "536fad3b-a4e4-4a58-b9ae-f7937c2bcec7" (UID: "536fad3b-a4e4-4a58-b9ae-f7937c2bcec7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:47:41 crc kubenswrapper[4751]: I1203 14:47:41.503451 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/536fad3b-a4e4-4a58-b9ae-f7937c2bcec7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:47:41 crc kubenswrapper[4751]: I1203 14:47:41.503484 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhs6b\" (UniqueName: \"kubernetes.io/projected/536fad3b-a4e4-4a58-b9ae-f7937c2bcec7-kube-api-access-mhs6b\") on node \"crc\" DevicePath \"\"" Dec 03 14:47:41 crc kubenswrapper[4751]: I1203 14:47:41.503497 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/536fad3b-a4e4-4a58-b9ae-f7937c2bcec7-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:47:41 crc kubenswrapper[4751]: I1203 14:47:41.819464 4751 generic.go:334] "Generic (PLEG): container finished" podID="536fad3b-a4e4-4a58-b9ae-f7937c2bcec7" containerID="eb368313a2c70e6c9c3484f1fedd2b194cbac165bbeb832337a864262519be47" exitCode=0 Dec 03 14:47:41 crc kubenswrapper[4751]: I1203 14:47:41.819505 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2d5b" event={"ID":"536fad3b-a4e4-4a58-b9ae-f7937c2bcec7","Type":"ContainerDied","Data":"eb368313a2c70e6c9c3484f1fedd2b194cbac165bbeb832337a864262519be47"} Dec 03 14:47:41 crc kubenswrapper[4751]: I1203 14:47:41.819532 4751 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-l2d5b" event={"ID":"536fad3b-a4e4-4a58-b9ae-f7937c2bcec7","Type":"ContainerDied","Data":"8eb4429380d08d38d6e0510e7ef07ebff1dd22bb5a8130d4dd078d649ad4afed"} Dec 03 14:47:41 crc kubenswrapper[4751]: I1203 14:47:41.819551 4751 scope.go:117] "RemoveContainer" containerID="eb368313a2c70e6c9c3484f1fedd2b194cbac165bbeb832337a864262519be47" Dec 03 14:47:41 crc kubenswrapper[4751]: I1203 14:47:41.819560 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l2d5b" Dec 03 14:47:41 crc kubenswrapper[4751]: I1203 14:47:41.872653 4751 scope.go:117] "RemoveContainer" containerID="4f14cae0b05b924c4cbfbd568b104b6602b1f19cc3f1f17cc2450aa0b3420fab" Dec 03 14:47:41 crc kubenswrapper[4751]: I1203 14:47:41.892135 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l2d5b"] Dec 03 14:47:41 crc kubenswrapper[4751]: I1203 14:47:41.908468 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l2d5b"] Dec 03 14:47:41 crc kubenswrapper[4751]: I1203 14:47:41.911587 4751 scope.go:117] "RemoveContainer" containerID="1ae8ca355d6e05e547023358b2b9698211700d0122f3decfe5ae6f6cbcd6aa08" Dec 03 14:47:41 crc kubenswrapper[4751]: I1203 14:47:41.954546 4751 scope.go:117] "RemoveContainer" containerID="eb368313a2c70e6c9c3484f1fedd2b194cbac165bbeb832337a864262519be47" Dec 03 14:47:41 crc kubenswrapper[4751]: E1203 14:47:41.955011 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb368313a2c70e6c9c3484f1fedd2b194cbac165bbeb832337a864262519be47\": container with ID starting with eb368313a2c70e6c9c3484f1fedd2b194cbac165bbeb832337a864262519be47 not found: ID does not exist" containerID="eb368313a2c70e6c9c3484f1fedd2b194cbac165bbeb832337a864262519be47" Dec 03 14:47:41 crc kubenswrapper[4751]: I1203 
14:47:41.955055 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb368313a2c70e6c9c3484f1fedd2b194cbac165bbeb832337a864262519be47"} err="failed to get container status \"eb368313a2c70e6c9c3484f1fedd2b194cbac165bbeb832337a864262519be47\": rpc error: code = NotFound desc = could not find container \"eb368313a2c70e6c9c3484f1fedd2b194cbac165bbeb832337a864262519be47\": container with ID starting with eb368313a2c70e6c9c3484f1fedd2b194cbac165bbeb832337a864262519be47 not found: ID does not exist" Dec 03 14:47:41 crc kubenswrapper[4751]: I1203 14:47:41.955093 4751 scope.go:117] "RemoveContainer" containerID="4f14cae0b05b924c4cbfbd568b104b6602b1f19cc3f1f17cc2450aa0b3420fab" Dec 03 14:47:41 crc kubenswrapper[4751]: E1203 14:47:41.955562 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f14cae0b05b924c4cbfbd568b104b6602b1f19cc3f1f17cc2450aa0b3420fab\": container with ID starting with 4f14cae0b05b924c4cbfbd568b104b6602b1f19cc3f1f17cc2450aa0b3420fab not found: ID does not exist" containerID="4f14cae0b05b924c4cbfbd568b104b6602b1f19cc3f1f17cc2450aa0b3420fab" Dec 03 14:47:41 crc kubenswrapper[4751]: I1203 14:47:41.955606 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f14cae0b05b924c4cbfbd568b104b6602b1f19cc3f1f17cc2450aa0b3420fab"} err="failed to get container status \"4f14cae0b05b924c4cbfbd568b104b6602b1f19cc3f1f17cc2450aa0b3420fab\": rpc error: code = NotFound desc = could not find container \"4f14cae0b05b924c4cbfbd568b104b6602b1f19cc3f1f17cc2450aa0b3420fab\": container with ID starting with 4f14cae0b05b924c4cbfbd568b104b6602b1f19cc3f1f17cc2450aa0b3420fab not found: ID does not exist" Dec 03 14:47:41 crc kubenswrapper[4751]: I1203 14:47:41.955634 4751 scope.go:117] "RemoveContainer" containerID="1ae8ca355d6e05e547023358b2b9698211700d0122f3decfe5ae6f6cbcd6aa08" Dec 03 14:47:41 crc 
kubenswrapper[4751]: E1203 14:47:41.955918 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ae8ca355d6e05e547023358b2b9698211700d0122f3decfe5ae6f6cbcd6aa08\": container with ID starting with 1ae8ca355d6e05e547023358b2b9698211700d0122f3decfe5ae6f6cbcd6aa08 not found: ID does not exist" containerID="1ae8ca355d6e05e547023358b2b9698211700d0122f3decfe5ae6f6cbcd6aa08" Dec 03 14:47:41 crc kubenswrapper[4751]: I1203 14:47:41.955943 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ae8ca355d6e05e547023358b2b9698211700d0122f3decfe5ae6f6cbcd6aa08"} err="failed to get container status \"1ae8ca355d6e05e547023358b2b9698211700d0122f3decfe5ae6f6cbcd6aa08\": rpc error: code = NotFound desc = could not find container \"1ae8ca355d6e05e547023358b2b9698211700d0122f3decfe5ae6f6cbcd6aa08\": container with ID starting with 1ae8ca355d6e05e547023358b2b9698211700d0122f3decfe5ae6f6cbcd6aa08 not found: ID does not exist" Dec 03 14:47:43 crc kubenswrapper[4751]: I1203 14:47:43.331114 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="536fad3b-a4e4-4a58-b9ae-f7937c2bcec7" path="/var/lib/kubelet/pods/536fad3b-a4e4-4a58-b9ae-f7937c2bcec7/volumes" Dec 03 14:47:50 crc kubenswrapper[4751]: I1203 14:47:50.039176 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-h7bpp"] Dec 03 14:47:50 crc kubenswrapper[4751]: I1203 14:47:50.047142 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-h7bpp"] Dec 03 14:47:51 crc kubenswrapper[4751]: I1203 14:47:51.324962 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7df882c6-828a-4aac-8aa9-811102008952" path="/var/lib/kubelet/pods/7df882c6-828a-4aac-8aa9-811102008952/volumes" Dec 03 14:47:53 crc kubenswrapper[4751]: I1203 14:47:53.942711 4751 generic.go:334] "Generic (PLEG): container finished" 
podID="4f6373bc-f6a3-478f-92f5-8e311a5fd86c" containerID="6d9abfc81623affaee19bb54ee6754c41a9cd329a64c93f8af8240ecf97ced25" exitCode=0 Dec 03 14:47:53 crc kubenswrapper[4751]: I1203 14:47:53.942883 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rt2j2" event={"ID":"4f6373bc-f6a3-478f-92f5-8e311a5fd86c","Type":"ContainerDied","Data":"6d9abfc81623affaee19bb54ee6754c41a9cd329a64c93f8af8240ecf97ced25"} Dec 03 14:47:55 crc kubenswrapper[4751]: I1203 14:47:55.405340 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rt2j2" Dec 03 14:47:55 crc kubenswrapper[4751]: I1203 14:47:55.514539 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f6373bc-f6a3-478f-92f5-8e311a5fd86c-ssh-key\") pod \"4f6373bc-f6a3-478f-92f5-8e311a5fd86c\" (UID: \"4f6373bc-f6a3-478f-92f5-8e311a5fd86c\") " Dec 03 14:47:55 crc kubenswrapper[4751]: I1203 14:47:55.514823 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f6373bc-f6a3-478f-92f5-8e311a5fd86c-inventory\") pod \"4f6373bc-f6a3-478f-92f5-8e311a5fd86c\" (UID: \"4f6373bc-f6a3-478f-92f5-8e311a5fd86c\") " Dec 03 14:47:55 crc kubenswrapper[4751]: I1203 14:47:55.514930 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qq4s\" (UniqueName: \"kubernetes.io/projected/4f6373bc-f6a3-478f-92f5-8e311a5fd86c-kube-api-access-7qq4s\") pod \"4f6373bc-f6a3-478f-92f5-8e311a5fd86c\" (UID: \"4f6373bc-f6a3-478f-92f5-8e311a5fd86c\") " Dec 03 14:47:55 crc kubenswrapper[4751]: I1203 14:47:55.524186 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f6373bc-f6a3-478f-92f5-8e311a5fd86c-kube-api-access-7qq4s" (OuterVolumeSpecName: 
"kube-api-access-7qq4s") pod "4f6373bc-f6a3-478f-92f5-8e311a5fd86c" (UID: "4f6373bc-f6a3-478f-92f5-8e311a5fd86c"). InnerVolumeSpecName "kube-api-access-7qq4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:47:55 crc kubenswrapper[4751]: I1203 14:47:55.550539 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f6373bc-f6a3-478f-92f5-8e311a5fd86c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4f6373bc-f6a3-478f-92f5-8e311a5fd86c" (UID: "4f6373bc-f6a3-478f-92f5-8e311a5fd86c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:47:55 crc kubenswrapper[4751]: I1203 14:47:55.559555 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f6373bc-f6a3-478f-92f5-8e311a5fd86c-inventory" (OuterVolumeSpecName: "inventory") pod "4f6373bc-f6a3-478f-92f5-8e311a5fd86c" (UID: "4f6373bc-f6a3-478f-92f5-8e311a5fd86c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:47:55 crc kubenswrapper[4751]: I1203 14:47:55.617721 4751 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f6373bc-f6a3-478f-92f5-8e311a5fd86c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 14:47:55 crc kubenswrapper[4751]: I1203 14:47:55.617783 4751 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f6373bc-f6a3-478f-92f5-8e311a5fd86c-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 14:47:55 crc kubenswrapper[4751]: I1203 14:47:55.617800 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qq4s\" (UniqueName: \"kubernetes.io/projected/4f6373bc-f6a3-478f-92f5-8e311a5fd86c-kube-api-access-7qq4s\") on node \"crc\" DevicePath \"\"" Dec 03 14:47:55 crc kubenswrapper[4751]: I1203 14:47:55.969369 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rt2j2" Dec 03 14:47:55 crc kubenswrapper[4751]: I1203 14:47:55.969358 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rt2j2" event={"ID":"4f6373bc-f6a3-478f-92f5-8e311a5fd86c","Type":"ContainerDied","Data":"5600fded04271968ecff2596eb78ad0dc779a5aa832cd70e04c310117fa9386c"} Dec 03 14:47:55 crc kubenswrapper[4751]: I1203 14:47:55.969527 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5600fded04271968ecff2596eb78ad0dc779a5aa832cd70e04c310117fa9386c" Dec 03 14:47:56 crc kubenswrapper[4751]: I1203 14:47:56.048435 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6wwg6"] Dec 03 14:47:56 crc kubenswrapper[4751]: E1203 14:47:56.048829 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="536fad3b-a4e4-4a58-b9ae-f7937c2bcec7" containerName="extract-utilities" Dec 03 14:47:56 crc kubenswrapper[4751]: I1203 14:47:56.048847 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="536fad3b-a4e4-4a58-b9ae-f7937c2bcec7" containerName="extract-utilities" Dec 03 14:47:56 crc kubenswrapper[4751]: E1203 14:47:56.048884 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="536fad3b-a4e4-4a58-b9ae-f7937c2bcec7" containerName="extract-content" Dec 03 14:47:56 crc kubenswrapper[4751]: I1203 14:47:56.048893 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="536fad3b-a4e4-4a58-b9ae-f7937c2bcec7" containerName="extract-content" Dec 03 14:47:56 crc kubenswrapper[4751]: E1203 14:47:56.048902 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="536fad3b-a4e4-4a58-b9ae-f7937c2bcec7" containerName="registry-server" Dec 03 14:47:56 crc kubenswrapper[4751]: I1203 14:47:56.048909 4751 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="536fad3b-a4e4-4a58-b9ae-f7937c2bcec7" containerName="registry-server" Dec 03 14:47:56 crc kubenswrapper[4751]: E1203 14:47:56.048918 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f6373bc-f6a3-478f-92f5-8e311a5fd86c" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 03 14:47:56 crc kubenswrapper[4751]: I1203 14:47:56.048925 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f6373bc-f6a3-478f-92f5-8e311a5fd86c" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 03 14:47:56 crc kubenswrapper[4751]: I1203 14:47:56.049101 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f6373bc-f6a3-478f-92f5-8e311a5fd86c" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 03 14:47:56 crc kubenswrapper[4751]: I1203 14:47:56.049126 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="536fad3b-a4e4-4a58-b9ae-f7937c2bcec7" containerName="registry-server" Dec 03 14:47:56 crc kubenswrapper[4751]: I1203 14:47:56.049854 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6wwg6" Dec 03 14:47:56 crc kubenswrapper[4751]: I1203 14:47:56.051706 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 14:47:56 crc kubenswrapper[4751]: I1203 14:47:56.052202 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 14:47:56 crc kubenswrapper[4751]: I1203 14:47:56.052615 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 14:47:56 crc kubenswrapper[4751]: I1203 14:47:56.058881 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xf9cf" Dec 03 14:47:56 crc kubenswrapper[4751]: I1203 14:47:56.064351 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6wwg6"] Dec 03 14:47:56 crc kubenswrapper[4751]: I1203 14:47:56.128257 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5de9d69-621e-4336-bd1d-e29c27d29430-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6wwg6\" (UID: \"d5de9d69-621e-4336-bd1d-e29c27d29430\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6wwg6" Dec 03 14:47:56 crc kubenswrapper[4751]: I1203 14:47:56.128570 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5de9d69-621e-4336-bd1d-e29c27d29430-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6wwg6\" (UID: \"d5de9d69-621e-4336-bd1d-e29c27d29430\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6wwg6" Dec 03 14:47:56 crc kubenswrapper[4751]: I1203 14:47:56.128736 4751 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmd79\" (UniqueName: \"kubernetes.io/projected/d5de9d69-621e-4336-bd1d-e29c27d29430-kube-api-access-hmd79\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6wwg6\" (UID: \"d5de9d69-621e-4336-bd1d-e29c27d29430\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6wwg6" Dec 03 14:47:56 crc kubenswrapper[4751]: I1203 14:47:56.230843 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5de9d69-621e-4336-bd1d-e29c27d29430-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6wwg6\" (UID: \"d5de9d69-621e-4336-bd1d-e29c27d29430\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6wwg6" Dec 03 14:47:56 crc kubenswrapper[4751]: I1203 14:47:56.230972 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5de9d69-621e-4336-bd1d-e29c27d29430-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6wwg6\" (UID: \"d5de9d69-621e-4336-bd1d-e29c27d29430\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6wwg6" Dec 03 14:47:56 crc kubenswrapper[4751]: I1203 14:47:56.231086 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmd79\" (UniqueName: \"kubernetes.io/projected/d5de9d69-621e-4336-bd1d-e29c27d29430-kube-api-access-hmd79\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6wwg6\" (UID: \"d5de9d69-621e-4336-bd1d-e29c27d29430\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6wwg6" Dec 03 14:47:56 crc kubenswrapper[4751]: I1203 14:47:56.235538 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5de9d69-621e-4336-bd1d-e29c27d29430-inventory\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-6wwg6\" (UID: \"d5de9d69-621e-4336-bd1d-e29c27d29430\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6wwg6" Dec 03 14:47:56 crc kubenswrapper[4751]: I1203 14:47:56.246030 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5de9d69-621e-4336-bd1d-e29c27d29430-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6wwg6\" (UID: \"d5de9d69-621e-4336-bd1d-e29c27d29430\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6wwg6" Dec 03 14:47:56 crc kubenswrapper[4751]: I1203 14:47:56.246712 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmd79\" (UniqueName: \"kubernetes.io/projected/d5de9d69-621e-4336-bd1d-e29c27d29430-kube-api-access-hmd79\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6wwg6\" (UID: \"d5de9d69-621e-4336-bd1d-e29c27d29430\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6wwg6" Dec 03 14:47:56 crc kubenswrapper[4751]: I1203 14:47:56.376034 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6wwg6" Dec 03 14:47:56 crc kubenswrapper[4751]: I1203 14:47:56.921177 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6wwg6"] Dec 03 14:47:56 crc kubenswrapper[4751]: I1203 14:47:56.979514 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6wwg6" event={"ID":"d5de9d69-621e-4336-bd1d-e29c27d29430","Type":"ContainerStarted","Data":"e7a6eca9989de68303770cc1ca45e8a4a7248bc43019828a4031708b9c02ccf9"} Dec 03 14:47:57 crc kubenswrapper[4751]: I1203 14:47:57.990871 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6wwg6" event={"ID":"d5de9d69-621e-4336-bd1d-e29c27d29430","Type":"ContainerStarted","Data":"95a3217db3ea14055315fa21dd32b4f0ef2316b1ba5cbdbdead1e820583d6f71"} Dec 03 14:47:58 crc kubenswrapper[4751]: I1203 14:47:58.032302 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6wwg6" podStartSLOduration=1.625341769 podStartE2EDuration="2.032273397s" podCreationTimestamp="2025-12-03 14:47:56 +0000 UTC" firstStartedPulling="2025-12-03 14:47:56.931355538 +0000 UTC m=+2083.919710755" lastFinishedPulling="2025-12-03 14:47:57.338287156 +0000 UTC m=+2084.326642383" observedRunningTime="2025-12-03 14:47:58.008903857 +0000 UTC m=+2084.997259094" watchObservedRunningTime="2025-12-03 14:47:58.032273397 +0000 UTC m=+2085.020628614" Dec 03 14:48:15 crc kubenswrapper[4751]: I1203 14:48:15.032792 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-ccbpw"] Dec 03 14:48:15 crc kubenswrapper[4751]: I1203 14:48:15.046518 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-ccbpw"] Dec 03 14:48:15 crc 
kubenswrapper[4751]: I1203 14:48:15.326283 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57bcc78c-1540-47ee-82f6-664aff4f6216" path="/var/lib/kubelet/pods/57bcc78c-1540-47ee-82f6-664aff4f6216/volumes" Dec 03 14:48:16 crc kubenswrapper[4751]: I1203 14:48:16.061233 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lhljr"] Dec 03 14:48:16 crc kubenswrapper[4751]: I1203 14:48:16.077895 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lhljr"] Dec 03 14:48:17 crc kubenswrapper[4751]: I1203 14:48:17.328463 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="528661c5-7b80-48d3-b8fd-7d20c23932f7" path="/var/lib/kubelet/pods/528661c5-7b80-48d3-b8fd-7d20c23932f7/volumes" Dec 03 14:48:28 crc kubenswrapper[4751]: I1203 14:48:28.064223 4751 scope.go:117] "RemoveContainer" containerID="22704b894a6936c8545cf27f710a115b5cf9c5d5092f3977d94ab8a65ac2ab36" Dec 03 14:48:28 crc kubenswrapper[4751]: I1203 14:48:28.112710 4751 scope.go:117] "RemoveContainer" containerID="74931be11a9b42eb7d9b58dba9ad8007cc1df42f646f017355650aac7bedb0ba" Dec 03 14:48:28 crc kubenswrapper[4751]: I1203 14:48:28.157254 4751 scope.go:117] "RemoveContainer" containerID="1f0e9aff22382cad550073805ca2c207dcafb0ce675cc850c7b00be3dd9b6a89" Dec 03 14:49:00 crc kubenswrapper[4751]: I1203 14:49:00.053952 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-fpjws"] Dec 03 14:49:00 crc kubenswrapper[4751]: I1203 14:49:00.064352 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-fpjws"] Dec 03 14:49:01 crc kubenswrapper[4751]: I1203 14:49:01.324439 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12c1f3f2-32ce-4652-8962-99c0d111d953" path="/var/lib/kubelet/pods/12c1f3f2-32ce-4652-8962-99c0d111d953/volumes" Dec 03 14:49:05 crc kubenswrapper[4751]: I1203 14:49:05.820203 
4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:49:05 crc kubenswrapper[4751]: I1203 14:49:05.820725 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:49:08 crc kubenswrapper[4751]: I1203 14:49:08.860016 4751 generic.go:334] "Generic (PLEG): container finished" podID="d5de9d69-621e-4336-bd1d-e29c27d29430" containerID="95a3217db3ea14055315fa21dd32b4f0ef2316b1ba5cbdbdead1e820583d6f71" exitCode=0 Dec 03 14:49:08 crc kubenswrapper[4751]: I1203 14:49:08.860148 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6wwg6" event={"ID":"d5de9d69-621e-4336-bd1d-e29c27d29430","Type":"ContainerDied","Data":"95a3217db3ea14055315fa21dd32b4f0ef2316b1ba5cbdbdead1e820583d6f71"} Dec 03 14:49:09 crc kubenswrapper[4751]: I1203 14:49:09.163946 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-25qrp"] Dec 03 14:49:09 crc kubenswrapper[4751]: I1203 14:49:09.166360 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-25qrp" Dec 03 14:49:09 crc kubenswrapper[4751]: I1203 14:49:09.189627 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-25qrp"] Dec 03 14:49:09 crc kubenswrapper[4751]: I1203 14:49:09.228424 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50d35e6b-6a28-4822-857e-2f8c1ac2ffd8-utilities\") pod \"redhat-operators-25qrp\" (UID: \"50d35e6b-6a28-4822-857e-2f8c1ac2ffd8\") " pod="openshift-marketplace/redhat-operators-25qrp" Dec 03 14:49:09 crc kubenswrapper[4751]: I1203 14:49:09.228607 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50d35e6b-6a28-4822-857e-2f8c1ac2ffd8-catalog-content\") pod \"redhat-operators-25qrp\" (UID: \"50d35e6b-6a28-4822-857e-2f8c1ac2ffd8\") " pod="openshift-marketplace/redhat-operators-25qrp" Dec 03 14:49:09 crc kubenswrapper[4751]: I1203 14:49:09.228657 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggbqn\" (UniqueName: \"kubernetes.io/projected/50d35e6b-6a28-4822-857e-2f8c1ac2ffd8-kube-api-access-ggbqn\") pod \"redhat-operators-25qrp\" (UID: \"50d35e6b-6a28-4822-857e-2f8c1ac2ffd8\") " pod="openshift-marketplace/redhat-operators-25qrp" Dec 03 14:49:09 crc kubenswrapper[4751]: I1203 14:49:09.340125 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50d35e6b-6a28-4822-857e-2f8c1ac2ffd8-utilities\") pod \"redhat-operators-25qrp\" (UID: \"50d35e6b-6a28-4822-857e-2f8c1ac2ffd8\") " pod="openshift-marketplace/redhat-operators-25qrp" Dec 03 14:49:09 crc kubenswrapper[4751]: I1203 14:49:09.340467 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50d35e6b-6a28-4822-857e-2f8c1ac2ffd8-catalog-content\") pod \"redhat-operators-25qrp\" (UID: \"50d35e6b-6a28-4822-857e-2f8c1ac2ffd8\") " pod="openshift-marketplace/redhat-operators-25qrp" Dec 03 14:49:09 crc kubenswrapper[4751]: I1203 14:49:09.340544 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggbqn\" (UniqueName: \"kubernetes.io/projected/50d35e6b-6a28-4822-857e-2f8c1ac2ffd8-kube-api-access-ggbqn\") pod \"redhat-operators-25qrp\" (UID: \"50d35e6b-6a28-4822-857e-2f8c1ac2ffd8\") " pod="openshift-marketplace/redhat-operators-25qrp" Dec 03 14:49:09 crc kubenswrapper[4751]: I1203 14:49:09.341539 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50d35e6b-6a28-4822-857e-2f8c1ac2ffd8-utilities\") pod \"redhat-operators-25qrp\" (UID: \"50d35e6b-6a28-4822-857e-2f8c1ac2ffd8\") " pod="openshift-marketplace/redhat-operators-25qrp" Dec 03 14:49:09 crc kubenswrapper[4751]: I1203 14:49:09.341861 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50d35e6b-6a28-4822-857e-2f8c1ac2ffd8-catalog-content\") pod \"redhat-operators-25qrp\" (UID: \"50d35e6b-6a28-4822-857e-2f8c1ac2ffd8\") " pod="openshift-marketplace/redhat-operators-25qrp" Dec 03 14:49:09 crc kubenswrapper[4751]: I1203 14:49:09.381036 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggbqn\" (UniqueName: \"kubernetes.io/projected/50d35e6b-6a28-4822-857e-2f8c1ac2ffd8-kube-api-access-ggbqn\") pod \"redhat-operators-25qrp\" (UID: \"50d35e6b-6a28-4822-857e-2f8c1ac2ffd8\") " pod="openshift-marketplace/redhat-operators-25qrp" Dec 03 14:49:09 crc kubenswrapper[4751]: I1203 14:49:09.488399 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-25qrp" Dec 03 14:49:09 crc kubenswrapper[4751]: I1203 14:49:09.965291 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-25qrp"] Dec 03 14:49:10 crc kubenswrapper[4751]: I1203 14:49:10.513763 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6wwg6" Dec 03 14:49:10 crc kubenswrapper[4751]: I1203 14:49:10.670031 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5de9d69-621e-4336-bd1d-e29c27d29430-ssh-key\") pod \"d5de9d69-621e-4336-bd1d-e29c27d29430\" (UID: \"d5de9d69-621e-4336-bd1d-e29c27d29430\") " Dec 03 14:49:10 crc kubenswrapper[4751]: I1203 14:49:10.670887 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5de9d69-621e-4336-bd1d-e29c27d29430-inventory\") pod \"d5de9d69-621e-4336-bd1d-e29c27d29430\" (UID: \"d5de9d69-621e-4336-bd1d-e29c27d29430\") " Dec 03 14:49:10 crc kubenswrapper[4751]: I1203 14:49:10.671033 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmd79\" (UniqueName: \"kubernetes.io/projected/d5de9d69-621e-4336-bd1d-e29c27d29430-kube-api-access-hmd79\") pod \"d5de9d69-621e-4336-bd1d-e29c27d29430\" (UID: \"d5de9d69-621e-4336-bd1d-e29c27d29430\") " Dec 03 14:49:10 crc kubenswrapper[4751]: I1203 14:49:10.696109 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5de9d69-621e-4336-bd1d-e29c27d29430-kube-api-access-hmd79" (OuterVolumeSpecName: "kube-api-access-hmd79") pod "d5de9d69-621e-4336-bd1d-e29c27d29430" (UID: "d5de9d69-621e-4336-bd1d-e29c27d29430"). InnerVolumeSpecName "kube-api-access-hmd79". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:49:10 crc kubenswrapper[4751]: I1203 14:49:10.704281 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5de9d69-621e-4336-bd1d-e29c27d29430-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d5de9d69-621e-4336-bd1d-e29c27d29430" (UID: "d5de9d69-621e-4336-bd1d-e29c27d29430"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:49:10 crc kubenswrapper[4751]: I1203 14:49:10.715380 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5de9d69-621e-4336-bd1d-e29c27d29430-inventory" (OuterVolumeSpecName: "inventory") pod "d5de9d69-621e-4336-bd1d-e29c27d29430" (UID: "d5de9d69-621e-4336-bd1d-e29c27d29430"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:49:10 crc kubenswrapper[4751]: I1203 14:49:10.774156 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmd79\" (UniqueName: \"kubernetes.io/projected/d5de9d69-621e-4336-bd1d-e29c27d29430-kube-api-access-hmd79\") on node \"crc\" DevicePath \"\"" Dec 03 14:49:10 crc kubenswrapper[4751]: I1203 14:49:10.774456 4751 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5de9d69-621e-4336-bd1d-e29c27d29430-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 14:49:10 crc kubenswrapper[4751]: I1203 14:49:10.774466 4751 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5de9d69-621e-4336-bd1d-e29c27d29430-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 14:49:10 crc kubenswrapper[4751]: I1203 14:49:10.880610 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6wwg6" 
event={"ID":"d5de9d69-621e-4336-bd1d-e29c27d29430","Type":"ContainerDied","Data":"e7a6eca9989de68303770cc1ca45e8a4a7248bc43019828a4031708b9c02ccf9"} Dec 03 14:49:10 crc kubenswrapper[4751]: I1203 14:49:10.880656 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7a6eca9989de68303770cc1ca45e8a4a7248bc43019828a4031708b9c02ccf9" Dec 03 14:49:10 crc kubenswrapper[4751]: I1203 14:49:10.880706 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6wwg6" Dec 03 14:49:10 crc kubenswrapper[4751]: I1203 14:49:10.883361 4751 generic.go:334] "Generic (PLEG): container finished" podID="50d35e6b-6a28-4822-857e-2f8c1ac2ffd8" containerID="161406fa21fd97cd4d9f1464f882aa96deebb944d9acaf4531cfcbb1a8bd931a" exitCode=0 Dec 03 14:49:10 crc kubenswrapper[4751]: I1203 14:49:10.883388 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-25qrp" event={"ID":"50d35e6b-6a28-4822-857e-2f8c1ac2ffd8","Type":"ContainerDied","Data":"161406fa21fd97cd4d9f1464f882aa96deebb944d9acaf4531cfcbb1a8bd931a"} Dec 03 14:49:10 crc kubenswrapper[4751]: I1203 14:49:10.883404 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-25qrp" event={"ID":"50d35e6b-6a28-4822-857e-2f8c1ac2ffd8","Type":"ContainerStarted","Data":"01ca56d8bedc2d69d4bffd829c275ca51757922f260311e81241dc6089d6ec5b"} Dec 03 14:49:10 crc kubenswrapper[4751]: I1203 14:49:10.975012 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g2m8z"] Dec 03 14:49:10 crc kubenswrapper[4751]: E1203 14:49:10.976034 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5de9d69-621e-4336-bd1d-e29c27d29430" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 03 14:49:10 crc kubenswrapper[4751]: I1203 14:49:10.976053 4751 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d5de9d69-621e-4336-bd1d-e29c27d29430" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 03 14:49:10 crc kubenswrapper[4751]: I1203 14:49:10.976246 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5de9d69-621e-4336-bd1d-e29c27d29430" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 03 14:49:10 crc kubenswrapper[4751]: I1203 14:49:10.978355 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g2m8z" Dec 03 14:49:10 crc kubenswrapper[4751]: I1203 14:49:10.981943 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 14:49:10 crc kubenswrapper[4751]: I1203 14:49:10.982717 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 14:49:10 crc kubenswrapper[4751]: I1203 14:49:10.983588 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xf9cf" Dec 03 14:49:10 crc kubenswrapper[4751]: I1203 14:49:10.989056 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 14:49:10 crc kubenswrapper[4751]: I1203 14:49:10.993537 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g2m8z"] Dec 03 14:49:11 crc kubenswrapper[4751]: I1203 14:49:11.080625 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae9b241a-73d5-4f1c-b14d-f7b44cc008f1-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-g2m8z\" (UID: \"ae9b241a-73d5-4f1c-b14d-f7b44cc008f1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g2m8z" Dec 03 14:49:11 
crc kubenswrapper[4751]: I1203 14:49:11.080707 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb2n4\" (UniqueName: \"kubernetes.io/projected/ae9b241a-73d5-4f1c-b14d-f7b44cc008f1-kube-api-access-hb2n4\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-g2m8z\" (UID: \"ae9b241a-73d5-4f1c-b14d-f7b44cc008f1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g2m8z" Dec 03 14:49:11 crc kubenswrapper[4751]: I1203 14:49:11.080817 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae9b241a-73d5-4f1c-b14d-f7b44cc008f1-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-g2m8z\" (UID: \"ae9b241a-73d5-4f1c-b14d-f7b44cc008f1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g2m8z" Dec 03 14:49:11 crc kubenswrapper[4751]: I1203 14:49:11.183243 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae9b241a-73d5-4f1c-b14d-f7b44cc008f1-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-g2m8z\" (UID: \"ae9b241a-73d5-4f1c-b14d-f7b44cc008f1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g2m8z" Dec 03 14:49:11 crc kubenswrapper[4751]: I1203 14:49:11.183303 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb2n4\" (UniqueName: \"kubernetes.io/projected/ae9b241a-73d5-4f1c-b14d-f7b44cc008f1-kube-api-access-hb2n4\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-g2m8z\" (UID: \"ae9b241a-73d5-4f1c-b14d-f7b44cc008f1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g2m8z" Dec 03 14:49:11 crc kubenswrapper[4751]: I1203 14:49:11.183405 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/ae9b241a-73d5-4f1c-b14d-f7b44cc008f1-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-g2m8z\" (UID: \"ae9b241a-73d5-4f1c-b14d-f7b44cc008f1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g2m8z" Dec 03 14:49:11 crc kubenswrapper[4751]: I1203 14:49:11.187496 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae9b241a-73d5-4f1c-b14d-f7b44cc008f1-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-g2m8z\" (UID: \"ae9b241a-73d5-4f1c-b14d-f7b44cc008f1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g2m8z" Dec 03 14:49:11 crc kubenswrapper[4751]: I1203 14:49:11.190966 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae9b241a-73d5-4f1c-b14d-f7b44cc008f1-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-g2m8z\" (UID: \"ae9b241a-73d5-4f1c-b14d-f7b44cc008f1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g2m8z" Dec 03 14:49:11 crc kubenswrapper[4751]: I1203 14:49:11.202214 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb2n4\" (UniqueName: \"kubernetes.io/projected/ae9b241a-73d5-4f1c-b14d-f7b44cc008f1-kube-api-access-hb2n4\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-g2m8z\" (UID: \"ae9b241a-73d5-4f1c-b14d-f7b44cc008f1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g2m8z" Dec 03 14:49:11 crc kubenswrapper[4751]: I1203 14:49:11.302297 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g2m8z" Dec 03 14:49:11 crc kubenswrapper[4751]: I1203 14:49:11.877016 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g2m8z"] Dec 03 14:49:11 crc kubenswrapper[4751]: W1203 14:49:11.882196 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae9b241a_73d5_4f1c_b14d_f7b44cc008f1.slice/crio-c8f14fecc89d014e3784d56053b00b1e6e1298f450558c980a095dae9441d2db WatchSource:0}: Error finding container c8f14fecc89d014e3784d56053b00b1e6e1298f450558c980a095dae9441d2db: Status 404 returned error can't find the container with id c8f14fecc89d014e3784d56053b00b1e6e1298f450558c980a095dae9441d2db Dec 03 14:49:11 crc kubenswrapper[4751]: I1203 14:49:11.892886 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g2m8z" event={"ID":"ae9b241a-73d5-4f1c-b14d-f7b44cc008f1","Type":"ContainerStarted","Data":"c8f14fecc89d014e3784d56053b00b1e6e1298f450558c980a095dae9441d2db"} Dec 03 14:49:11 crc kubenswrapper[4751]: I1203 14:49:11.895388 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-25qrp" event={"ID":"50d35e6b-6a28-4822-857e-2f8c1ac2ffd8","Type":"ContainerStarted","Data":"4538aa9c83bc961a1057f5a88189b1b6f2a4f5c70ce9afbbfcf0e63d1957116f"} Dec 03 14:49:12 crc kubenswrapper[4751]: I1203 14:49:12.906927 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g2m8z" event={"ID":"ae9b241a-73d5-4f1c-b14d-f7b44cc008f1","Type":"ContainerStarted","Data":"78f164e3070095e57a7aa132eae4db67876653aa0def3fac92ac9920dd9854c9"} Dec 03 14:49:13 crc kubenswrapper[4751]: I1203 14:49:13.938624 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g2m8z" podStartSLOduration=3.375949571 podStartE2EDuration="3.938602535s" podCreationTimestamp="2025-12-03 14:49:10 +0000 UTC" firstStartedPulling="2025-12-03 14:49:11.886115954 +0000 UTC m=+2158.874471161" lastFinishedPulling="2025-12-03 14:49:12.448768898 +0000 UTC m=+2159.437124125" observedRunningTime="2025-12-03 14:49:13.932085982 +0000 UTC m=+2160.920441209" watchObservedRunningTime="2025-12-03 14:49:13.938602535 +0000 UTC m=+2160.926957752" Dec 03 14:49:15 crc kubenswrapper[4751]: I1203 14:49:15.938140 4751 generic.go:334] "Generic (PLEG): container finished" podID="50d35e6b-6a28-4822-857e-2f8c1ac2ffd8" containerID="4538aa9c83bc961a1057f5a88189b1b6f2a4f5c70ce9afbbfcf0e63d1957116f" exitCode=0 Dec 03 14:49:15 crc kubenswrapper[4751]: I1203 14:49:15.938278 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-25qrp" event={"ID":"50d35e6b-6a28-4822-857e-2f8c1ac2ffd8","Type":"ContainerDied","Data":"4538aa9c83bc961a1057f5a88189b1b6f2a4f5c70ce9afbbfcf0e63d1957116f"} Dec 03 14:49:16 crc kubenswrapper[4751]: I1203 14:49:16.953502 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-25qrp" event={"ID":"50d35e6b-6a28-4822-857e-2f8c1ac2ffd8","Type":"ContainerStarted","Data":"8185e8297f7ba9271f44d1de57c10d8f7628e4d118a2b4e7c0d9bb57a14cbca0"} Dec 03 14:49:16 crc kubenswrapper[4751]: I1203 14:49:16.971660 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-25qrp" podStartSLOduration=2.511216728 podStartE2EDuration="7.971636529s" podCreationTimestamp="2025-12-03 14:49:09 +0000 UTC" firstStartedPulling="2025-12-03 14:49:10.88517571 +0000 UTC m=+2157.873530928" lastFinishedPulling="2025-12-03 14:49:16.345595502 +0000 UTC m=+2163.333950729" observedRunningTime="2025-12-03 14:49:16.970723704 +0000 UTC m=+2163.959078931" watchObservedRunningTime="2025-12-03 
14:49:16.971636529 +0000 UTC m=+2163.959991776" Dec 03 14:49:17 crc kubenswrapper[4751]: I1203 14:49:17.972830 4751 generic.go:334] "Generic (PLEG): container finished" podID="ae9b241a-73d5-4f1c-b14d-f7b44cc008f1" containerID="78f164e3070095e57a7aa132eae4db67876653aa0def3fac92ac9920dd9854c9" exitCode=0 Dec 03 14:49:17 crc kubenswrapper[4751]: I1203 14:49:17.973067 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g2m8z" event={"ID":"ae9b241a-73d5-4f1c-b14d-f7b44cc008f1","Type":"ContainerDied","Data":"78f164e3070095e57a7aa132eae4db67876653aa0def3fac92ac9920dd9854c9"} Dec 03 14:49:18 crc kubenswrapper[4751]: E1203 14:49:18.000677 4751 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae9b241a_73d5_4f1c_b14d_f7b44cc008f1.slice/crio-78f164e3070095e57a7aa132eae4db67876653aa0def3fac92ac9920dd9854c9.scope\": RecentStats: unable to find data in memory cache]" Dec 03 14:49:19 crc kubenswrapper[4751]: I1203 14:49:19.489798 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-25qrp" Dec 03 14:49:19 crc kubenswrapper[4751]: I1203 14:49:19.490305 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-25qrp" Dec 03 14:49:19 crc kubenswrapper[4751]: I1203 14:49:19.492991 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g2m8z" Dec 03 14:49:19 crc kubenswrapper[4751]: I1203 14:49:19.589032 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hb2n4\" (UniqueName: \"kubernetes.io/projected/ae9b241a-73d5-4f1c-b14d-f7b44cc008f1-kube-api-access-hb2n4\") pod \"ae9b241a-73d5-4f1c-b14d-f7b44cc008f1\" (UID: \"ae9b241a-73d5-4f1c-b14d-f7b44cc008f1\") " Dec 03 14:49:19 crc kubenswrapper[4751]: I1203 14:49:19.589174 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae9b241a-73d5-4f1c-b14d-f7b44cc008f1-ssh-key\") pod \"ae9b241a-73d5-4f1c-b14d-f7b44cc008f1\" (UID: \"ae9b241a-73d5-4f1c-b14d-f7b44cc008f1\") " Dec 03 14:49:19 crc kubenswrapper[4751]: I1203 14:49:19.589280 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae9b241a-73d5-4f1c-b14d-f7b44cc008f1-inventory\") pod \"ae9b241a-73d5-4f1c-b14d-f7b44cc008f1\" (UID: \"ae9b241a-73d5-4f1c-b14d-f7b44cc008f1\") " Dec 03 14:49:19 crc kubenswrapper[4751]: I1203 14:49:19.599478 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae9b241a-73d5-4f1c-b14d-f7b44cc008f1-kube-api-access-hb2n4" (OuterVolumeSpecName: "kube-api-access-hb2n4") pod "ae9b241a-73d5-4f1c-b14d-f7b44cc008f1" (UID: "ae9b241a-73d5-4f1c-b14d-f7b44cc008f1"). InnerVolumeSpecName "kube-api-access-hb2n4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:49:19 crc kubenswrapper[4751]: I1203 14:49:19.624527 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae9b241a-73d5-4f1c-b14d-f7b44cc008f1-inventory" (OuterVolumeSpecName: "inventory") pod "ae9b241a-73d5-4f1c-b14d-f7b44cc008f1" (UID: "ae9b241a-73d5-4f1c-b14d-f7b44cc008f1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:49:19 crc kubenswrapper[4751]: I1203 14:49:19.627381 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae9b241a-73d5-4f1c-b14d-f7b44cc008f1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ae9b241a-73d5-4f1c-b14d-f7b44cc008f1" (UID: "ae9b241a-73d5-4f1c-b14d-f7b44cc008f1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:49:19 crc kubenswrapper[4751]: I1203 14:49:19.692263 4751 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae9b241a-73d5-4f1c-b14d-f7b44cc008f1-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 14:49:19 crc kubenswrapper[4751]: I1203 14:49:19.692298 4751 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae9b241a-73d5-4f1c-b14d-f7b44cc008f1-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 14:49:19 crc kubenswrapper[4751]: I1203 14:49:19.692312 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hb2n4\" (UniqueName: \"kubernetes.io/projected/ae9b241a-73d5-4f1c-b14d-f7b44cc008f1-kube-api-access-hb2n4\") on node \"crc\" DevicePath \"\"" Dec 03 14:49:19 crc kubenswrapper[4751]: I1203 14:49:19.990965 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g2m8z" event={"ID":"ae9b241a-73d5-4f1c-b14d-f7b44cc008f1","Type":"ContainerDied","Data":"c8f14fecc89d014e3784d56053b00b1e6e1298f450558c980a095dae9441d2db"} Dec 03 14:49:19 crc kubenswrapper[4751]: I1203 14:49:19.991004 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8f14fecc89d014e3784d56053b00b1e6e1298f450558c980a095dae9441d2db" Dec 03 14:49:19 crc kubenswrapper[4751]: I1203 14:49:19.991008 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g2m8z" Dec 03 14:49:20 crc kubenswrapper[4751]: I1203 14:49:20.095884 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-blkvd"] Dec 03 14:49:20 crc kubenswrapper[4751]: E1203 14:49:20.096436 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae9b241a-73d5-4f1c-b14d-f7b44cc008f1" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 03 14:49:20 crc kubenswrapper[4751]: I1203 14:49:20.096461 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae9b241a-73d5-4f1c-b14d-f7b44cc008f1" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 03 14:49:20 crc kubenswrapper[4751]: I1203 14:49:20.096755 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae9b241a-73d5-4f1c-b14d-f7b44cc008f1" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 03 14:49:20 crc kubenswrapper[4751]: I1203 14:49:20.097719 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blkvd" Dec 03 14:49:20 crc kubenswrapper[4751]: I1203 14:49:20.099652 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 14:49:20 crc kubenswrapper[4751]: I1203 14:49:20.100050 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 14:49:20 crc kubenswrapper[4751]: I1203 14:49:20.100117 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xf9cf" Dec 03 14:49:20 crc kubenswrapper[4751]: I1203 14:49:20.100273 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 14:49:20 crc kubenswrapper[4751]: I1203 14:49:20.113633 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-blkvd"] Dec 03 14:49:20 crc kubenswrapper[4751]: I1203 14:49:20.201340 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8eeba461-aadb-44d9-ac60-9413a2c70e6d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-blkvd\" (UID: \"8eeba461-aadb-44d9-ac60-9413a2c70e6d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blkvd" Dec 03 14:49:20 crc kubenswrapper[4751]: I1203 14:49:20.201411 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8eeba461-aadb-44d9-ac60-9413a2c70e6d-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-blkvd\" (UID: \"8eeba461-aadb-44d9-ac60-9413a2c70e6d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blkvd" Dec 03 14:49:20 crc kubenswrapper[4751]: I1203 14:49:20.201506 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpks4\" (UniqueName: \"kubernetes.io/projected/8eeba461-aadb-44d9-ac60-9413a2c70e6d-kube-api-access-rpks4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-blkvd\" (UID: \"8eeba461-aadb-44d9-ac60-9413a2c70e6d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blkvd" Dec 03 14:49:20 crc kubenswrapper[4751]: I1203 14:49:20.303682 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8eeba461-aadb-44d9-ac60-9413a2c70e6d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-blkvd\" (UID: \"8eeba461-aadb-44d9-ac60-9413a2c70e6d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blkvd" Dec 03 14:49:20 crc kubenswrapper[4751]: I1203 14:49:20.303765 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8eeba461-aadb-44d9-ac60-9413a2c70e6d-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-blkvd\" (UID: \"8eeba461-aadb-44d9-ac60-9413a2c70e6d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blkvd" Dec 03 14:49:20 crc kubenswrapper[4751]: I1203 14:49:20.303825 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpks4\" (UniqueName: \"kubernetes.io/projected/8eeba461-aadb-44d9-ac60-9413a2c70e6d-kube-api-access-rpks4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-blkvd\" (UID: \"8eeba461-aadb-44d9-ac60-9413a2c70e6d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blkvd" Dec 03 14:49:20 crc kubenswrapper[4751]: I1203 14:49:20.308221 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8eeba461-aadb-44d9-ac60-9413a2c70e6d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-blkvd\" (UID: 
\"8eeba461-aadb-44d9-ac60-9413a2c70e6d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blkvd" Dec 03 14:49:20 crc kubenswrapper[4751]: I1203 14:49:20.325026 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8eeba461-aadb-44d9-ac60-9413a2c70e6d-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-blkvd\" (UID: \"8eeba461-aadb-44d9-ac60-9413a2c70e6d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blkvd" Dec 03 14:49:20 crc kubenswrapper[4751]: I1203 14:49:20.339073 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpks4\" (UniqueName: \"kubernetes.io/projected/8eeba461-aadb-44d9-ac60-9413a2c70e6d-kube-api-access-rpks4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-blkvd\" (UID: \"8eeba461-aadb-44d9-ac60-9413a2c70e6d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blkvd" Dec 03 14:49:20 crc kubenswrapper[4751]: I1203 14:49:20.418883 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blkvd" Dec 03 14:49:20 crc kubenswrapper[4751]: I1203 14:49:20.594940 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-25qrp" podUID="50d35e6b-6a28-4822-857e-2f8c1ac2ffd8" containerName="registry-server" probeResult="failure" output=< Dec 03 14:49:20 crc kubenswrapper[4751]: timeout: failed to connect service ":50051" within 1s Dec 03 14:49:20 crc kubenswrapper[4751]: > Dec 03 14:49:21 crc kubenswrapper[4751]: I1203 14:49:21.039690 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-blkvd"] Dec 03 14:49:21 crc kubenswrapper[4751]: W1203 14:49:21.043874 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8eeba461_aadb_44d9_ac60_9413a2c70e6d.slice/crio-45be5c138eee7940401dbba9ede358fab68f11c547504eb3c156fcc2cc2caddd WatchSource:0}: Error finding container 45be5c138eee7940401dbba9ede358fab68f11c547504eb3c156fcc2cc2caddd: Status 404 returned error can't find the container with id 45be5c138eee7940401dbba9ede358fab68f11c547504eb3c156fcc2cc2caddd Dec 03 14:49:22 crc kubenswrapper[4751]: I1203 14:49:22.008914 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blkvd" event={"ID":"8eeba461-aadb-44d9-ac60-9413a2c70e6d","Type":"ContainerStarted","Data":"45be5c138eee7940401dbba9ede358fab68f11c547504eb3c156fcc2cc2caddd"} Dec 03 14:49:23 crc kubenswrapper[4751]: I1203 14:49:23.019472 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blkvd" event={"ID":"8eeba461-aadb-44d9-ac60-9413a2c70e6d","Type":"ContainerStarted","Data":"4fb0b85edda35cb6efb1d7ed436fffba19b844ceee71432499acb4fef5d5bf49"} Dec 03 14:49:23 crc kubenswrapper[4751]: I1203 14:49:23.040037 4751 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blkvd" podStartSLOduration=2.395101167 podStartE2EDuration="3.040017266s" podCreationTimestamp="2025-12-03 14:49:20 +0000 UTC" firstStartedPulling="2025-12-03 14:49:21.046242333 +0000 UTC m=+2168.034597550" lastFinishedPulling="2025-12-03 14:49:21.691158432 +0000 UTC m=+2168.679513649" observedRunningTime="2025-12-03 14:49:23.036829581 +0000 UTC m=+2170.025184818" watchObservedRunningTime="2025-12-03 14:49:23.040017266 +0000 UTC m=+2170.028372483" Dec 03 14:49:28 crc kubenswrapper[4751]: I1203 14:49:28.312726 4751 scope.go:117] "RemoveContainer" containerID="450cb5444d1d441152daf714ab2046fad875605c51ef348e83b3599b8c6f3dc4" Dec 03 14:49:29 crc kubenswrapper[4751]: I1203 14:49:29.541905 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-25qrp" Dec 03 14:49:29 crc kubenswrapper[4751]: I1203 14:49:29.592660 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-25qrp" Dec 03 14:49:29 crc kubenswrapper[4751]: I1203 14:49:29.785425 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-25qrp"] Dec 03 14:49:31 crc kubenswrapper[4751]: I1203 14:49:31.102372 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-25qrp" podUID="50d35e6b-6a28-4822-857e-2f8c1ac2ffd8" containerName="registry-server" containerID="cri-o://8185e8297f7ba9271f44d1de57c10d8f7628e4d118a2b4e7c0d9bb57a14cbca0" gracePeriod=2 Dec 03 14:49:31 crc kubenswrapper[4751]: I1203 14:49:31.696892 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-25qrp" Dec 03 14:49:31 crc kubenswrapper[4751]: I1203 14:49:31.745965 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggbqn\" (UniqueName: \"kubernetes.io/projected/50d35e6b-6a28-4822-857e-2f8c1ac2ffd8-kube-api-access-ggbqn\") pod \"50d35e6b-6a28-4822-857e-2f8c1ac2ffd8\" (UID: \"50d35e6b-6a28-4822-857e-2f8c1ac2ffd8\") " Dec 03 14:49:31 crc kubenswrapper[4751]: I1203 14:49:31.746040 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50d35e6b-6a28-4822-857e-2f8c1ac2ffd8-utilities\") pod \"50d35e6b-6a28-4822-857e-2f8c1ac2ffd8\" (UID: \"50d35e6b-6a28-4822-857e-2f8c1ac2ffd8\") " Dec 03 14:49:31 crc kubenswrapper[4751]: I1203 14:49:31.746146 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50d35e6b-6a28-4822-857e-2f8c1ac2ffd8-catalog-content\") pod \"50d35e6b-6a28-4822-857e-2f8c1ac2ffd8\" (UID: \"50d35e6b-6a28-4822-857e-2f8c1ac2ffd8\") " Dec 03 14:49:31 crc kubenswrapper[4751]: I1203 14:49:31.747189 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50d35e6b-6a28-4822-857e-2f8c1ac2ffd8-utilities" (OuterVolumeSpecName: "utilities") pod "50d35e6b-6a28-4822-857e-2f8c1ac2ffd8" (UID: "50d35e6b-6a28-4822-857e-2f8c1ac2ffd8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:49:31 crc kubenswrapper[4751]: I1203 14:49:31.751367 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50d35e6b-6a28-4822-857e-2f8c1ac2ffd8-kube-api-access-ggbqn" (OuterVolumeSpecName: "kube-api-access-ggbqn") pod "50d35e6b-6a28-4822-857e-2f8c1ac2ffd8" (UID: "50d35e6b-6a28-4822-857e-2f8c1ac2ffd8"). InnerVolumeSpecName "kube-api-access-ggbqn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:49:31 crc kubenswrapper[4751]: I1203 14:49:31.850602 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggbqn\" (UniqueName: \"kubernetes.io/projected/50d35e6b-6a28-4822-857e-2f8c1ac2ffd8-kube-api-access-ggbqn\") on node \"crc\" DevicePath \"\"" Dec 03 14:49:31 crc kubenswrapper[4751]: I1203 14:49:31.850674 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50d35e6b-6a28-4822-857e-2f8c1ac2ffd8-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:49:31 crc kubenswrapper[4751]: I1203 14:49:31.868567 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50d35e6b-6a28-4822-857e-2f8c1ac2ffd8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "50d35e6b-6a28-4822-857e-2f8c1ac2ffd8" (UID: "50d35e6b-6a28-4822-857e-2f8c1ac2ffd8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:49:31 crc kubenswrapper[4751]: I1203 14:49:31.952239 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50d35e6b-6a28-4822-857e-2f8c1ac2ffd8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:49:32 crc kubenswrapper[4751]: I1203 14:49:32.119992 4751 generic.go:334] "Generic (PLEG): container finished" podID="50d35e6b-6a28-4822-857e-2f8c1ac2ffd8" containerID="8185e8297f7ba9271f44d1de57c10d8f7628e4d118a2b4e7c0d9bb57a14cbca0" exitCode=0 Dec 03 14:49:32 crc kubenswrapper[4751]: I1203 14:49:32.120106 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-25qrp" event={"ID":"50d35e6b-6a28-4822-857e-2f8c1ac2ffd8","Type":"ContainerDied","Data":"8185e8297f7ba9271f44d1de57c10d8f7628e4d118a2b4e7c0d9bb57a14cbca0"} Dec 03 14:49:32 crc kubenswrapper[4751]: I1203 14:49:32.120636 4751 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-25qrp" event={"ID":"50d35e6b-6a28-4822-857e-2f8c1ac2ffd8","Type":"ContainerDied","Data":"01ca56d8bedc2d69d4bffd829c275ca51757922f260311e81241dc6089d6ec5b"} Dec 03 14:49:32 crc kubenswrapper[4751]: I1203 14:49:32.120670 4751 scope.go:117] "RemoveContainer" containerID="8185e8297f7ba9271f44d1de57c10d8f7628e4d118a2b4e7c0d9bb57a14cbca0" Dec 03 14:49:32 crc kubenswrapper[4751]: I1203 14:49:32.120163 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-25qrp" Dec 03 14:49:32 crc kubenswrapper[4751]: I1203 14:49:32.155689 4751 scope.go:117] "RemoveContainer" containerID="4538aa9c83bc961a1057f5a88189b1b6f2a4f5c70ce9afbbfcf0e63d1957116f" Dec 03 14:49:32 crc kubenswrapper[4751]: I1203 14:49:32.174649 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-25qrp"] Dec 03 14:49:32 crc kubenswrapper[4751]: I1203 14:49:32.197597 4751 scope.go:117] "RemoveContainer" containerID="161406fa21fd97cd4d9f1464f882aa96deebb944d9acaf4531cfcbb1a8bd931a" Dec 03 14:49:32 crc kubenswrapper[4751]: I1203 14:49:32.201479 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-25qrp"] Dec 03 14:49:32 crc kubenswrapper[4751]: I1203 14:49:32.249429 4751 scope.go:117] "RemoveContainer" containerID="8185e8297f7ba9271f44d1de57c10d8f7628e4d118a2b4e7c0d9bb57a14cbca0" Dec 03 14:49:32 crc kubenswrapper[4751]: E1203 14:49:32.249868 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8185e8297f7ba9271f44d1de57c10d8f7628e4d118a2b4e7c0d9bb57a14cbca0\": container with ID starting with 8185e8297f7ba9271f44d1de57c10d8f7628e4d118a2b4e7c0d9bb57a14cbca0 not found: ID does not exist" containerID="8185e8297f7ba9271f44d1de57c10d8f7628e4d118a2b4e7c0d9bb57a14cbca0" Dec 03 14:49:32 crc kubenswrapper[4751]: I1203 14:49:32.249909 4751 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8185e8297f7ba9271f44d1de57c10d8f7628e4d118a2b4e7c0d9bb57a14cbca0"} err="failed to get container status \"8185e8297f7ba9271f44d1de57c10d8f7628e4d118a2b4e7c0d9bb57a14cbca0\": rpc error: code = NotFound desc = could not find container \"8185e8297f7ba9271f44d1de57c10d8f7628e4d118a2b4e7c0d9bb57a14cbca0\": container with ID starting with 8185e8297f7ba9271f44d1de57c10d8f7628e4d118a2b4e7c0d9bb57a14cbca0 not found: ID does not exist" Dec 03 14:49:32 crc kubenswrapper[4751]: I1203 14:49:32.249936 4751 scope.go:117] "RemoveContainer" containerID="4538aa9c83bc961a1057f5a88189b1b6f2a4f5c70ce9afbbfcf0e63d1957116f" Dec 03 14:49:32 crc kubenswrapper[4751]: E1203 14:49:32.250185 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4538aa9c83bc961a1057f5a88189b1b6f2a4f5c70ce9afbbfcf0e63d1957116f\": container with ID starting with 4538aa9c83bc961a1057f5a88189b1b6f2a4f5c70ce9afbbfcf0e63d1957116f not found: ID does not exist" containerID="4538aa9c83bc961a1057f5a88189b1b6f2a4f5c70ce9afbbfcf0e63d1957116f" Dec 03 14:49:32 crc kubenswrapper[4751]: I1203 14:49:32.250224 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4538aa9c83bc961a1057f5a88189b1b6f2a4f5c70ce9afbbfcf0e63d1957116f"} err="failed to get container status \"4538aa9c83bc961a1057f5a88189b1b6f2a4f5c70ce9afbbfcf0e63d1957116f\": rpc error: code = NotFound desc = could not find container \"4538aa9c83bc961a1057f5a88189b1b6f2a4f5c70ce9afbbfcf0e63d1957116f\": container with ID starting with 4538aa9c83bc961a1057f5a88189b1b6f2a4f5c70ce9afbbfcf0e63d1957116f not found: ID does not exist" Dec 03 14:49:32 crc kubenswrapper[4751]: I1203 14:49:32.250254 4751 scope.go:117] "RemoveContainer" containerID="161406fa21fd97cd4d9f1464f882aa96deebb944d9acaf4531cfcbb1a8bd931a" Dec 03 14:49:32 crc kubenswrapper[4751]: E1203 
14:49:32.250593 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"161406fa21fd97cd4d9f1464f882aa96deebb944d9acaf4531cfcbb1a8bd931a\": container with ID starting with 161406fa21fd97cd4d9f1464f882aa96deebb944d9acaf4531cfcbb1a8bd931a not found: ID does not exist" containerID="161406fa21fd97cd4d9f1464f882aa96deebb944d9acaf4531cfcbb1a8bd931a" Dec 03 14:49:32 crc kubenswrapper[4751]: I1203 14:49:32.250645 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"161406fa21fd97cd4d9f1464f882aa96deebb944d9acaf4531cfcbb1a8bd931a"} err="failed to get container status \"161406fa21fd97cd4d9f1464f882aa96deebb944d9acaf4531cfcbb1a8bd931a\": rpc error: code = NotFound desc = could not find container \"161406fa21fd97cd4d9f1464f882aa96deebb944d9acaf4531cfcbb1a8bd931a\": container with ID starting with 161406fa21fd97cd4d9f1464f882aa96deebb944d9acaf4531cfcbb1a8bd931a not found: ID does not exist" Dec 03 14:49:33 crc kubenswrapper[4751]: I1203 14:49:33.328820 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50d35e6b-6a28-4822-857e-2f8c1ac2ffd8" path="/var/lib/kubelet/pods/50d35e6b-6a28-4822-857e-2f8c1ac2ffd8/volumes" Dec 03 14:49:35 crc kubenswrapper[4751]: I1203 14:49:35.819796 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:49:35 crc kubenswrapper[4751]: I1203 14:49:35.819863 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 03 14:49:59 crc kubenswrapper[4751]: I1203 14:49:59.425770 4751 generic.go:334] "Generic (PLEG): container finished" podID="8eeba461-aadb-44d9-ac60-9413a2c70e6d" containerID="4fb0b85edda35cb6efb1d7ed436fffba19b844ceee71432499acb4fef5d5bf49" exitCode=0 Dec 03 14:49:59 crc kubenswrapper[4751]: I1203 14:49:59.425827 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blkvd" event={"ID":"8eeba461-aadb-44d9-ac60-9413a2c70e6d","Type":"ContainerDied","Data":"4fb0b85edda35cb6efb1d7ed436fffba19b844ceee71432499acb4fef5d5bf49"} Dec 03 14:50:00 crc kubenswrapper[4751]: I1203 14:50:00.886965 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blkvd" Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.022116 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8eeba461-aadb-44d9-ac60-9413a2c70e6d-inventory\") pod \"8eeba461-aadb-44d9-ac60-9413a2c70e6d\" (UID: \"8eeba461-aadb-44d9-ac60-9413a2c70e6d\") " Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.022253 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpks4\" (UniqueName: \"kubernetes.io/projected/8eeba461-aadb-44d9-ac60-9413a2c70e6d-kube-api-access-rpks4\") pod \"8eeba461-aadb-44d9-ac60-9413a2c70e6d\" (UID: \"8eeba461-aadb-44d9-ac60-9413a2c70e6d\") " Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.022424 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8eeba461-aadb-44d9-ac60-9413a2c70e6d-ssh-key\") pod \"8eeba461-aadb-44d9-ac60-9413a2c70e6d\" (UID: \"8eeba461-aadb-44d9-ac60-9413a2c70e6d\") " Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.028727 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/8eeba461-aadb-44d9-ac60-9413a2c70e6d-kube-api-access-rpks4" (OuterVolumeSpecName: "kube-api-access-rpks4") pod "8eeba461-aadb-44d9-ac60-9413a2c70e6d" (UID: "8eeba461-aadb-44d9-ac60-9413a2c70e6d"). InnerVolumeSpecName "kube-api-access-rpks4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.060684 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eeba461-aadb-44d9-ac60-9413a2c70e6d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8eeba461-aadb-44d9-ac60-9413a2c70e6d" (UID: "8eeba461-aadb-44d9-ac60-9413a2c70e6d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.075684 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eeba461-aadb-44d9-ac60-9413a2c70e6d-inventory" (OuterVolumeSpecName: "inventory") pod "8eeba461-aadb-44d9-ac60-9413a2c70e6d" (UID: "8eeba461-aadb-44d9-ac60-9413a2c70e6d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.126215 4751 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8eeba461-aadb-44d9-ac60-9413a2c70e6d-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.126271 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpks4\" (UniqueName: \"kubernetes.io/projected/8eeba461-aadb-44d9-ac60-9413a2c70e6d-kube-api-access-rpks4\") on node \"crc\" DevicePath \"\"" Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.126293 4751 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8eeba461-aadb-44d9-ac60-9413a2c70e6d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.451704 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blkvd" event={"ID":"8eeba461-aadb-44d9-ac60-9413a2c70e6d","Type":"ContainerDied","Data":"45be5c138eee7940401dbba9ede358fab68f11c547504eb3c156fcc2cc2caddd"} Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.451753 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blkvd" Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.451752 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45be5c138eee7940401dbba9ede358fab68f11c547504eb3c156fcc2cc2caddd" Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.538847 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-j2qfs"] Dec 03 14:50:01 crc kubenswrapper[4751]: E1203 14:50:01.539298 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50d35e6b-6a28-4822-857e-2f8c1ac2ffd8" containerName="extract-utilities" Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.539316 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="50d35e6b-6a28-4822-857e-2f8c1ac2ffd8" containerName="extract-utilities" Dec 03 14:50:01 crc kubenswrapper[4751]: E1203 14:50:01.539365 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eeba461-aadb-44d9-ac60-9413a2c70e6d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.539376 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eeba461-aadb-44d9-ac60-9413a2c70e6d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 03 14:50:01 crc kubenswrapper[4751]: E1203 14:50:01.539391 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50d35e6b-6a28-4822-857e-2f8c1ac2ffd8" containerName="extract-content" Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.539400 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="50d35e6b-6a28-4822-857e-2f8c1ac2ffd8" containerName="extract-content" Dec 03 14:50:01 crc kubenswrapper[4751]: E1203 14:50:01.539424 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50d35e6b-6a28-4822-857e-2f8c1ac2ffd8" containerName="registry-server" Dec 03 14:50:01 crc 
kubenswrapper[4751]: I1203 14:50:01.539433 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="50d35e6b-6a28-4822-857e-2f8c1ac2ffd8" containerName="registry-server" Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.539661 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="50d35e6b-6a28-4822-857e-2f8c1ac2ffd8" containerName="registry-server" Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.539687 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eeba461-aadb-44d9-ac60-9413a2c70e6d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.540653 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-j2qfs" Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.543341 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.543535 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xf9cf" Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.543747 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.543994 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.549460 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-j2qfs"] Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.636767 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmmx5\" (UniqueName: 
\"kubernetes.io/projected/233a8db3-fc65-4c75-81d4-552f44ee95c2-kube-api-access-zmmx5\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-j2qfs\" (UID: \"233a8db3-fc65-4c75-81d4-552f44ee95c2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-j2qfs" Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.636856 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/233a8db3-fc65-4c75-81d4-552f44ee95c2-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-j2qfs\" (UID: \"233a8db3-fc65-4c75-81d4-552f44ee95c2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-j2qfs" Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.636901 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/233a8db3-fc65-4c75-81d4-552f44ee95c2-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-j2qfs\" (UID: \"233a8db3-fc65-4c75-81d4-552f44ee95c2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-j2qfs" Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.739679 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmmx5\" (UniqueName: \"kubernetes.io/projected/233a8db3-fc65-4c75-81d4-552f44ee95c2-kube-api-access-zmmx5\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-j2qfs\" (UID: \"233a8db3-fc65-4c75-81d4-552f44ee95c2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-j2qfs" Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.739791 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/233a8db3-fc65-4c75-81d4-552f44ee95c2-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-j2qfs\" (UID: \"233a8db3-fc65-4c75-81d4-552f44ee95c2\") " 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-j2qfs" Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.739858 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/233a8db3-fc65-4c75-81d4-552f44ee95c2-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-j2qfs\" (UID: \"233a8db3-fc65-4c75-81d4-552f44ee95c2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-j2qfs" Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.744913 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/233a8db3-fc65-4c75-81d4-552f44ee95c2-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-j2qfs\" (UID: \"233a8db3-fc65-4c75-81d4-552f44ee95c2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-j2qfs" Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.746912 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/233a8db3-fc65-4c75-81d4-552f44ee95c2-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-j2qfs\" (UID: \"233a8db3-fc65-4c75-81d4-552f44ee95c2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-j2qfs" Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.768269 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmmx5\" (UniqueName: \"kubernetes.io/projected/233a8db3-fc65-4c75-81d4-552f44ee95c2-kube-api-access-zmmx5\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-j2qfs\" (UID: \"233a8db3-fc65-4c75-81d4-552f44ee95c2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-j2qfs" Dec 03 14:50:01 crc kubenswrapper[4751]: I1203 14:50:01.860717 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-j2qfs" Dec 03 14:50:02 crc kubenswrapper[4751]: I1203 14:50:02.250443 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-j2qfs"] Dec 03 14:50:02 crc kubenswrapper[4751]: W1203 14:50:02.257384 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod233a8db3_fc65_4c75_81d4_552f44ee95c2.slice/crio-c35b42c109e2cde54276be819399d74f6c3f7e5a5d57595241732e7a4def6696 WatchSource:0}: Error finding container c35b42c109e2cde54276be819399d74f6c3f7e5a5d57595241732e7a4def6696: Status 404 returned error can't find the container with id c35b42c109e2cde54276be819399d74f6c3f7e5a5d57595241732e7a4def6696 Dec 03 14:50:02 crc kubenswrapper[4751]: I1203 14:50:02.469945 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-j2qfs" event={"ID":"233a8db3-fc65-4c75-81d4-552f44ee95c2","Type":"ContainerStarted","Data":"c35b42c109e2cde54276be819399d74f6c3f7e5a5d57595241732e7a4def6696"} Dec 03 14:50:03 crc kubenswrapper[4751]: I1203 14:50:03.480750 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-j2qfs" event={"ID":"233a8db3-fc65-4c75-81d4-552f44ee95c2","Type":"ContainerStarted","Data":"27a92dcc89eeb61c681b6d1abf96105672652ca65fa5cbee77500580151885b7"} Dec 03 14:50:03 crc kubenswrapper[4751]: I1203 14:50:03.501303 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-j2qfs" podStartSLOduration=2.020844345 podStartE2EDuration="2.501279335s" podCreationTimestamp="2025-12-03 14:50:01 +0000 UTC" firstStartedPulling="2025-12-03 14:50:02.260132671 +0000 UTC m=+2209.248487908" lastFinishedPulling="2025-12-03 14:50:02.740567681 +0000 UTC m=+2209.728922898" 
observedRunningTime="2025-12-03 14:50:03.499012164 +0000 UTC m=+2210.487367391" watchObservedRunningTime="2025-12-03 14:50:03.501279335 +0000 UTC m=+2210.489634572" Dec 03 14:50:05 crc kubenswrapper[4751]: I1203 14:50:05.820196 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:50:05 crc kubenswrapper[4751]: I1203 14:50:05.820673 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:50:05 crc kubenswrapper[4751]: I1203 14:50:05.820737 4751 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-djf67" Dec 03 14:50:05 crc kubenswrapper[4751]: I1203 14:50:05.821688 4751 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9a6c556859f08944ddc255eca28ac397f02ede386d6011e2ad67e3baa1641a38"} pod="openshift-machine-config-operator/machine-config-daemon-djf67" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 14:50:05 crc kubenswrapper[4751]: I1203 14:50:05.821780 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" containerID="cri-o://9a6c556859f08944ddc255eca28ac397f02ede386d6011e2ad67e3baa1641a38" gracePeriod=600 Dec 03 14:50:06 crc kubenswrapper[4751]: I1203 
14:50:06.519857 4751 generic.go:334] "Generic (PLEG): container finished" podID="385620eb-744d-423e-b02b-1274f3075689" containerID="9a6c556859f08944ddc255eca28ac397f02ede386d6011e2ad67e3baa1641a38" exitCode=0 Dec 03 14:50:06 crc kubenswrapper[4751]: I1203 14:50:06.519917 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerDied","Data":"9a6c556859f08944ddc255eca28ac397f02ede386d6011e2ad67e3baa1641a38"} Dec 03 14:50:06 crc kubenswrapper[4751]: I1203 14:50:06.520641 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerStarted","Data":"af5d011ca6687515dae519cbee879a97f83824f387f5bdb764d620efd030477e"} Dec 03 14:50:06 crc kubenswrapper[4751]: I1203 14:50:06.520686 4751 scope.go:117] "RemoveContainer" containerID="7d4fc1d83b8c3695ce824cacf39e20506d412e0d667bfed74287b81a6f8f3e45" Dec 03 14:50:40 crc kubenswrapper[4751]: I1203 14:50:40.060118 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-m2bmn"] Dec 03 14:50:40 crc kubenswrapper[4751]: I1203 14:50:40.076705 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-m2bmn"] Dec 03 14:50:41 crc kubenswrapper[4751]: I1203 14:50:41.325649 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acdd8764-947f-44ce-a5bd-4f3c139d581c" path="/var/lib/kubelet/pods/acdd8764-947f-44ce-a5bd-4f3c139d581c/volumes" Dec 03 14:50:46 crc kubenswrapper[4751]: I1203 14:50:46.032319 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-wh7hc"] Dec 03 14:50:46 crc kubenswrapper[4751]: I1203 14:50:46.042025 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-wh7hc"] Dec 03 14:50:47 crc kubenswrapper[4751]: 
I1203 14:50:47.331734 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c5f4b94-eb6c-4ad0-b12f-237f3d87396a" path="/var/lib/kubelet/pods/4c5f4b94-eb6c-4ad0-b12f-237f3d87396a/volumes"
Dec 03 14:50:54 crc kubenswrapper[4751]: I1203 14:50:54.062431 4751 generic.go:334] "Generic (PLEG): container finished" podID="233a8db3-fc65-4c75-81d4-552f44ee95c2" containerID="27a92dcc89eeb61c681b6d1abf96105672652ca65fa5cbee77500580151885b7" exitCode=0
Dec 03 14:50:54 crc kubenswrapper[4751]: I1203 14:50:54.062522 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-j2qfs" event={"ID":"233a8db3-fc65-4c75-81d4-552f44ee95c2","Type":"ContainerDied","Data":"27a92dcc89eeb61c681b6d1abf96105672652ca65fa5cbee77500580151885b7"}
Dec 03 14:50:55 crc kubenswrapper[4751]: I1203 14:50:55.680190 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-j2qfs"
Dec 03 14:50:55 crc kubenswrapper[4751]: I1203 14:50:55.873502 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/233a8db3-fc65-4c75-81d4-552f44ee95c2-ssh-key\") pod \"233a8db3-fc65-4c75-81d4-552f44ee95c2\" (UID: \"233a8db3-fc65-4c75-81d4-552f44ee95c2\") "
Dec 03 14:50:55 crc kubenswrapper[4751]: I1203 14:50:55.873879 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmmx5\" (UniqueName: \"kubernetes.io/projected/233a8db3-fc65-4c75-81d4-552f44ee95c2-kube-api-access-zmmx5\") pod \"233a8db3-fc65-4c75-81d4-552f44ee95c2\" (UID: \"233a8db3-fc65-4c75-81d4-552f44ee95c2\") "
Dec 03 14:50:55 crc kubenswrapper[4751]: I1203 14:50:55.874066 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/233a8db3-fc65-4c75-81d4-552f44ee95c2-inventory\") pod \"233a8db3-fc65-4c75-81d4-552f44ee95c2\" (UID: \"233a8db3-fc65-4c75-81d4-552f44ee95c2\") "
Dec 03 14:50:55 crc kubenswrapper[4751]: I1203 14:50:55.881576 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/233a8db3-fc65-4c75-81d4-552f44ee95c2-kube-api-access-zmmx5" (OuterVolumeSpecName: "kube-api-access-zmmx5") pod "233a8db3-fc65-4c75-81d4-552f44ee95c2" (UID: "233a8db3-fc65-4c75-81d4-552f44ee95c2"). InnerVolumeSpecName "kube-api-access-zmmx5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 14:50:55 crc kubenswrapper[4751]: I1203 14:50:55.915625 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/233a8db3-fc65-4c75-81d4-552f44ee95c2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "233a8db3-fc65-4c75-81d4-552f44ee95c2" (UID: "233a8db3-fc65-4c75-81d4-552f44ee95c2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:50:55 crc kubenswrapper[4751]: I1203 14:50:55.920693 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/233a8db3-fc65-4c75-81d4-552f44ee95c2-inventory" (OuterVolumeSpecName: "inventory") pod "233a8db3-fc65-4c75-81d4-552f44ee95c2" (UID: "233a8db3-fc65-4c75-81d4-552f44ee95c2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:50:55 crc kubenswrapper[4751]: I1203 14:50:55.976267 4751 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/233a8db3-fc65-4c75-81d4-552f44ee95c2-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 03 14:50:55 crc kubenswrapper[4751]: I1203 14:50:55.976478 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmmx5\" (UniqueName: \"kubernetes.io/projected/233a8db3-fc65-4c75-81d4-552f44ee95c2-kube-api-access-zmmx5\") on node \"crc\" DevicePath \"\""
Dec 03 14:50:55 crc kubenswrapper[4751]: I1203 14:50:55.976561 4751 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/233a8db3-fc65-4c75-81d4-552f44ee95c2-inventory\") on node \"crc\" DevicePath \"\""
Dec 03 14:50:56 crc kubenswrapper[4751]: I1203 14:50:56.082712 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-j2qfs" event={"ID":"233a8db3-fc65-4c75-81d4-552f44ee95c2","Type":"ContainerDied","Data":"c35b42c109e2cde54276be819399d74f6c3f7e5a5d57595241732e7a4def6696"}
Dec 03 14:50:56 crc kubenswrapper[4751]: I1203 14:50:56.082755 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c35b42c109e2cde54276be819399d74f6c3f7e5a5d57595241732e7a4def6696"
Dec 03 14:50:56 crc kubenswrapper[4751]: I1203 14:50:56.082808 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-j2qfs"
Dec 03 14:50:56 crc kubenswrapper[4751]: I1203 14:50:56.159755 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-6xdd9"]
Dec 03 14:50:56 crc kubenswrapper[4751]: E1203 14:50:56.160140 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="233a8db3-fc65-4c75-81d4-552f44ee95c2" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Dec 03 14:50:56 crc kubenswrapper[4751]: I1203 14:50:56.160153 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="233a8db3-fc65-4c75-81d4-552f44ee95c2" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Dec 03 14:50:56 crc kubenswrapper[4751]: I1203 14:50:56.160539 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="233a8db3-fc65-4c75-81d4-552f44ee95c2" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Dec 03 14:50:56 crc kubenswrapper[4751]: I1203 14:50:56.161273 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-6xdd9"
Dec 03 14:50:56 crc kubenswrapper[4751]: I1203 14:50:56.163275 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 03 14:50:56 crc kubenswrapper[4751]: I1203 14:50:56.163499 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xf9cf"
Dec 03 14:50:56 crc kubenswrapper[4751]: I1203 14:50:56.164006 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 03 14:50:56 crc kubenswrapper[4751]: I1203 14:50:56.164351 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 03 14:50:56 crc kubenswrapper[4751]: I1203 14:50:56.174731 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-6xdd9"]
Dec 03 14:50:56 crc kubenswrapper[4751]: I1203 14:50:56.282543 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxkjl\" (UniqueName: \"kubernetes.io/projected/b8584e24-f4eb-400e-a73c-610ba6fe3a41-kube-api-access-qxkjl\") pod \"ssh-known-hosts-edpm-deployment-6xdd9\" (UID: \"b8584e24-f4eb-400e-a73c-610ba6fe3a41\") " pod="openstack/ssh-known-hosts-edpm-deployment-6xdd9"
Dec 03 14:50:56 crc kubenswrapper[4751]: I1203 14:50:56.282641 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b8584e24-f4eb-400e-a73c-610ba6fe3a41-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-6xdd9\" (UID: \"b8584e24-f4eb-400e-a73c-610ba6fe3a41\") " pod="openstack/ssh-known-hosts-edpm-deployment-6xdd9"
Dec 03 14:50:56 crc kubenswrapper[4751]: I1203 14:50:56.282713 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b8584e24-f4eb-400e-a73c-610ba6fe3a41-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-6xdd9\" (UID: \"b8584e24-f4eb-400e-a73c-610ba6fe3a41\") " pod="openstack/ssh-known-hosts-edpm-deployment-6xdd9"
Dec 03 14:50:56 crc kubenswrapper[4751]: I1203 14:50:56.384477 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxkjl\" (UniqueName: \"kubernetes.io/projected/b8584e24-f4eb-400e-a73c-610ba6fe3a41-kube-api-access-qxkjl\") pod \"ssh-known-hosts-edpm-deployment-6xdd9\" (UID: \"b8584e24-f4eb-400e-a73c-610ba6fe3a41\") " pod="openstack/ssh-known-hosts-edpm-deployment-6xdd9"
Dec 03 14:50:56 crc kubenswrapper[4751]: I1203 14:50:56.384585 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b8584e24-f4eb-400e-a73c-610ba6fe3a41-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-6xdd9\" (UID: \"b8584e24-f4eb-400e-a73c-610ba6fe3a41\") " pod="openstack/ssh-known-hosts-edpm-deployment-6xdd9"
Dec 03 14:50:56 crc kubenswrapper[4751]: I1203 14:50:56.384659 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b8584e24-f4eb-400e-a73c-610ba6fe3a41-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-6xdd9\" (UID: \"b8584e24-f4eb-400e-a73c-610ba6fe3a41\") " pod="openstack/ssh-known-hosts-edpm-deployment-6xdd9"
Dec 03 14:50:56 crc kubenswrapper[4751]: I1203 14:50:56.391223 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b8584e24-f4eb-400e-a73c-610ba6fe3a41-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-6xdd9\" (UID: \"b8584e24-f4eb-400e-a73c-610ba6fe3a41\") " pod="openstack/ssh-known-hosts-edpm-deployment-6xdd9"
Dec 03 14:50:56 crc kubenswrapper[4751]: I1203 14:50:56.391791 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b8584e24-f4eb-400e-a73c-610ba6fe3a41-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-6xdd9\" (UID: \"b8584e24-f4eb-400e-a73c-610ba6fe3a41\") " pod="openstack/ssh-known-hosts-edpm-deployment-6xdd9"
Dec 03 14:50:56 crc kubenswrapper[4751]: I1203 14:50:56.410846 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxkjl\" (UniqueName: \"kubernetes.io/projected/b8584e24-f4eb-400e-a73c-610ba6fe3a41-kube-api-access-qxkjl\") pod \"ssh-known-hosts-edpm-deployment-6xdd9\" (UID: \"b8584e24-f4eb-400e-a73c-610ba6fe3a41\") " pod="openstack/ssh-known-hosts-edpm-deployment-6xdd9"
Dec 03 14:50:56 crc kubenswrapper[4751]: I1203 14:50:56.489191 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-6xdd9"
Dec 03 14:50:57 crc kubenswrapper[4751]: I1203 14:50:57.040906 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-6xdd9"]
Dec 03 14:50:57 crc kubenswrapper[4751]: I1203 14:50:57.092718 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-6xdd9" event={"ID":"b8584e24-f4eb-400e-a73c-610ba6fe3a41","Type":"ContainerStarted","Data":"9c3f97ed77a4e8a1377a1297df6a55ee4ce9ac5c84d0ecb7804f554c36898636"}
Dec 03 14:50:58 crc kubenswrapper[4751]: I1203 14:50:58.106422 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-6xdd9" event={"ID":"b8584e24-f4eb-400e-a73c-610ba6fe3a41","Type":"ContainerStarted","Data":"242e48baa512ca977358d993abc1f68733b093568cf7d64dd4a42ae427f2b5cc"}
Dec 03 14:50:58 crc kubenswrapper[4751]: I1203 14:50:58.124688 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-6xdd9" podStartSLOduration=1.7277916389999999 podStartE2EDuration="2.124643628s" podCreationTimestamp="2025-12-03 14:50:56 +0000 UTC" firstStartedPulling="2025-12-03 14:50:57.045616081 +0000 UTC m=+2264.033971298" lastFinishedPulling="2025-12-03 14:50:57.44246806 +0000 UTC m=+2264.430823287" observedRunningTime="2025-12-03 14:50:58.122938203 +0000 UTC m=+2265.111293450" watchObservedRunningTime="2025-12-03 14:50:58.124643628 +0000 UTC m=+2265.112998855"
Dec 03 14:51:05 crc kubenswrapper[4751]: I1203 14:51:05.187078 4751 generic.go:334] "Generic (PLEG): container finished" podID="b8584e24-f4eb-400e-a73c-610ba6fe3a41" containerID="242e48baa512ca977358d993abc1f68733b093568cf7d64dd4a42ae427f2b5cc" exitCode=0
Dec 03 14:51:05 crc kubenswrapper[4751]: I1203 14:51:05.187167 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-6xdd9" event={"ID":"b8584e24-f4eb-400e-a73c-610ba6fe3a41","Type":"ContainerDied","Data":"242e48baa512ca977358d993abc1f68733b093568cf7d64dd4a42ae427f2b5cc"}
Dec 03 14:51:06 crc kubenswrapper[4751]: I1203 14:51:06.761640 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-6xdd9"
Dec 03 14:51:06 crc kubenswrapper[4751]: I1203 14:51:06.830453 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b8584e24-f4eb-400e-a73c-610ba6fe3a41-inventory-0\") pod \"b8584e24-f4eb-400e-a73c-610ba6fe3a41\" (UID: \"b8584e24-f4eb-400e-a73c-610ba6fe3a41\") "
Dec 03 14:51:06 crc kubenswrapper[4751]: I1203 14:51:06.830848 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b8584e24-f4eb-400e-a73c-610ba6fe3a41-ssh-key-openstack-edpm-ipam\") pod \"b8584e24-f4eb-400e-a73c-610ba6fe3a41\" (UID: \"b8584e24-f4eb-400e-a73c-610ba6fe3a41\") "
Dec 03 14:51:06 crc kubenswrapper[4751]: I1203 14:51:06.831157 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxkjl\" (UniqueName: \"kubernetes.io/projected/b8584e24-f4eb-400e-a73c-610ba6fe3a41-kube-api-access-qxkjl\") pod \"b8584e24-f4eb-400e-a73c-610ba6fe3a41\" (UID: \"b8584e24-f4eb-400e-a73c-610ba6fe3a41\") "
Dec 03 14:51:06 crc kubenswrapper[4751]: I1203 14:51:06.836852 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8584e24-f4eb-400e-a73c-610ba6fe3a41-kube-api-access-qxkjl" (OuterVolumeSpecName: "kube-api-access-qxkjl") pod "b8584e24-f4eb-400e-a73c-610ba6fe3a41" (UID: "b8584e24-f4eb-400e-a73c-610ba6fe3a41"). InnerVolumeSpecName "kube-api-access-qxkjl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 14:51:06 crc kubenswrapper[4751]: I1203 14:51:06.858687 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8584e24-f4eb-400e-a73c-610ba6fe3a41-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b8584e24-f4eb-400e-a73c-610ba6fe3a41" (UID: "b8584e24-f4eb-400e-a73c-610ba6fe3a41"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:51:06 crc kubenswrapper[4751]: I1203 14:51:06.870279 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8584e24-f4eb-400e-a73c-610ba6fe3a41-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "b8584e24-f4eb-400e-a73c-610ba6fe3a41" (UID: "b8584e24-f4eb-400e-a73c-610ba6fe3a41"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:51:06 crc kubenswrapper[4751]: I1203 14:51:06.934351 4751 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b8584e24-f4eb-400e-a73c-610ba6fe3a41-inventory-0\") on node \"crc\" DevicePath \"\""
Dec 03 14:51:06 crc kubenswrapper[4751]: I1203 14:51:06.934401 4751 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b8584e24-f4eb-400e-a73c-610ba6fe3a41-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Dec 03 14:51:06 crc kubenswrapper[4751]: I1203 14:51:06.934415 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxkjl\" (UniqueName: \"kubernetes.io/projected/b8584e24-f4eb-400e-a73c-610ba6fe3a41-kube-api-access-qxkjl\") on node \"crc\" DevicePath \"\""
Dec 03 14:51:07 crc kubenswrapper[4751]: I1203 14:51:07.212706 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-6xdd9" event={"ID":"b8584e24-f4eb-400e-a73c-610ba6fe3a41","Type":"ContainerDied","Data":"9c3f97ed77a4e8a1377a1297df6a55ee4ce9ac5c84d0ecb7804f554c36898636"}
Dec 03 14:51:07 crc kubenswrapper[4751]: I1203 14:51:07.212761 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c3f97ed77a4e8a1377a1297df6a55ee4ce9ac5c84d0ecb7804f554c36898636"
Dec 03 14:51:07 crc kubenswrapper[4751]: I1203 14:51:07.212845 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-6xdd9"
Dec 03 14:51:07 crc kubenswrapper[4751]: I1203 14:51:07.311765 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-2gj64"]
Dec 03 14:51:07 crc kubenswrapper[4751]: E1203 14:51:07.312494 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8584e24-f4eb-400e-a73c-610ba6fe3a41" containerName="ssh-known-hosts-edpm-deployment"
Dec 03 14:51:07 crc kubenswrapper[4751]: I1203 14:51:07.312527 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8584e24-f4eb-400e-a73c-610ba6fe3a41" containerName="ssh-known-hosts-edpm-deployment"
Dec 03 14:51:07 crc kubenswrapper[4751]: I1203 14:51:07.312891 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8584e24-f4eb-400e-a73c-610ba6fe3a41" containerName="ssh-known-hosts-edpm-deployment"
Dec 03 14:51:07 crc kubenswrapper[4751]: I1203 14:51:07.322792 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2gj64"
Dec 03 14:51:07 crc kubenswrapper[4751]: I1203 14:51:07.326042 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xf9cf"
Dec 03 14:51:07 crc kubenswrapper[4751]: I1203 14:51:07.326098 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 03 14:51:07 crc kubenswrapper[4751]: I1203 14:51:07.326399 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 03 14:51:07 crc kubenswrapper[4751]: I1203 14:51:07.326767 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 03 14:51:07 crc kubenswrapper[4751]: I1203 14:51:07.330173 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-2gj64"]
Dec 03 14:51:07 crc kubenswrapper[4751]: I1203 14:51:07.447043 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/478701d4-170a-4043-97d6-6b54b753a72a-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2gj64\" (UID: \"478701d4-170a-4043-97d6-6b54b753a72a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2gj64"
Dec 03 14:51:07 crc kubenswrapper[4751]: I1203 14:51:07.447099 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p5w6\" (UniqueName: \"kubernetes.io/projected/478701d4-170a-4043-97d6-6b54b753a72a-kube-api-access-5p5w6\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2gj64\" (UID: \"478701d4-170a-4043-97d6-6b54b753a72a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2gj64"
Dec 03 14:51:07 crc kubenswrapper[4751]: I1203 14:51:07.447240 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/478701d4-170a-4043-97d6-6b54b753a72a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2gj64\" (UID: \"478701d4-170a-4043-97d6-6b54b753a72a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2gj64"
Dec 03 14:51:07 crc kubenswrapper[4751]: I1203 14:51:07.549645 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/478701d4-170a-4043-97d6-6b54b753a72a-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2gj64\" (UID: \"478701d4-170a-4043-97d6-6b54b753a72a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2gj64"
Dec 03 14:51:07 crc kubenswrapper[4751]: I1203 14:51:07.549705 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p5w6\" (UniqueName: \"kubernetes.io/projected/478701d4-170a-4043-97d6-6b54b753a72a-kube-api-access-5p5w6\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2gj64\" (UID: \"478701d4-170a-4043-97d6-6b54b753a72a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2gj64"
Dec 03 14:51:07 crc kubenswrapper[4751]: I1203 14:51:07.549756 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/478701d4-170a-4043-97d6-6b54b753a72a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2gj64\" (UID: \"478701d4-170a-4043-97d6-6b54b753a72a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2gj64"
Dec 03 14:51:07 crc kubenswrapper[4751]: I1203 14:51:07.553593 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/478701d4-170a-4043-97d6-6b54b753a72a-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2gj64\" (UID: \"478701d4-170a-4043-97d6-6b54b753a72a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2gj64"
Dec 03 14:51:07 crc kubenswrapper[4751]: I1203 14:51:07.554118 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/478701d4-170a-4043-97d6-6b54b753a72a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2gj64\" (UID: \"478701d4-170a-4043-97d6-6b54b753a72a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2gj64"
Dec 03 14:51:07 crc kubenswrapper[4751]: I1203 14:51:07.576637 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p5w6\" (UniqueName: \"kubernetes.io/projected/478701d4-170a-4043-97d6-6b54b753a72a-kube-api-access-5p5w6\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2gj64\" (UID: \"478701d4-170a-4043-97d6-6b54b753a72a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2gj64"
Dec 03 14:51:07 crc kubenswrapper[4751]: I1203 14:51:07.652408 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2gj64"
Dec 03 14:51:08 crc kubenswrapper[4751]: I1203 14:51:08.200952 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-2gj64"]
Dec 03 14:51:08 crc kubenswrapper[4751]: I1203 14:51:08.221875 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2gj64" event={"ID":"478701d4-170a-4043-97d6-6b54b753a72a","Type":"ContainerStarted","Data":"f2c5102759c27e0ad27960c5580990cd51a4160b768c3f7805525298fe9cd348"}
Dec 03 14:51:09 crc kubenswrapper[4751]: I1203 14:51:09.235118 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2gj64" event={"ID":"478701d4-170a-4043-97d6-6b54b753a72a","Type":"ContainerStarted","Data":"9ada5161f84c344c5f91014d058da476047c124449fbdf334a77ade8777ba829"}
Dec 03 14:51:09 crc kubenswrapper[4751]: I1203 14:51:09.267957 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2gj64" podStartSLOduration=1.8392956630000001 podStartE2EDuration="2.267926937s" podCreationTimestamp="2025-12-03 14:51:07 +0000 UTC" firstStartedPulling="2025-12-03 14:51:08.209291061 +0000 UTC m=+2275.197646308" lastFinishedPulling="2025-12-03 14:51:08.637922345 +0000 UTC m=+2275.626277582" observedRunningTime="2025-12-03 14:51:09.251383907 +0000 UTC m=+2276.239739214" watchObservedRunningTime="2025-12-03 14:51:09.267926937 +0000 UTC m=+2276.256282194"
Dec 03 14:51:17 crc kubenswrapper[4751]: I1203 14:51:17.327594 4751 generic.go:334] "Generic (PLEG): container finished" podID="478701d4-170a-4043-97d6-6b54b753a72a" containerID="9ada5161f84c344c5f91014d058da476047c124449fbdf334a77ade8777ba829" exitCode=0
Dec 03 14:51:17 crc kubenswrapper[4751]: I1203 14:51:17.342201 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2gj64" event={"ID":"478701d4-170a-4043-97d6-6b54b753a72a","Type":"ContainerDied","Data":"9ada5161f84c344c5f91014d058da476047c124449fbdf334a77ade8777ba829"}
Dec 03 14:51:18 crc kubenswrapper[4751]: I1203 14:51:18.849012 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2gj64"
Dec 03 14:51:19 crc kubenswrapper[4751]: I1203 14:51:19.040032 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5p5w6\" (UniqueName: \"kubernetes.io/projected/478701d4-170a-4043-97d6-6b54b753a72a-kube-api-access-5p5w6\") pod \"478701d4-170a-4043-97d6-6b54b753a72a\" (UID: \"478701d4-170a-4043-97d6-6b54b753a72a\") "
Dec 03 14:51:19 crc kubenswrapper[4751]: I1203 14:51:19.040483 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/478701d4-170a-4043-97d6-6b54b753a72a-ssh-key\") pod \"478701d4-170a-4043-97d6-6b54b753a72a\" (UID: \"478701d4-170a-4043-97d6-6b54b753a72a\") "
Dec 03 14:51:19 crc kubenswrapper[4751]: I1203 14:51:19.040513 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/478701d4-170a-4043-97d6-6b54b753a72a-inventory\") pod \"478701d4-170a-4043-97d6-6b54b753a72a\" (UID: \"478701d4-170a-4043-97d6-6b54b753a72a\") "
Dec 03 14:51:19 crc kubenswrapper[4751]: I1203 14:51:19.046306 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/478701d4-170a-4043-97d6-6b54b753a72a-kube-api-access-5p5w6" (OuterVolumeSpecName: "kube-api-access-5p5w6") pod "478701d4-170a-4043-97d6-6b54b753a72a" (UID: "478701d4-170a-4043-97d6-6b54b753a72a"). InnerVolumeSpecName "kube-api-access-5p5w6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 14:51:19 crc kubenswrapper[4751]: I1203 14:51:19.075361 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/478701d4-170a-4043-97d6-6b54b753a72a-inventory" (OuterVolumeSpecName: "inventory") pod "478701d4-170a-4043-97d6-6b54b753a72a" (UID: "478701d4-170a-4043-97d6-6b54b753a72a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:51:19 crc kubenswrapper[4751]: I1203 14:51:19.081090 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/478701d4-170a-4043-97d6-6b54b753a72a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "478701d4-170a-4043-97d6-6b54b753a72a" (UID: "478701d4-170a-4043-97d6-6b54b753a72a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:51:19 crc kubenswrapper[4751]: I1203 14:51:19.142660 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5p5w6\" (UniqueName: \"kubernetes.io/projected/478701d4-170a-4043-97d6-6b54b753a72a-kube-api-access-5p5w6\") on node \"crc\" DevicePath \"\""
Dec 03 14:51:19 crc kubenswrapper[4751]: I1203 14:51:19.142700 4751 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/478701d4-170a-4043-97d6-6b54b753a72a-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 03 14:51:19 crc kubenswrapper[4751]: I1203 14:51:19.142712 4751 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/478701d4-170a-4043-97d6-6b54b753a72a-inventory\") on node \"crc\" DevicePath \"\""
Dec 03 14:51:19 crc kubenswrapper[4751]: I1203 14:51:19.349384 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2gj64" event={"ID":"478701d4-170a-4043-97d6-6b54b753a72a","Type":"ContainerDied","Data":"f2c5102759c27e0ad27960c5580990cd51a4160b768c3f7805525298fe9cd348"}
Dec 03 14:51:19 crc kubenswrapper[4751]: I1203 14:51:19.349428 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2c5102759c27e0ad27960c5580990cd51a4160b768c3f7805525298fe9cd348"
Dec 03 14:51:19 crc kubenswrapper[4751]: I1203 14:51:19.349455 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2gj64"
Dec 03 14:51:19 crc kubenswrapper[4751]: I1203 14:51:19.464012 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8cqwk"]
Dec 03 14:51:19 crc kubenswrapper[4751]: E1203 14:51:19.464723 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="478701d4-170a-4043-97d6-6b54b753a72a" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Dec 03 14:51:19 crc kubenswrapper[4751]: I1203 14:51:19.464749 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="478701d4-170a-4043-97d6-6b54b753a72a" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Dec 03 14:51:19 crc kubenswrapper[4751]: I1203 14:51:19.465028 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="478701d4-170a-4043-97d6-6b54b753a72a" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Dec 03 14:51:19 crc kubenswrapper[4751]: I1203 14:51:19.466062 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8cqwk"
Dec 03 14:51:19 crc kubenswrapper[4751]: I1203 14:51:19.475391 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xf9cf"
Dec 03 14:51:19 crc kubenswrapper[4751]: I1203 14:51:19.476034 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 03 14:51:19 crc kubenswrapper[4751]: I1203 14:51:19.479707 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 03 14:51:19 crc kubenswrapper[4751]: I1203 14:51:19.480725 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 03 14:51:19 crc kubenswrapper[4751]: I1203 14:51:19.547968 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8cqwk"]
Dec 03 14:51:19 crc kubenswrapper[4751]: I1203 14:51:19.656892 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a08bb04e-0d05-4153-ab50-9fde15bb421b-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8cqwk\" (UID: \"a08bb04e-0d05-4153-ab50-9fde15bb421b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8cqwk"
Dec 03 14:51:19 crc kubenswrapper[4751]: I1203 14:51:19.656952 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r26w7\" (UniqueName: \"kubernetes.io/projected/a08bb04e-0d05-4153-ab50-9fde15bb421b-kube-api-access-r26w7\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8cqwk\" (UID: \"a08bb04e-0d05-4153-ab50-9fde15bb421b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8cqwk"
Dec 03 14:51:19 crc kubenswrapper[4751]: I1203 14:51:19.657052 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a08bb04e-0d05-4153-ab50-9fde15bb421b-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8cqwk\" (UID: \"a08bb04e-0d05-4153-ab50-9fde15bb421b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8cqwk"
Dec 03 14:51:19 crc kubenswrapper[4751]: I1203 14:51:19.759426 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a08bb04e-0d05-4153-ab50-9fde15bb421b-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8cqwk\" (UID: \"a08bb04e-0d05-4153-ab50-9fde15bb421b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8cqwk"
Dec 03 14:51:19 crc kubenswrapper[4751]: I1203 14:51:19.759483 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r26w7\" (UniqueName: \"kubernetes.io/projected/a08bb04e-0d05-4153-ab50-9fde15bb421b-kube-api-access-r26w7\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8cqwk\" (UID: \"a08bb04e-0d05-4153-ab50-9fde15bb421b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8cqwk"
Dec 03 14:51:19 crc kubenswrapper[4751]: I1203 14:51:19.759568 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a08bb04e-0d05-4153-ab50-9fde15bb421b-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8cqwk\" (UID: \"a08bb04e-0d05-4153-ab50-9fde15bb421b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8cqwk"
Dec 03 14:51:19 crc kubenswrapper[4751]: I1203 14:51:19.763505 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a08bb04e-0d05-4153-ab50-9fde15bb421b-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8cqwk\" (UID: \"a08bb04e-0d05-4153-ab50-9fde15bb421b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8cqwk"
Dec 03 14:51:19 crc kubenswrapper[4751]: I1203 14:51:19.774822 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a08bb04e-0d05-4153-ab50-9fde15bb421b-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8cqwk\" (UID: \"a08bb04e-0d05-4153-ab50-9fde15bb421b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8cqwk"
Dec 03 14:51:19 crc kubenswrapper[4751]: I1203 14:51:19.775570 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r26w7\" (UniqueName: \"kubernetes.io/projected/a08bb04e-0d05-4153-ab50-9fde15bb421b-kube-api-access-r26w7\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8cqwk\" (UID: \"a08bb04e-0d05-4153-ab50-9fde15bb421b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8cqwk"
Dec 03 14:51:19 crc kubenswrapper[4751]: I1203 14:51:19.795423 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8cqwk"
Dec 03 14:51:20 crc kubenswrapper[4751]: I1203 14:51:20.357822 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8cqwk"]
Dec 03 14:51:21 crc kubenswrapper[4751]: I1203 14:51:21.367534 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8cqwk" event={"ID":"a08bb04e-0d05-4153-ab50-9fde15bb421b","Type":"ContainerStarted","Data":"13189791c191ecf3841ce30fc9ee757c7e9c0584ba05e5f8bbfa52523aa1a44d"}
Dec 03 14:51:21 crc kubenswrapper[4751]: I1203 14:51:21.367827 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8cqwk" event={"ID":"a08bb04e-0d05-4153-ab50-9fde15bb421b","Type":"ContainerStarted","Data":"1bdda2b0d5ff9f9bd0a6ef5eb63f15b0e311233685a78ea4556c73175c35f436"}
Dec 03 14:51:21 crc kubenswrapper[4751]: I1203 14:51:21.385162 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8cqwk" podStartSLOduration=1.942647591 podStartE2EDuration="2.385145973s" podCreationTimestamp="2025-12-03 14:51:19 +0000 UTC" firstStartedPulling="2025-12-03 14:51:20.368729418 +0000 UTC m=+2287.357084625" lastFinishedPulling="2025-12-03 14:51:20.81122779 +0000 UTC m=+2287.799583007" observedRunningTime="2025-12-03 14:51:21.383035697 +0000 UTC m=+2288.371390914" watchObservedRunningTime="2025-12-03 14:51:21.385145973 +0000 UTC m=+2288.373501180"
Dec 03 14:51:28 crc kubenswrapper[4751]: I1203 14:51:28.440925 4751 scope.go:117] "RemoveContainer" containerID="7e02254cca8413a97e385dc466a9fb31f3a4c304fbee5d201a8c28057a264138"
Dec 03 14:51:28 crc kubenswrapper[4751]: I1203 14:51:28.499388 4751 scope.go:117] "RemoveContainer" containerID="5862a570a9ea02901bac005c94807dcf0903e1f886130ea96b24dfe06cedf64f"
Dec 03 14:51:30 crc kubenswrapper[4751]: I1203 14:51:30.491448 4751 generic.go:334] "Generic (PLEG): container finished" podID="a08bb04e-0d05-4153-ab50-9fde15bb421b" containerID="13189791c191ecf3841ce30fc9ee757c7e9c0584ba05e5f8bbfa52523aa1a44d" exitCode=0
Dec 03 14:51:30 crc kubenswrapper[4751]: I1203 14:51:30.491545 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8cqwk" event={"ID":"a08bb04e-0d05-4153-ab50-9fde15bb421b","Type":"ContainerDied","Data":"13189791c191ecf3841ce30fc9ee757c7e9c0584ba05e5f8bbfa52523aa1a44d"}
Dec 03 14:51:31 crc kubenswrapper[4751]: I1203 14:51:31.983183 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8cqwk"
Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.037167 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a08bb04e-0d05-4153-ab50-9fde15bb421b-ssh-key\") pod \"a08bb04e-0d05-4153-ab50-9fde15bb421b\" (UID: \"a08bb04e-0d05-4153-ab50-9fde15bb421b\") "
Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.037614 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r26w7\" (UniqueName: \"kubernetes.io/projected/a08bb04e-0d05-4153-ab50-9fde15bb421b-kube-api-access-r26w7\") pod \"a08bb04e-0d05-4153-ab50-9fde15bb421b\" (UID: \"a08bb04e-0d05-4153-ab50-9fde15bb421b\") "
Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.037663 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a08bb04e-0d05-4153-ab50-9fde15bb421b-inventory\") pod \"a08bb04e-0d05-4153-ab50-9fde15bb421b\" (UID: \"a08bb04e-0d05-4153-ab50-9fde15bb421b\") "
Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.043511 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a08bb04e-0d05-4153-ab50-9fde15bb421b-kube-api-access-r26w7" (OuterVolumeSpecName: "kube-api-access-r26w7") pod "a08bb04e-0d05-4153-ab50-9fde15bb421b" (UID: "a08bb04e-0d05-4153-ab50-9fde15bb421b"). InnerVolumeSpecName "kube-api-access-r26w7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.071053 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a08bb04e-0d05-4153-ab50-9fde15bb421b-inventory" (OuterVolumeSpecName: "inventory") pod "a08bb04e-0d05-4153-ab50-9fde15bb421b" (UID: "a08bb04e-0d05-4153-ab50-9fde15bb421b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.076748 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a08bb04e-0d05-4153-ab50-9fde15bb421b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a08bb04e-0d05-4153-ab50-9fde15bb421b" (UID: "a08bb04e-0d05-4153-ab50-9fde15bb421b"). InnerVolumeSpecName "ssh-key".
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.139249 4751 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a08bb04e-0d05-4153-ab50-9fde15bb421b-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.139288 4751 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a08bb04e-0d05-4153-ab50-9fde15bb421b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.139303 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r26w7\" (UniqueName: \"kubernetes.io/projected/a08bb04e-0d05-4153-ab50-9fde15bb421b-kube-api-access-r26w7\") on node \"crc\" DevicePath \"\"" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.511729 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8cqwk" event={"ID":"a08bb04e-0d05-4153-ab50-9fde15bb421b","Type":"ContainerDied","Data":"1bdda2b0d5ff9f9bd0a6ef5eb63f15b0e311233685a78ea4556c73175c35f436"} Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.511765 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bdda2b0d5ff9f9bd0a6ef5eb63f15b0e311233685a78ea4556c73175c35f436" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.511812 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8cqwk" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.602430 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg"] Dec 03 14:51:32 crc kubenswrapper[4751]: E1203 14:51:32.602925 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a08bb04e-0d05-4153-ab50-9fde15bb421b" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.602946 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a08bb04e-0d05-4153-ab50-9fde15bb421b" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.603230 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="a08bb04e-0d05-4153-ab50-9fde15bb421b" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.604413 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.610728 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xf9cf" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.610963 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.611103 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.611099 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.611345 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.611471 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.611683 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.612189 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.624978 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg"] Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.752371 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.752448 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c1564210-8ace-4588-a706-3c7583ea0568-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.752502 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.752685 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.752768 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.752957 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.753218 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.753270 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c1564210-8ace-4588-a706-3c7583ea0568-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.753408 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.753567 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.753666 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqgvx\" (UniqueName: \"kubernetes.io/projected/c1564210-8ace-4588-a706-3c7583ea0568-kube-api-access-jqgvx\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.753749 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c1564210-8ace-4588-a706-3c7583ea0568-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.753835 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/c1564210-8ace-4588-a706-3c7583ea0568-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.754005 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.856170 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.856229 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.856258 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c1564210-8ace-4588-a706-3c7583ea0568-openstack-edpm-ipam-telemetry-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.856281 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.856307 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.856328 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.856387 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.856453 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.856473 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c1564210-8ace-4588-a706-3c7583ea0568-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.856497 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.856533 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc 
kubenswrapper[4751]: I1203 14:51:32.856564 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqgvx\" (UniqueName: \"kubernetes.io/projected/c1564210-8ace-4588-a706-3c7583ea0568-kube-api-access-jqgvx\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.856583 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c1564210-8ace-4588-a706-3c7583ea0568-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.856604 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c1564210-8ace-4588-a706-3c7583ea0568-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.864193 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c1564210-8ace-4588-a706-3c7583ea0568-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc 
kubenswrapper[4751]: I1203 14:51:32.864461 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.864684 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c1564210-8ace-4588-a706-3c7583ea0568-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.864764 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.865227 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.867366 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.867477 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c1564210-8ace-4588-a706-3c7583ea0568-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.867874 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c1564210-8ace-4588-a706-3c7583ea0568-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.870025 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.870281 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-inventory\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.870846 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.870533 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.874573 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.884182 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqgvx\" (UniqueName: \"kubernetes.io/projected/c1564210-8ace-4588-a706-3c7583ea0568-kube-api-access-jqgvx\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:32 crc kubenswrapper[4751]: I1203 14:51:32.926486 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:51:33 crc kubenswrapper[4751]: I1203 14:51:33.485631 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg"] Dec 03 14:51:33 crc kubenswrapper[4751]: I1203 14:51:33.494058 4751 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 14:51:33 crc kubenswrapper[4751]: I1203 14:51:33.523744 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" event={"ID":"c1564210-8ace-4588-a706-3c7583ea0568","Type":"ContainerStarted","Data":"79cf0416b379725a892d570017474333213430c54f367a039e5a3f89581218bd"} Dec 03 14:51:34 crc kubenswrapper[4751]: I1203 14:51:34.534203 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" event={"ID":"c1564210-8ace-4588-a706-3c7583ea0568","Type":"ContainerStarted","Data":"1b4cd41153900b29874423dad114dfad09442165f5ab8c9a52522115ee345b22"} Dec 03 14:51:34 crc kubenswrapper[4751]: I1203 14:51:34.556636 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" podStartSLOduration=1.865541374 podStartE2EDuration="2.556609428s" podCreationTimestamp="2025-12-03 14:51:32 +0000 UTC" firstStartedPulling="2025-12-03 14:51:33.493651577 +0000 UTC m=+2300.482006824" lastFinishedPulling="2025-12-03 14:51:34.184719631 +0000 UTC m=+2301.173074878" observedRunningTime="2025-12-03 14:51:34.55254575 +0000 UTC m=+2301.540900977" watchObservedRunningTime="2025-12-03 14:51:34.556609428 +0000 UTC m=+2301.544964645" Dec 03 14:52:13 crc 
kubenswrapper[4751]: I1203 14:52:13.996227 4751 generic.go:334] "Generic (PLEG): container finished" podID="c1564210-8ace-4588-a706-3c7583ea0568" containerID="1b4cd41153900b29874423dad114dfad09442165f5ab8c9a52522115ee345b22" exitCode=0 Dec 03 14:52:13 crc kubenswrapper[4751]: I1203 14:52:13.996720 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" event={"ID":"c1564210-8ace-4588-a706-3c7583ea0568","Type":"ContainerDied","Data":"1b4cd41153900b29874423dad114dfad09442165f5ab8c9a52522115ee345b22"} Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.642151 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.743965 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-telemetry-combined-ca-bundle\") pod \"c1564210-8ace-4588-a706-3c7583ea0568\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.744016 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c1564210-8ace-4588-a706-3c7583ea0568-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"c1564210-8ace-4588-a706-3c7583ea0568\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.744092 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c1564210-8ace-4588-a706-3c7583ea0568-openstack-edpm-ipam-ovn-default-certs-0\") pod \"c1564210-8ace-4588-a706-3c7583ea0568\" (UID: 
\"c1564210-8ace-4588-a706-3c7583ea0568\") " Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.744158 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-inventory\") pod \"c1564210-8ace-4588-a706-3c7583ea0568\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.744195 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-repo-setup-combined-ca-bundle\") pod \"c1564210-8ace-4588-a706-3c7583ea0568\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.744224 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-libvirt-combined-ca-bundle\") pod \"c1564210-8ace-4588-a706-3c7583ea0568\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.744315 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-ovn-combined-ca-bundle\") pod \"c1564210-8ace-4588-a706-3c7583ea0568\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.744359 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqgvx\" (UniqueName: \"kubernetes.io/projected/c1564210-8ace-4588-a706-3c7583ea0568-kube-api-access-jqgvx\") pod \"c1564210-8ace-4588-a706-3c7583ea0568\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.744415 4751 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-neutron-metadata-combined-ca-bundle\") pod \"c1564210-8ace-4588-a706-3c7583ea0568\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.744494 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-ssh-key\") pod \"c1564210-8ace-4588-a706-3c7583ea0568\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.744567 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-nova-combined-ca-bundle\") pod \"c1564210-8ace-4588-a706-3c7583ea0568\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.744643 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c1564210-8ace-4588-a706-3c7583ea0568-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"c1564210-8ace-4588-a706-3c7583ea0568\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.744672 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-bootstrap-combined-ca-bundle\") pod \"c1564210-8ace-4588-a706-3c7583ea0568\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.744723 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" 
(UniqueName: \"kubernetes.io/projected/c1564210-8ace-4588-a706-3c7583ea0568-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"c1564210-8ace-4588-a706-3c7583ea0568\" (UID: \"c1564210-8ace-4588-a706-3c7583ea0568\") " Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.752960 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "c1564210-8ace-4588-a706-3c7583ea0568" (UID: "c1564210-8ace-4588-a706-3c7583ea0568"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.753034 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1564210-8ace-4588-a706-3c7583ea0568-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "c1564210-8ace-4588-a706-3c7583ea0568" (UID: "c1564210-8ace-4588-a706-3c7583ea0568"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.753125 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "c1564210-8ace-4588-a706-3c7583ea0568" (UID: "c1564210-8ace-4588-a706-3c7583ea0568"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.753204 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "c1564210-8ace-4588-a706-3c7583ea0568" (UID: "c1564210-8ace-4588-a706-3c7583ea0568"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.753256 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "c1564210-8ace-4588-a706-3c7583ea0568" (UID: "c1564210-8ace-4588-a706-3c7583ea0568"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.753283 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1564210-8ace-4588-a706-3c7583ea0568-kube-api-access-jqgvx" (OuterVolumeSpecName: "kube-api-access-jqgvx") pod "c1564210-8ace-4588-a706-3c7583ea0568" (UID: "c1564210-8ace-4588-a706-3c7583ea0568"). InnerVolumeSpecName "kube-api-access-jqgvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.753313 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1564210-8ace-4588-a706-3c7583ea0568-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "c1564210-8ace-4588-a706-3c7583ea0568" (UID: "c1564210-8ace-4588-a706-3c7583ea0568"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.755695 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "c1564210-8ace-4588-a706-3c7583ea0568" (UID: "c1564210-8ace-4588-a706-3c7583ea0568"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.755757 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1564210-8ace-4588-a706-3c7583ea0568-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "c1564210-8ace-4588-a706-3c7583ea0568" (UID: "c1564210-8ace-4588-a706-3c7583ea0568"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.755762 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "c1564210-8ace-4588-a706-3c7583ea0568" (UID: "c1564210-8ace-4588-a706-3c7583ea0568"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.756320 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1564210-8ace-4588-a706-3c7583ea0568-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "c1564210-8ace-4588-a706-3c7583ea0568" (UID: "c1564210-8ace-4588-a706-3c7583ea0568"). 
InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.777557 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "c1564210-8ace-4588-a706-3c7583ea0568" (UID: "c1564210-8ace-4588-a706-3c7583ea0568"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.778015 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-inventory" (OuterVolumeSpecName: "inventory") pod "c1564210-8ace-4588-a706-3c7583ea0568" (UID: "c1564210-8ace-4588-a706-3c7583ea0568"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.785193 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c1564210-8ace-4588-a706-3c7583ea0568" (UID: "c1564210-8ace-4588-a706-3c7583ea0568"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.848208 4751 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c1564210-8ace-4588-a706-3c7583ea0568-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.848246 4751 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.848266 4751 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.848278 4751 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.848292 4751 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.848305 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqgvx\" (UniqueName: \"kubernetes.io/projected/c1564210-8ace-4588-a706-3c7583ea0568-kube-api-access-jqgvx\") on node \"crc\" DevicePath \"\"" Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.848315 4751 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.848348 4751 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.848358 4751 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.848370 4751 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c1564210-8ace-4588-a706-3c7583ea0568-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.848382 4751 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.848393 4751 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c1564210-8ace-4588-a706-3c7583ea0568-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.848404 4751 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1564210-8ace-4588-a706-3c7583ea0568-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:52:15 crc kubenswrapper[4751]: I1203 14:52:15.848416 4751 
reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c1564210-8ace-4588-a706-3c7583ea0568-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:52:16 crc kubenswrapper[4751]: I1203 14:52:16.021897 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" event={"ID":"c1564210-8ace-4588-a706-3c7583ea0568","Type":"ContainerDied","Data":"79cf0416b379725a892d570017474333213430c54f367a039e5a3f89581218bd"} Dec 03 14:52:16 crc kubenswrapper[4751]: I1203 14:52:16.022252 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79cf0416b379725a892d570017474333213430c54f367a039e5a3f89581218bd" Dec 03 14:52:16 crc kubenswrapper[4751]: I1203 14:52:16.021958 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg" Dec 03 14:52:16 crc kubenswrapper[4751]: I1203 14:52:16.130852 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-h82lf"] Dec 03 14:52:16 crc kubenswrapper[4751]: E1203 14:52:16.131392 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1564210-8ace-4588-a706-3c7583ea0568" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 03 14:52:16 crc kubenswrapper[4751]: I1203 14:52:16.131411 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1564210-8ace-4588-a706-3c7583ea0568" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 03 14:52:16 crc kubenswrapper[4751]: I1203 14:52:16.131725 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1564210-8ace-4588-a706-3c7583ea0568" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 03 14:52:16 crc kubenswrapper[4751]: I1203 14:52:16.132675 4751 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h82lf" Dec 03 14:52:16 crc kubenswrapper[4751]: I1203 14:52:16.136304 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 14:52:16 crc kubenswrapper[4751]: I1203 14:52:16.136488 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 14:52:16 crc kubenswrapper[4751]: I1203 14:52:16.136677 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 14:52:16 crc kubenswrapper[4751]: I1203 14:52:16.136687 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xf9cf" Dec 03 14:52:16 crc kubenswrapper[4751]: I1203 14:52:16.136805 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 03 14:52:16 crc kubenswrapper[4751]: I1203 14:52:16.142590 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-h82lf"] Dec 03 14:52:16 crc kubenswrapper[4751]: I1203 14:52:16.257814 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jndjf\" (UniqueName: \"kubernetes.io/projected/5d9c6feb-6018-476a-b029-e4df05b4566d-kube-api-access-jndjf\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h82lf\" (UID: \"5d9c6feb-6018-476a-b029-e4df05b4566d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h82lf" Dec 03 14:52:16 crc kubenswrapper[4751]: I1203 14:52:16.257863 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5d9c6feb-6018-476a-b029-e4df05b4566d-ovncontroller-config-0\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-h82lf\" (UID: \"5d9c6feb-6018-476a-b029-e4df05b4566d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h82lf" Dec 03 14:52:16 crc kubenswrapper[4751]: I1203 14:52:16.257934 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d9c6feb-6018-476a-b029-e4df05b4566d-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h82lf\" (UID: \"5d9c6feb-6018-476a-b029-e4df05b4566d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h82lf" Dec 03 14:52:16 crc kubenswrapper[4751]: I1203 14:52:16.257996 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d9c6feb-6018-476a-b029-e4df05b4566d-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h82lf\" (UID: \"5d9c6feb-6018-476a-b029-e4df05b4566d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h82lf" Dec 03 14:52:16 crc kubenswrapper[4751]: I1203 14:52:16.258018 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d9c6feb-6018-476a-b029-e4df05b4566d-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h82lf\" (UID: \"5d9c6feb-6018-476a-b029-e4df05b4566d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h82lf" Dec 03 14:52:16 crc kubenswrapper[4751]: I1203 14:52:16.360500 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jndjf\" (UniqueName: \"kubernetes.io/projected/5d9c6feb-6018-476a-b029-e4df05b4566d-kube-api-access-jndjf\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h82lf\" (UID: \"5d9c6feb-6018-476a-b029-e4df05b4566d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h82lf" Dec 03 14:52:16 crc kubenswrapper[4751]: I1203 14:52:16.360544 4751 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5d9c6feb-6018-476a-b029-e4df05b4566d-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h82lf\" (UID: \"5d9c6feb-6018-476a-b029-e4df05b4566d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h82lf" Dec 03 14:52:16 crc kubenswrapper[4751]: I1203 14:52:16.360602 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d9c6feb-6018-476a-b029-e4df05b4566d-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h82lf\" (UID: \"5d9c6feb-6018-476a-b029-e4df05b4566d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h82lf" Dec 03 14:52:16 crc kubenswrapper[4751]: I1203 14:52:16.360695 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d9c6feb-6018-476a-b029-e4df05b4566d-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h82lf\" (UID: \"5d9c6feb-6018-476a-b029-e4df05b4566d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h82lf" Dec 03 14:52:16 crc kubenswrapper[4751]: I1203 14:52:16.360724 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d9c6feb-6018-476a-b029-e4df05b4566d-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h82lf\" (UID: \"5d9c6feb-6018-476a-b029-e4df05b4566d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h82lf" Dec 03 14:52:16 crc kubenswrapper[4751]: I1203 14:52:16.361608 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5d9c6feb-6018-476a-b029-e4df05b4566d-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h82lf\" (UID: \"5d9c6feb-6018-476a-b029-e4df05b4566d\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h82lf" Dec 03 14:52:16 crc kubenswrapper[4751]: I1203 14:52:16.364979 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d9c6feb-6018-476a-b029-e4df05b4566d-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h82lf\" (UID: \"5d9c6feb-6018-476a-b029-e4df05b4566d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h82lf" Dec 03 14:52:16 crc kubenswrapper[4751]: I1203 14:52:16.365193 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d9c6feb-6018-476a-b029-e4df05b4566d-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h82lf\" (UID: \"5d9c6feb-6018-476a-b029-e4df05b4566d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h82lf" Dec 03 14:52:16 crc kubenswrapper[4751]: I1203 14:52:16.371893 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d9c6feb-6018-476a-b029-e4df05b4566d-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h82lf\" (UID: \"5d9c6feb-6018-476a-b029-e4df05b4566d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h82lf" Dec 03 14:52:16 crc kubenswrapper[4751]: I1203 14:52:16.385921 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jndjf\" (UniqueName: \"kubernetes.io/projected/5d9c6feb-6018-476a-b029-e4df05b4566d-kube-api-access-jndjf\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h82lf\" (UID: \"5d9c6feb-6018-476a-b029-e4df05b4566d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h82lf" Dec 03 14:52:16 crc kubenswrapper[4751]: I1203 14:52:16.500061 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h82lf" Dec 03 14:52:17 crc kubenswrapper[4751]: I1203 14:52:17.061205 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-h82lf"] Dec 03 14:52:18 crc kubenswrapper[4751]: I1203 14:52:18.041709 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h82lf" event={"ID":"5d9c6feb-6018-476a-b029-e4df05b4566d","Type":"ContainerStarted","Data":"7df554860eb13587f07566005c4cf9d157cb8d245af5254925a29e5cea040540"} Dec 03 14:52:18 crc kubenswrapper[4751]: I1203 14:52:18.042294 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h82lf" event={"ID":"5d9c6feb-6018-476a-b029-e4df05b4566d","Type":"ContainerStarted","Data":"288f5e18baa86b152adf3feb7275ceffaac4fbb4f12d596ac1119f184d86b163"} Dec 03 14:52:35 crc kubenswrapper[4751]: I1203 14:52:35.820298 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:52:35 crc kubenswrapper[4751]: I1203 14:52:35.820928 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:53:05 crc kubenswrapper[4751]: I1203 14:53:05.821781 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Dec 03 14:53:05 crc kubenswrapper[4751]: I1203 14:53:05.822346 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:53:22 crc kubenswrapper[4751]: I1203 14:53:22.795736 4751 generic.go:334] "Generic (PLEG): container finished" podID="5d9c6feb-6018-476a-b029-e4df05b4566d" containerID="7df554860eb13587f07566005c4cf9d157cb8d245af5254925a29e5cea040540" exitCode=0 Dec 03 14:53:22 crc kubenswrapper[4751]: I1203 14:53:22.795776 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h82lf" event={"ID":"5d9c6feb-6018-476a-b029-e4df05b4566d","Type":"ContainerDied","Data":"7df554860eb13587f07566005c4cf9d157cb8d245af5254925a29e5cea040540"} Dec 03 14:53:24 crc kubenswrapper[4751]: I1203 14:53:24.353962 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h82lf" Dec 03 14:53:24 crc kubenswrapper[4751]: I1203 14:53:24.379562 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d9c6feb-6018-476a-b029-e4df05b4566d-ovn-combined-ca-bundle\") pod \"5d9c6feb-6018-476a-b029-e4df05b4566d\" (UID: \"5d9c6feb-6018-476a-b029-e4df05b4566d\") " Dec 03 14:53:24 crc kubenswrapper[4751]: I1203 14:53:24.385695 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d9c6feb-6018-476a-b029-e4df05b4566d-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "5d9c6feb-6018-476a-b029-e4df05b4566d" (UID: "5d9c6feb-6018-476a-b029-e4df05b4566d"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:53:24 crc kubenswrapper[4751]: I1203 14:53:24.481346 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d9c6feb-6018-476a-b029-e4df05b4566d-inventory\") pod \"5d9c6feb-6018-476a-b029-e4df05b4566d\" (UID: \"5d9c6feb-6018-476a-b029-e4df05b4566d\") " Dec 03 14:53:24 crc kubenswrapper[4751]: I1203 14:53:24.481402 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d9c6feb-6018-476a-b029-e4df05b4566d-ssh-key\") pod \"5d9c6feb-6018-476a-b029-e4df05b4566d\" (UID: \"5d9c6feb-6018-476a-b029-e4df05b4566d\") " Dec 03 14:53:24 crc kubenswrapper[4751]: I1203 14:53:24.481447 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jndjf\" (UniqueName: \"kubernetes.io/projected/5d9c6feb-6018-476a-b029-e4df05b4566d-kube-api-access-jndjf\") pod \"5d9c6feb-6018-476a-b029-e4df05b4566d\" (UID: \"5d9c6feb-6018-476a-b029-e4df05b4566d\") " Dec 03 14:53:24 crc kubenswrapper[4751]: I1203 14:53:24.481496 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5d9c6feb-6018-476a-b029-e4df05b4566d-ovncontroller-config-0\") pod \"5d9c6feb-6018-476a-b029-e4df05b4566d\" (UID: \"5d9c6feb-6018-476a-b029-e4df05b4566d\") " Dec 03 14:53:24 crc kubenswrapper[4751]: I1203 14:53:24.481866 4751 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d9c6feb-6018-476a-b029-e4df05b4566d-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:53:24 crc kubenswrapper[4751]: I1203 14:53:24.485588 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d9c6feb-6018-476a-b029-e4df05b4566d-kube-api-access-jndjf" (OuterVolumeSpecName: 
"kube-api-access-jndjf") pod "5d9c6feb-6018-476a-b029-e4df05b4566d" (UID: "5d9c6feb-6018-476a-b029-e4df05b4566d"). InnerVolumeSpecName "kube-api-access-jndjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:53:24 crc kubenswrapper[4751]: I1203 14:53:24.508214 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d9c6feb-6018-476a-b029-e4df05b4566d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5d9c6feb-6018-476a-b029-e4df05b4566d" (UID: "5d9c6feb-6018-476a-b029-e4df05b4566d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:53:24 crc kubenswrapper[4751]: I1203 14:53:24.509914 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d9c6feb-6018-476a-b029-e4df05b4566d-inventory" (OuterVolumeSpecName: "inventory") pod "5d9c6feb-6018-476a-b029-e4df05b4566d" (UID: "5d9c6feb-6018-476a-b029-e4df05b4566d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:53:24 crc kubenswrapper[4751]: I1203 14:53:24.513104 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d9c6feb-6018-476a-b029-e4df05b4566d-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "5d9c6feb-6018-476a-b029-e4df05b4566d" (UID: "5d9c6feb-6018-476a-b029-e4df05b4566d"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:53:24 crc kubenswrapper[4751]: I1203 14:53:24.585564 4751 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d9c6feb-6018-476a-b029-e4df05b4566d-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 14:53:24 crc kubenswrapper[4751]: I1203 14:53:24.585590 4751 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d9c6feb-6018-476a-b029-e4df05b4566d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 14:53:24 crc kubenswrapper[4751]: I1203 14:53:24.585600 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jndjf\" (UniqueName: \"kubernetes.io/projected/5d9c6feb-6018-476a-b029-e4df05b4566d-kube-api-access-jndjf\") on node \"crc\" DevicePath \"\"" Dec 03 14:53:24 crc kubenswrapper[4751]: I1203 14:53:24.585611 4751 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5d9c6feb-6018-476a-b029-e4df05b4566d-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:53:24 crc kubenswrapper[4751]: I1203 14:53:24.819213 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h82lf" event={"ID":"5d9c6feb-6018-476a-b029-e4df05b4566d","Type":"ContainerDied","Data":"288f5e18baa86b152adf3feb7275ceffaac4fbb4f12d596ac1119f184d86b163"} Dec 03 14:53:24 crc kubenswrapper[4751]: I1203 14:53:24.819256 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="288f5e18baa86b152adf3feb7275ceffaac4fbb4f12d596ac1119f184d86b163" Dec 03 14:53:24 crc kubenswrapper[4751]: I1203 14:53:24.819264 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h82lf" Dec 03 14:53:24 crc kubenswrapper[4751]: I1203 14:53:24.919117 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll"] Dec 03 14:53:24 crc kubenswrapper[4751]: E1203 14:53:24.919848 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d9c6feb-6018-476a-b029-e4df05b4566d" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 03 14:53:24 crc kubenswrapper[4751]: I1203 14:53:24.919870 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d9c6feb-6018-476a-b029-e4df05b4566d" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 03 14:53:24 crc kubenswrapper[4751]: I1203 14:53:24.920148 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d9c6feb-6018-476a-b029-e4df05b4566d" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 03 14:53:24 crc kubenswrapper[4751]: I1203 14:53:24.921012 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll" Dec 03 14:53:24 crc kubenswrapper[4751]: I1203 14:53:24.925205 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 03 14:53:24 crc kubenswrapper[4751]: I1203 14:53:24.925281 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xf9cf" Dec 03 14:53:24 crc kubenswrapper[4751]: I1203 14:53:24.925208 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 03 14:53:24 crc kubenswrapper[4751]: I1203 14:53:24.925411 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 14:53:24 crc kubenswrapper[4751]: I1203 14:53:24.925889 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 14:53:24 crc kubenswrapper[4751]: I1203 14:53:24.931877 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 14:53:24 crc kubenswrapper[4751]: I1203 14:53:24.936991 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll"] Dec 03 14:53:25 crc kubenswrapper[4751]: E1203 14:53:25.037993 4751 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d9c6feb_6018_476a_b029_e4df05b4566d.slice\": RecentStats: unable to find data in memory cache]" Dec 03 14:53:25 crc kubenswrapper[4751]: I1203 14:53:25.097763 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll\" (UID: \"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll" Dec 03 14:53:25 crc kubenswrapper[4751]: I1203 14:53:25.097892 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll\" (UID: \"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll" Dec 03 14:53:25 crc kubenswrapper[4751]: I1203 14:53:25.097944 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll\" (UID: \"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll" Dec 03 14:53:25 crc kubenswrapper[4751]: I1203 14:53:25.098020 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndpmw\" (UniqueName: \"kubernetes.io/projected/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-kube-api-access-ndpmw\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll\" (UID: \"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll" Dec 03 14:53:25 crc kubenswrapper[4751]: I1203 14:53:25.098073 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll\" (UID: \"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll" Dec 03 14:53:25 crc kubenswrapper[4751]: I1203 14:53:25.098121 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll\" (UID: \"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll" Dec 03 14:53:25 crc kubenswrapper[4751]: I1203 14:53:25.200697 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll\" (UID: \"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll" Dec 03 14:53:25 crc kubenswrapper[4751]: I1203 14:53:25.200819 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll\" (UID: \"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll" Dec 03 14:53:25 crc kubenswrapper[4751]: I1203 14:53:25.200923 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndpmw\" (UniqueName: \"kubernetes.io/projected/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-kube-api-access-ndpmw\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll\" (UID: \"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll" Dec 03 14:53:25 crc kubenswrapper[4751]: I1203 14:53:25.200985 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll\" (UID: \"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll" Dec 03 14:53:25 crc kubenswrapper[4751]: I1203 14:53:25.201057 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll\" (UID: \"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll" Dec 03 14:53:25 crc kubenswrapper[4751]: I1203 14:53:25.201160 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll\" (UID: \"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll" Dec 03 14:53:25 crc kubenswrapper[4751]: I1203 14:53:25.206583 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll\" (UID: 
\"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll" Dec 03 14:53:25 crc kubenswrapper[4751]: I1203 14:53:25.206879 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll\" (UID: \"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll" Dec 03 14:53:25 crc kubenswrapper[4751]: I1203 14:53:25.207001 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll\" (UID: \"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll" Dec 03 14:53:25 crc kubenswrapper[4751]: I1203 14:53:25.207206 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll\" (UID: \"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll" Dec 03 14:53:25 crc kubenswrapper[4751]: I1203 14:53:25.209352 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll\" (UID: \"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll" Dec 03 14:53:25 crc kubenswrapper[4751]: I1203 14:53:25.220603 4751 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndpmw\" (UniqueName: \"kubernetes.io/projected/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-kube-api-access-ndpmw\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll\" (UID: \"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll" Dec 03 14:53:25 crc kubenswrapper[4751]: I1203 14:53:25.246096 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll" Dec 03 14:53:25 crc kubenswrapper[4751]: I1203 14:53:25.848121 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll"] Dec 03 14:53:26 crc kubenswrapper[4751]: I1203 14:53:26.842513 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll" event={"ID":"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb","Type":"ContainerStarted","Data":"feb4c93159a8a10c5f52d435b1860d471197d0a82790153bb520f583d38a60e6"} Dec 03 14:53:26 crc kubenswrapper[4751]: I1203 14:53:26.842841 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll" event={"ID":"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb","Type":"ContainerStarted","Data":"e9e79941a2cd215bfe571c045b45c05c1546a26c417b8ae8ca4e3af66caf1bd2"} Dec 03 14:53:26 crc kubenswrapper[4751]: I1203 14:53:26.870962 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll" podStartSLOduration=2.378532288 podStartE2EDuration="2.870934933s" podCreationTimestamp="2025-12-03 14:53:24 +0000 UTC" firstStartedPulling="2025-12-03 14:53:25.851688029 +0000 UTC m=+2412.840043246" lastFinishedPulling="2025-12-03 14:53:26.344090644 +0000 UTC m=+2413.332445891" 
observedRunningTime="2025-12-03 14:53:26.860796511 +0000 UTC m=+2413.849151778" watchObservedRunningTime="2025-12-03 14:53:26.870934933 +0000 UTC m=+2413.859290170" Dec 03 14:53:35 crc kubenswrapper[4751]: I1203 14:53:35.819723 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:53:35 crc kubenswrapper[4751]: I1203 14:53:35.820420 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:53:35 crc kubenswrapper[4751]: I1203 14:53:35.820487 4751 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-djf67" Dec 03 14:53:35 crc kubenswrapper[4751]: I1203 14:53:35.821581 4751 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"af5d011ca6687515dae519cbee879a97f83824f387f5bdb764d620efd030477e"} pod="openshift-machine-config-operator/machine-config-daemon-djf67" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 14:53:35 crc kubenswrapper[4751]: I1203 14:53:35.821824 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" containerID="cri-o://af5d011ca6687515dae519cbee879a97f83824f387f5bdb764d620efd030477e" gracePeriod=600 Dec 03 14:53:35 crc kubenswrapper[4751]: E1203 
14:53:35.964860 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:53:36 crc kubenswrapper[4751]: I1203 14:53:36.947159 4751 generic.go:334] "Generic (PLEG): container finished" podID="385620eb-744d-423e-b02b-1274f3075689" containerID="af5d011ca6687515dae519cbee879a97f83824f387f5bdb764d620efd030477e" exitCode=0 Dec 03 14:53:36 crc kubenswrapper[4751]: I1203 14:53:36.947207 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerDied","Data":"af5d011ca6687515dae519cbee879a97f83824f387f5bdb764d620efd030477e"} Dec 03 14:53:36 crc kubenswrapper[4751]: I1203 14:53:36.947257 4751 scope.go:117] "RemoveContainer" containerID="9a6c556859f08944ddc255eca28ac397f02ede386d6011e2ad67e3baa1641a38" Dec 03 14:53:36 crc kubenswrapper[4751]: I1203 14:53:36.948208 4751 scope.go:117] "RemoveContainer" containerID="af5d011ca6687515dae519cbee879a97f83824f387f5bdb764d620efd030477e" Dec 03 14:53:36 crc kubenswrapper[4751]: E1203 14:53:36.948804 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:53:47 crc kubenswrapper[4751]: I1203 14:53:47.314254 4751 scope.go:117] "RemoveContainer" 
containerID="af5d011ca6687515dae519cbee879a97f83824f387f5bdb764d620efd030477e" Dec 03 14:53:47 crc kubenswrapper[4751]: E1203 14:53:47.315205 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:54:02 crc kubenswrapper[4751]: I1203 14:54:02.314458 4751 scope.go:117] "RemoveContainer" containerID="af5d011ca6687515dae519cbee879a97f83824f387f5bdb764d620efd030477e" Dec 03 14:54:02 crc kubenswrapper[4751]: E1203 14:54:02.316354 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:54:16 crc kubenswrapper[4751]: I1203 14:54:16.358298 4751 generic.go:334] "Generic (PLEG): container finished" podID="521c9f69-c59e-4b93-a1a2-ab687b7ee6eb" containerID="feb4c93159a8a10c5f52d435b1860d471197d0a82790153bb520f583d38a60e6" exitCode=0 Dec 03 14:54:16 crc kubenswrapper[4751]: I1203 14:54:16.358467 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll" event={"ID":"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb","Type":"ContainerDied","Data":"feb4c93159a8a10c5f52d435b1860d471197d0a82790153bb520f583d38a60e6"} Dec 03 14:54:17 crc kubenswrapper[4751]: I1203 14:54:17.314381 4751 scope.go:117] "RemoveContainer" 
containerID="af5d011ca6687515dae519cbee879a97f83824f387f5bdb764d620efd030477e" Dec 03 14:54:17 crc kubenswrapper[4751]: E1203 14:54:17.315074 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:54:17 crc kubenswrapper[4751]: I1203 14:54:17.868938 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll" Dec 03 14:54:17 crc kubenswrapper[4751]: I1203 14:54:17.919313 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-neutron-metadata-combined-ca-bundle\") pod \"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb\" (UID: \"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb\") " Dec 03 14:54:17 crc kubenswrapper[4751]: I1203 14:54:17.919460 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-inventory\") pod \"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb\" (UID: \"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb\") " Dec 03 14:54:17 crc kubenswrapper[4751]: I1203 14:54:17.919492 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-nova-metadata-neutron-config-0\") pod \"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb\" (UID: \"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb\") " Dec 03 14:54:17 crc kubenswrapper[4751]: I1203 14:54:17.919544 4751 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-ssh-key\") pod \"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb\" (UID: \"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb\") " Dec 03 14:54:17 crc kubenswrapper[4751]: I1203 14:54:17.919629 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndpmw\" (UniqueName: \"kubernetes.io/projected/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-kube-api-access-ndpmw\") pod \"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb\" (UID: \"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb\") " Dec 03 14:54:17 crc kubenswrapper[4751]: I1203 14:54:17.919651 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb\" (UID: \"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb\") " Dec 03 14:54:17 crc kubenswrapper[4751]: I1203 14:54:17.926072 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-kube-api-access-ndpmw" (OuterVolumeSpecName: "kube-api-access-ndpmw") pod "521c9f69-c59e-4b93-a1a2-ab687b7ee6eb" (UID: "521c9f69-c59e-4b93-a1a2-ab687b7ee6eb"). InnerVolumeSpecName "kube-api-access-ndpmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:54:17 crc kubenswrapper[4751]: I1203 14:54:17.926741 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "521c9f69-c59e-4b93-a1a2-ab687b7ee6eb" (UID: "521c9f69-c59e-4b93-a1a2-ab687b7ee6eb"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:54:17 crc kubenswrapper[4751]: I1203 14:54:17.949016 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "521c9f69-c59e-4b93-a1a2-ab687b7ee6eb" (UID: "521c9f69-c59e-4b93-a1a2-ab687b7ee6eb"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:54:17 crc kubenswrapper[4751]: I1203 14:54:17.951911 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "521c9f69-c59e-4b93-a1a2-ab687b7ee6eb" (UID: "521c9f69-c59e-4b93-a1a2-ab687b7ee6eb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:54:17 crc kubenswrapper[4751]: I1203 14:54:17.956144 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "521c9f69-c59e-4b93-a1a2-ab687b7ee6eb" (UID: "521c9f69-c59e-4b93-a1a2-ab687b7ee6eb"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:54:17 crc kubenswrapper[4751]: I1203 14:54:17.969975 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-inventory" (OuterVolumeSpecName: "inventory") pod "521c9f69-c59e-4b93-a1a2-ab687b7ee6eb" (UID: "521c9f69-c59e-4b93-a1a2-ab687b7ee6eb"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.029761 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndpmw\" (UniqueName: \"kubernetes.io/projected/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-kube-api-access-ndpmw\") on node \"crc\" DevicePath \"\"" Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.029814 4751 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.029838 4751 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.029858 4751 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.029877 4751 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.029895 4751 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/521c9f69-c59e-4b93-a1a2-ab687b7ee6eb-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.380183 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll" 
event={"ID":"521c9f69-c59e-4b93-a1a2-ab687b7ee6eb","Type":"ContainerDied","Data":"e9e79941a2cd215bfe571c045b45c05c1546a26c417b8ae8ca4e3af66caf1bd2"} Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.380266 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9e79941a2cd215bfe571c045b45c05c1546a26c417b8ae8ca4e3af66caf1bd2" Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.380320 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll" Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.500320 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn"] Dec 03 14:54:18 crc kubenswrapper[4751]: E1203 14:54:18.500843 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="521c9f69-c59e-4b93-a1a2-ab687b7ee6eb" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.500867 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="521c9f69-c59e-4b93-a1a2-ab687b7ee6eb" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.501108 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="521c9f69-c59e-4b93-a1a2-ab687b7ee6eb" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.501878 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn" Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.503904 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.504212 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.504506 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.504767 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xf9cf" Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.504975 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.515542 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn"] Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.540320 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjgt7\" (UniqueName: \"kubernetes.io/projected/607ac64e-604b-407d-9939-b8f2ba0832c5-kube-api-access-vjgt7\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn\" (UID: \"607ac64e-604b-407d-9939-b8f2ba0832c5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn" Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.540475 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/607ac64e-604b-407d-9939-b8f2ba0832c5-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn\" (UID: 
\"607ac64e-604b-407d-9939-b8f2ba0832c5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn" Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.540631 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/607ac64e-604b-407d-9939-b8f2ba0832c5-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn\" (UID: \"607ac64e-604b-407d-9939-b8f2ba0832c5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn" Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.540696 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/607ac64e-604b-407d-9939-b8f2ba0832c5-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn\" (UID: \"607ac64e-604b-407d-9939-b8f2ba0832c5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn" Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.540792 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/607ac64e-604b-407d-9939-b8f2ba0832c5-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn\" (UID: \"607ac64e-604b-407d-9939-b8f2ba0832c5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn" Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.642941 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjgt7\" (UniqueName: \"kubernetes.io/projected/607ac64e-604b-407d-9939-b8f2ba0832c5-kube-api-access-vjgt7\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn\" (UID: \"607ac64e-604b-407d-9939-b8f2ba0832c5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn" Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.643005 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/607ac64e-604b-407d-9939-b8f2ba0832c5-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn\" (UID: \"607ac64e-604b-407d-9939-b8f2ba0832c5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn" Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.643235 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/607ac64e-604b-407d-9939-b8f2ba0832c5-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn\" (UID: \"607ac64e-604b-407d-9939-b8f2ba0832c5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn" Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.643262 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/607ac64e-604b-407d-9939-b8f2ba0832c5-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn\" (UID: \"607ac64e-604b-407d-9939-b8f2ba0832c5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn" Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.643309 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/607ac64e-604b-407d-9939-b8f2ba0832c5-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn\" (UID: \"607ac64e-604b-407d-9939-b8f2ba0832c5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn" Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.647646 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/607ac64e-604b-407d-9939-b8f2ba0832c5-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn\" (UID: \"607ac64e-604b-407d-9939-b8f2ba0832c5\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn" Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.647689 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/607ac64e-604b-407d-9939-b8f2ba0832c5-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn\" (UID: \"607ac64e-604b-407d-9939-b8f2ba0832c5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn" Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.647902 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/607ac64e-604b-407d-9939-b8f2ba0832c5-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn\" (UID: \"607ac64e-604b-407d-9939-b8f2ba0832c5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn" Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.657127 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/607ac64e-604b-407d-9939-b8f2ba0832c5-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn\" (UID: \"607ac64e-604b-407d-9939-b8f2ba0832c5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn" Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.659092 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjgt7\" (UniqueName: \"kubernetes.io/projected/607ac64e-604b-407d-9939-b8f2ba0832c5-kube-api-access-vjgt7\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn\" (UID: \"607ac64e-604b-407d-9939-b8f2ba0832c5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn" Dec 03 14:54:18 crc kubenswrapper[4751]: I1203 14:54:18.827524 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn" Dec 03 14:54:19 crc kubenswrapper[4751]: I1203 14:54:19.365785 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn"] Dec 03 14:54:19 crc kubenswrapper[4751]: W1203 14:54:19.370424 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod607ac64e_604b_407d_9939_b8f2ba0832c5.slice/crio-48680e94a3feeddc0462600d00e0ccd3a5cefd296b0d57edf0178c5e5e871b0d WatchSource:0}: Error finding container 48680e94a3feeddc0462600d00e0ccd3a5cefd296b0d57edf0178c5e5e871b0d: Status 404 returned error can't find the container with id 48680e94a3feeddc0462600d00e0ccd3a5cefd296b0d57edf0178c5e5e871b0d Dec 03 14:54:19 crc kubenswrapper[4751]: I1203 14:54:19.416224 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn" event={"ID":"607ac64e-604b-407d-9939-b8f2ba0832c5","Type":"ContainerStarted","Data":"48680e94a3feeddc0462600d00e0ccd3a5cefd296b0d57edf0178c5e5e871b0d"} Dec 03 14:54:20 crc kubenswrapper[4751]: I1203 14:54:20.432129 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn" event={"ID":"607ac64e-604b-407d-9939-b8f2ba0832c5","Type":"ContainerStarted","Data":"d51758b8ef533e026fcf43ac76dbdf719d93b779254ec21132acf499942d7833"} Dec 03 14:54:20 crc kubenswrapper[4751]: I1203 14:54:20.453972 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn" podStartSLOduration=1.9308820519999998 podStartE2EDuration="2.45395164s" podCreationTimestamp="2025-12-03 14:54:18 +0000 UTC" firstStartedPulling="2025-12-03 14:54:19.373899084 +0000 UTC m=+2466.362254301" lastFinishedPulling="2025-12-03 14:54:19.896968662 +0000 UTC m=+2466.885323889" 
observedRunningTime="2025-12-03 14:54:20.448133343 +0000 UTC m=+2467.436488560" watchObservedRunningTime="2025-12-03 14:54:20.45395164 +0000 UTC m=+2467.442306857" Dec 03 14:54:31 crc kubenswrapper[4751]: I1203 14:54:31.314385 4751 scope.go:117] "RemoveContainer" containerID="af5d011ca6687515dae519cbee879a97f83824f387f5bdb764d620efd030477e" Dec 03 14:54:31 crc kubenswrapper[4751]: E1203 14:54:31.315221 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:54:46 crc kubenswrapper[4751]: I1203 14:54:46.313990 4751 scope.go:117] "RemoveContainer" containerID="af5d011ca6687515dae519cbee879a97f83824f387f5bdb764d620efd030477e" Dec 03 14:54:46 crc kubenswrapper[4751]: E1203 14:54:46.314884 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:54:55 crc kubenswrapper[4751]: I1203 14:54:55.663486 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-84ff798d87-5c96l" podUID="9c7e0fc7-03ed-4002-b460-df87d151f563" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Dec 03 14:54:57 crc kubenswrapper[4751]: I1203 14:54:57.313846 4751 scope.go:117] "RemoveContainer" containerID="af5d011ca6687515dae519cbee879a97f83824f387f5bdb764d620efd030477e" Dec 
03 14:54:57 crc kubenswrapper[4751]: E1203 14:54:57.314437 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:55:08 crc kubenswrapper[4751]: I1203 14:55:08.314901 4751 scope.go:117] "RemoveContainer" containerID="af5d011ca6687515dae519cbee879a97f83824f387f5bdb764d620efd030477e" Dec 03 14:55:08 crc kubenswrapper[4751]: E1203 14:55:08.315742 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:55:19 crc kubenswrapper[4751]: I1203 14:55:19.314610 4751 scope.go:117] "RemoveContainer" containerID="af5d011ca6687515dae519cbee879a97f83824f387f5bdb764d620efd030477e" Dec 03 14:55:19 crc kubenswrapper[4751]: E1203 14:55:19.315782 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:55:33 crc kubenswrapper[4751]: I1203 14:55:33.320081 4751 scope.go:117] "RemoveContainer" 
containerID="af5d011ca6687515dae519cbee879a97f83824f387f5bdb764d620efd030477e" Dec 03 14:55:33 crc kubenswrapper[4751]: E1203 14:55:33.321108 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:55:45 crc kubenswrapper[4751]: I1203 14:55:45.314456 4751 scope.go:117] "RemoveContainer" containerID="af5d011ca6687515dae519cbee879a97f83824f387f5bdb764d620efd030477e" Dec 03 14:55:45 crc kubenswrapper[4751]: E1203 14:55:45.315220 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:55:57 crc kubenswrapper[4751]: I1203 14:55:57.314292 4751 scope.go:117] "RemoveContainer" containerID="af5d011ca6687515dae519cbee879a97f83824f387f5bdb764d620efd030477e" Dec 03 14:55:57 crc kubenswrapper[4751]: E1203 14:55:57.315477 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:56:11 crc kubenswrapper[4751]: I1203 14:56:11.314564 4751 scope.go:117] 
"RemoveContainer" containerID="af5d011ca6687515dae519cbee879a97f83824f387f5bdb764d620efd030477e" Dec 03 14:56:11 crc kubenswrapper[4751]: E1203 14:56:11.315435 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:56:25 crc kubenswrapper[4751]: I1203 14:56:25.314475 4751 scope.go:117] "RemoveContainer" containerID="af5d011ca6687515dae519cbee879a97f83824f387f5bdb764d620efd030477e" Dec 03 14:56:25 crc kubenswrapper[4751]: E1203 14:56:25.315347 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:56:38 crc kubenswrapper[4751]: I1203 14:56:38.313750 4751 scope.go:117] "RemoveContainer" containerID="af5d011ca6687515dae519cbee879a97f83824f387f5bdb764d620efd030477e" Dec 03 14:56:38 crc kubenswrapper[4751]: E1203 14:56:38.314388 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:56:53 crc kubenswrapper[4751]: I1203 14:56:53.321898 
4751 scope.go:117] "RemoveContainer" containerID="af5d011ca6687515dae519cbee879a97f83824f387f5bdb764d620efd030477e" Dec 03 14:56:53 crc kubenswrapper[4751]: E1203 14:56:53.322698 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:57:08 crc kubenswrapper[4751]: I1203 14:57:08.314997 4751 scope.go:117] "RemoveContainer" containerID="af5d011ca6687515dae519cbee879a97f83824f387f5bdb764d620efd030477e" Dec 03 14:57:08 crc kubenswrapper[4751]: E1203 14:57:08.316013 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:57:16 crc kubenswrapper[4751]: I1203 14:57:16.805560 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dm6hs"] Dec 03 14:57:16 crc kubenswrapper[4751]: I1203 14:57:16.808677 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dm6hs" Dec 03 14:57:16 crc kubenswrapper[4751]: I1203 14:57:16.820400 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dm6hs"] Dec 03 14:57:16 crc kubenswrapper[4751]: I1203 14:57:16.936146 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7784175-a1e2-4f8d-81a1-821b914d08e0-utilities\") pod \"certified-operators-dm6hs\" (UID: \"c7784175-a1e2-4f8d-81a1-821b914d08e0\") " pod="openshift-marketplace/certified-operators-dm6hs" Dec 03 14:57:16 crc kubenswrapper[4751]: I1203 14:57:16.936246 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgk6n\" (UniqueName: \"kubernetes.io/projected/c7784175-a1e2-4f8d-81a1-821b914d08e0-kube-api-access-sgk6n\") pod \"certified-operators-dm6hs\" (UID: \"c7784175-a1e2-4f8d-81a1-821b914d08e0\") " pod="openshift-marketplace/certified-operators-dm6hs" Dec 03 14:57:16 crc kubenswrapper[4751]: I1203 14:57:16.936294 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7784175-a1e2-4f8d-81a1-821b914d08e0-catalog-content\") pod \"certified-operators-dm6hs\" (UID: \"c7784175-a1e2-4f8d-81a1-821b914d08e0\") " pod="openshift-marketplace/certified-operators-dm6hs" Dec 03 14:57:17 crc kubenswrapper[4751]: I1203 14:57:17.038308 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7784175-a1e2-4f8d-81a1-821b914d08e0-catalog-content\") pod \"certified-operators-dm6hs\" (UID: \"c7784175-a1e2-4f8d-81a1-821b914d08e0\") " pod="openshift-marketplace/certified-operators-dm6hs" Dec 03 14:57:17 crc kubenswrapper[4751]: I1203 14:57:17.038494 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7784175-a1e2-4f8d-81a1-821b914d08e0-utilities\") pod \"certified-operators-dm6hs\" (UID: \"c7784175-a1e2-4f8d-81a1-821b914d08e0\") " pod="openshift-marketplace/certified-operators-dm6hs" Dec 03 14:57:17 crc kubenswrapper[4751]: I1203 14:57:17.038579 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgk6n\" (UniqueName: \"kubernetes.io/projected/c7784175-a1e2-4f8d-81a1-821b914d08e0-kube-api-access-sgk6n\") pod \"certified-operators-dm6hs\" (UID: \"c7784175-a1e2-4f8d-81a1-821b914d08e0\") " pod="openshift-marketplace/certified-operators-dm6hs" Dec 03 14:57:17 crc kubenswrapper[4751]: I1203 14:57:17.038926 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7784175-a1e2-4f8d-81a1-821b914d08e0-catalog-content\") pod \"certified-operators-dm6hs\" (UID: \"c7784175-a1e2-4f8d-81a1-821b914d08e0\") " pod="openshift-marketplace/certified-operators-dm6hs" Dec 03 14:57:17 crc kubenswrapper[4751]: I1203 14:57:17.039120 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7784175-a1e2-4f8d-81a1-821b914d08e0-utilities\") pod \"certified-operators-dm6hs\" (UID: \"c7784175-a1e2-4f8d-81a1-821b914d08e0\") " pod="openshift-marketplace/certified-operators-dm6hs" Dec 03 14:57:17 crc kubenswrapper[4751]: I1203 14:57:17.063804 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgk6n\" (UniqueName: \"kubernetes.io/projected/c7784175-a1e2-4f8d-81a1-821b914d08e0-kube-api-access-sgk6n\") pod \"certified-operators-dm6hs\" (UID: \"c7784175-a1e2-4f8d-81a1-821b914d08e0\") " pod="openshift-marketplace/certified-operators-dm6hs" Dec 03 14:57:17 crc kubenswrapper[4751]: I1203 14:57:17.151969 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dm6hs" Dec 03 14:57:17 crc kubenswrapper[4751]: I1203 14:57:17.705756 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dm6hs"] Dec 03 14:57:18 crc kubenswrapper[4751]: I1203 14:57:18.368602 4751 generic.go:334] "Generic (PLEG): container finished" podID="c7784175-a1e2-4f8d-81a1-821b914d08e0" containerID="bc3e82840003cd58705c7094f030a39eb1478db40cc7afc398622b6117b3d39e" exitCode=0 Dec 03 14:57:18 crc kubenswrapper[4751]: I1203 14:57:18.368733 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dm6hs" event={"ID":"c7784175-a1e2-4f8d-81a1-821b914d08e0","Type":"ContainerDied","Data":"bc3e82840003cd58705c7094f030a39eb1478db40cc7afc398622b6117b3d39e"} Dec 03 14:57:18 crc kubenswrapper[4751]: I1203 14:57:18.368874 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dm6hs" event={"ID":"c7784175-a1e2-4f8d-81a1-821b914d08e0","Type":"ContainerStarted","Data":"30dcc2b00a69695edbf9bbcd70ebb5fd4b27a63f7b15f65ceb68c7a6e55e2185"} Dec 03 14:57:18 crc kubenswrapper[4751]: I1203 14:57:18.370491 4751 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 14:57:19 crc kubenswrapper[4751]: I1203 14:57:19.314483 4751 scope.go:117] "RemoveContainer" containerID="af5d011ca6687515dae519cbee879a97f83824f387f5bdb764d620efd030477e" Dec 03 14:57:19 crc kubenswrapper[4751]: E1203 14:57:19.315013 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 
14:57:19 crc kubenswrapper[4751]: I1203 14:57:19.385201 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dm6hs" event={"ID":"c7784175-a1e2-4f8d-81a1-821b914d08e0","Type":"ContainerStarted","Data":"f0ec52fb2fa6b60a6743d418fbe98dded02fd3d65112d860d75f7c947d2cc1d1"} Dec 03 14:57:20 crc kubenswrapper[4751]: I1203 14:57:20.398211 4751 generic.go:334] "Generic (PLEG): container finished" podID="c7784175-a1e2-4f8d-81a1-821b914d08e0" containerID="f0ec52fb2fa6b60a6743d418fbe98dded02fd3d65112d860d75f7c947d2cc1d1" exitCode=0 Dec 03 14:57:20 crc kubenswrapper[4751]: I1203 14:57:20.398315 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dm6hs" event={"ID":"c7784175-a1e2-4f8d-81a1-821b914d08e0","Type":"ContainerDied","Data":"f0ec52fb2fa6b60a6743d418fbe98dded02fd3d65112d860d75f7c947d2cc1d1"} Dec 03 14:57:21 crc kubenswrapper[4751]: I1203 14:57:21.448484 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dm6hs" event={"ID":"c7784175-a1e2-4f8d-81a1-821b914d08e0","Type":"ContainerStarted","Data":"2705fbd7a775d90476a96a67f381191a59e708eec8eb059dbee4e186bdf903f7"} Dec 03 14:57:21 crc kubenswrapper[4751]: I1203 14:57:21.468857 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dm6hs" podStartSLOduration=2.798153134 podStartE2EDuration="5.468841996s" podCreationTimestamp="2025-12-03 14:57:16 +0000 UTC" firstStartedPulling="2025-12-03 14:57:18.37025872 +0000 UTC m=+2645.358613937" lastFinishedPulling="2025-12-03 14:57:21.040947572 +0000 UTC m=+2648.029302799" observedRunningTime="2025-12-03 14:57:21.466643537 +0000 UTC m=+2648.454998784" watchObservedRunningTime="2025-12-03 14:57:21.468841996 +0000 UTC m=+2648.457197203" Dec 03 14:57:27 crc kubenswrapper[4751]: I1203 14:57:27.153482 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-dm6hs" Dec 03 14:57:27 crc kubenswrapper[4751]: I1203 14:57:27.153966 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dm6hs" Dec 03 14:57:27 crc kubenswrapper[4751]: I1203 14:57:27.230417 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dm6hs" Dec 03 14:57:27 crc kubenswrapper[4751]: I1203 14:57:27.572776 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dm6hs" Dec 03 14:57:27 crc kubenswrapper[4751]: I1203 14:57:27.634877 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dm6hs"] Dec 03 14:57:29 crc kubenswrapper[4751]: I1203 14:57:29.539776 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dm6hs" podUID="c7784175-a1e2-4f8d-81a1-821b914d08e0" containerName="registry-server" containerID="cri-o://2705fbd7a775d90476a96a67f381191a59e708eec8eb059dbee4e186bdf903f7" gracePeriod=2 Dec 03 14:57:31 crc kubenswrapper[4751]: I1203 14:57:31.301857 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dm6hs" Dec 03 14:57:31 crc kubenswrapper[4751]: I1203 14:57:31.443480 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7784175-a1e2-4f8d-81a1-821b914d08e0-utilities\") pod \"c7784175-a1e2-4f8d-81a1-821b914d08e0\" (UID: \"c7784175-a1e2-4f8d-81a1-821b914d08e0\") " Dec 03 14:57:31 crc kubenswrapper[4751]: I1203 14:57:31.443530 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgk6n\" (UniqueName: \"kubernetes.io/projected/c7784175-a1e2-4f8d-81a1-821b914d08e0-kube-api-access-sgk6n\") pod \"c7784175-a1e2-4f8d-81a1-821b914d08e0\" (UID: \"c7784175-a1e2-4f8d-81a1-821b914d08e0\") " Dec 03 14:57:31 crc kubenswrapper[4751]: I1203 14:57:31.443556 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7784175-a1e2-4f8d-81a1-821b914d08e0-catalog-content\") pod \"c7784175-a1e2-4f8d-81a1-821b914d08e0\" (UID: \"c7784175-a1e2-4f8d-81a1-821b914d08e0\") " Dec 03 14:57:31 crc kubenswrapper[4751]: I1203 14:57:31.444610 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7784175-a1e2-4f8d-81a1-821b914d08e0-utilities" (OuterVolumeSpecName: "utilities") pod "c7784175-a1e2-4f8d-81a1-821b914d08e0" (UID: "c7784175-a1e2-4f8d-81a1-821b914d08e0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:57:31 crc kubenswrapper[4751]: I1203 14:57:31.450590 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7784175-a1e2-4f8d-81a1-821b914d08e0-kube-api-access-sgk6n" (OuterVolumeSpecName: "kube-api-access-sgk6n") pod "c7784175-a1e2-4f8d-81a1-821b914d08e0" (UID: "c7784175-a1e2-4f8d-81a1-821b914d08e0"). InnerVolumeSpecName "kube-api-access-sgk6n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:57:31 crc kubenswrapper[4751]: I1203 14:57:31.495769 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7784175-a1e2-4f8d-81a1-821b914d08e0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c7784175-a1e2-4f8d-81a1-821b914d08e0" (UID: "c7784175-a1e2-4f8d-81a1-821b914d08e0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:57:31 crc kubenswrapper[4751]: I1203 14:57:31.546699 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7784175-a1e2-4f8d-81a1-821b914d08e0-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:57:31 crc kubenswrapper[4751]: I1203 14:57:31.546746 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgk6n\" (UniqueName: \"kubernetes.io/projected/c7784175-a1e2-4f8d-81a1-821b914d08e0-kube-api-access-sgk6n\") on node \"crc\" DevicePath \"\"" Dec 03 14:57:31 crc kubenswrapper[4751]: I1203 14:57:31.546761 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7784175-a1e2-4f8d-81a1-821b914d08e0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:57:31 crc kubenswrapper[4751]: I1203 14:57:31.650270 4751 generic.go:334] "Generic (PLEG): container finished" podID="c7784175-a1e2-4f8d-81a1-821b914d08e0" containerID="2705fbd7a775d90476a96a67f381191a59e708eec8eb059dbee4e186bdf903f7" exitCode=0 Dec 03 14:57:31 crc kubenswrapper[4751]: I1203 14:57:31.650320 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dm6hs" Dec 03 14:57:31 crc kubenswrapper[4751]: I1203 14:57:31.650359 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dm6hs" event={"ID":"c7784175-a1e2-4f8d-81a1-821b914d08e0","Type":"ContainerDied","Data":"2705fbd7a775d90476a96a67f381191a59e708eec8eb059dbee4e186bdf903f7"} Dec 03 14:57:31 crc kubenswrapper[4751]: I1203 14:57:31.650610 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dm6hs" event={"ID":"c7784175-a1e2-4f8d-81a1-821b914d08e0","Type":"ContainerDied","Data":"30dcc2b00a69695edbf9bbcd70ebb5fd4b27a63f7b15f65ceb68c7a6e55e2185"} Dec 03 14:57:31 crc kubenswrapper[4751]: I1203 14:57:31.650674 4751 scope.go:117] "RemoveContainer" containerID="2705fbd7a775d90476a96a67f381191a59e708eec8eb059dbee4e186bdf903f7" Dec 03 14:57:31 crc kubenswrapper[4751]: I1203 14:57:31.698514 4751 scope.go:117] "RemoveContainer" containerID="f0ec52fb2fa6b60a6743d418fbe98dded02fd3d65112d860d75f7c947d2cc1d1" Dec 03 14:57:31 crc kubenswrapper[4751]: I1203 14:57:31.728380 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dm6hs"] Dec 03 14:57:31 crc kubenswrapper[4751]: I1203 14:57:31.740475 4751 scope.go:117] "RemoveContainer" containerID="bc3e82840003cd58705c7094f030a39eb1478db40cc7afc398622b6117b3d39e" Dec 03 14:57:31 crc kubenswrapper[4751]: I1203 14:57:31.740899 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dm6hs"] Dec 03 14:57:31 crc kubenswrapper[4751]: I1203 14:57:31.803018 4751 scope.go:117] "RemoveContainer" containerID="2705fbd7a775d90476a96a67f381191a59e708eec8eb059dbee4e186bdf903f7" Dec 03 14:57:31 crc kubenswrapper[4751]: E1203 14:57:31.806785 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2705fbd7a775d90476a96a67f381191a59e708eec8eb059dbee4e186bdf903f7\": container with ID starting with 2705fbd7a775d90476a96a67f381191a59e708eec8eb059dbee4e186bdf903f7 not found: ID does not exist" containerID="2705fbd7a775d90476a96a67f381191a59e708eec8eb059dbee4e186bdf903f7" Dec 03 14:57:31 crc kubenswrapper[4751]: I1203 14:57:31.806841 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2705fbd7a775d90476a96a67f381191a59e708eec8eb059dbee4e186bdf903f7"} err="failed to get container status \"2705fbd7a775d90476a96a67f381191a59e708eec8eb059dbee4e186bdf903f7\": rpc error: code = NotFound desc = could not find container \"2705fbd7a775d90476a96a67f381191a59e708eec8eb059dbee4e186bdf903f7\": container with ID starting with 2705fbd7a775d90476a96a67f381191a59e708eec8eb059dbee4e186bdf903f7 not found: ID does not exist" Dec 03 14:57:31 crc kubenswrapper[4751]: I1203 14:57:31.806871 4751 scope.go:117] "RemoveContainer" containerID="f0ec52fb2fa6b60a6743d418fbe98dded02fd3d65112d860d75f7c947d2cc1d1" Dec 03 14:57:31 crc kubenswrapper[4751]: E1203 14:57:31.807214 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0ec52fb2fa6b60a6743d418fbe98dded02fd3d65112d860d75f7c947d2cc1d1\": container with ID starting with f0ec52fb2fa6b60a6743d418fbe98dded02fd3d65112d860d75f7c947d2cc1d1 not found: ID does not exist" containerID="f0ec52fb2fa6b60a6743d418fbe98dded02fd3d65112d860d75f7c947d2cc1d1" Dec 03 14:57:31 crc kubenswrapper[4751]: I1203 14:57:31.807240 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0ec52fb2fa6b60a6743d418fbe98dded02fd3d65112d860d75f7c947d2cc1d1"} err="failed to get container status \"f0ec52fb2fa6b60a6743d418fbe98dded02fd3d65112d860d75f7c947d2cc1d1\": rpc error: code = NotFound desc = could not find container \"f0ec52fb2fa6b60a6743d418fbe98dded02fd3d65112d860d75f7c947d2cc1d1\": container with ID 
starting with f0ec52fb2fa6b60a6743d418fbe98dded02fd3d65112d860d75f7c947d2cc1d1 not found: ID does not exist" Dec 03 14:57:31 crc kubenswrapper[4751]: I1203 14:57:31.807254 4751 scope.go:117] "RemoveContainer" containerID="bc3e82840003cd58705c7094f030a39eb1478db40cc7afc398622b6117b3d39e" Dec 03 14:57:31 crc kubenswrapper[4751]: E1203 14:57:31.807485 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc3e82840003cd58705c7094f030a39eb1478db40cc7afc398622b6117b3d39e\": container with ID starting with bc3e82840003cd58705c7094f030a39eb1478db40cc7afc398622b6117b3d39e not found: ID does not exist" containerID="bc3e82840003cd58705c7094f030a39eb1478db40cc7afc398622b6117b3d39e" Dec 03 14:57:31 crc kubenswrapper[4751]: I1203 14:57:31.807510 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc3e82840003cd58705c7094f030a39eb1478db40cc7afc398622b6117b3d39e"} err="failed to get container status \"bc3e82840003cd58705c7094f030a39eb1478db40cc7afc398622b6117b3d39e\": rpc error: code = NotFound desc = could not find container \"bc3e82840003cd58705c7094f030a39eb1478db40cc7afc398622b6117b3d39e\": container with ID starting with bc3e82840003cd58705c7094f030a39eb1478db40cc7afc398622b6117b3d39e not found: ID does not exist" Dec 03 14:57:31 crc kubenswrapper[4751]: E1203 14:57:31.877806 4751 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7784175_a1e2_4f8d_81a1_821b914d08e0.slice/crio-30dcc2b00a69695edbf9bbcd70ebb5fd4b27a63f7b15f65ceb68c7a6e55e2185\": RecentStats: unable to find data in memory cache]" Dec 03 14:57:33 crc kubenswrapper[4751]: I1203 14:57:33.326960 4751 scope.go:117] "RemoveContainer" containerID="af5d011ca6687515dae519cbee879a97f83824f387f5bdb764d620efd030477e" Dec 03 14:57:33 crc kubenswrapper[4751]: E1203 14:57:33.327700 4751 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:57:33 crc kubenswrapper[4751]: I1203 14:57:33.334049 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7784175-a1e2-4f8d-81a1-821b914d08e0" path="/var/lib/kubelet/pods/c7784175-a1e2-4f8d-81a1-821b914d08e0/volumes" Dec 03 14:57:47 crc kubenswrapper[4751]: I1203 14:57:47.313975 4751 scope.go:117] "RemoveContainer" containerID="af5d011ca6687515dae519cbee879a97f83824f387f5bdb764d620efd030477e" Dec 03 14:57:47 crc kubenswrapper[4751]: E1203 14:57:47.314752 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:58:01 crc kubenswrapper[4751]: I1203 14:58:01.314432 4751 scope.go:117] "RemoveContainer" containerID="af5d011ca6687515dae519cbee879a97f83824f387f5bdb764d620efd030477e" Dec 03 14:58:01 crc kubenswrapper[4751]: E1203 14:58:01.315120 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" 
podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:58:16 crc kubenswrapper[4751]: I1203 14:58:16.314708 4751 scope.go:117] "RemoveContainer" containerID="af5d011ca6687515dae519cbee879a97f83824f387f5bdb764d620efd030477e" Dec 03 14:58:16 crc kubenswrapper[4751]: E1203 14:58:16.316004 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:58:31 crc kubenswrapper[4751]: I1203 14:58:31.314286 4751 scope.go:117] "RemoveContainer" containerID="af5d011ca6687515dae519cbee879a97f83824f387f5bdb764d620efd030477e" Dec 03 14:58:31 crc kubenswrapper[4751]: E1203 14:58:31.315512 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 14:58:41 crc kubenswrapper[4751]: I1203 14:58:41.386664 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xxn82"] Dec 03 14:58:41 crc kubenswrapper[4751]: E1203 14:58:41.387593 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7784175-a1e2-4f8d-81a1-821b914d08e0" containerName="registry-server" Dec 03 14:58:41 crc kubenswrapper[4751]: I1203 14:58:41.387608 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7784175-a1e2-4f8d-81a1-821b914d08e0" containerName="registry-server" Dec 03 14:58:41 crc kubenswrapper[4751]: 
E1203 14:58:41.387622 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7784175-a1e2-4f8d-81a1-821b914d08e0" containerName="extract-utilities" Dec 03 14:58:41 crc kubenswrapper[4751]: I1203 14:58:41.387629 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7784175-a1e2-4f8d-81a1-821b914d08e0" containerName="extract-utilities" Dec 03 14:58:41 crc kubenswrapper[4751]: E1203 14:58:41.387656 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7784175-a1e2-4f8d-81a1-821b914d08e0" containerName="extract-content" Dec 03 14:58:41 crc kubenswrapper[4751]: I1203 14:58:41.387663 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7784175-a1e2-4f8d-81a1-821b914d08e0" containerName="extract-content" Dec 03 14:58:41 crc kubenswrapper[4751]: I1203 14:58:41.387889 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7784175-a1e2-4f8d-81a1-821b914d08e0" containerName="registry-server" Dec 03 14:58:41 crc kubenswrapper[4751]: I1203 14:58:41.389567 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xxn82" Dec 03 14:58:41 crc kubenswrapper[4751]: I1203 14:58:41.398869 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xxn82"] Dec 03 14:58:41 crc kubenswrapper[4751]: I1203 14:58:41.473903 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/063f705f-e754-44f2-9132-091b1438e3d1-catalog-content\") pod \"community-operators-xxn82\" (UID: \"063f705f-e754-44f2-9132-091b1438e3d1\") " pod="openshift-marketplace/community-operators-xxn82" Dec 03 14:58:41 crc kubenswrapper[4751]: I1203 14:58:41.474074 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dctvh\" (UniqueName: \"kubernetes.io/projected/063f705f-e754-44f2-9132-091b1438e3d1-kube-api-access-dctvh\") pod \"community-operators-xxn82\" (UID: \"063f705f-e754-44f2-9132-091b1438e3d1\") " pod="openshift-marketplace/community-operators-xxn82" Dec 03 14:58:41 crc kubenswrapper[4751]: I1203 14:58:41.474101 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/063f705f-e754-44f2-9132-091b1438e3d1-utilities\") pod \"community-operators-xxn82\" (UID: \"063f705f-e754-44f2-9132-091b1438e3d1\") " pod="openshift-marketplace/community-operators-xxn82" Dec 03 14:58:41 crc kubenswrapper[4751]: I1203 14:58:41.575405 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/063f705f-e754-44f2-9132-091b1438e3d1-catalog-content\") pod \"community-operators-xxn82\" (UID: \"063f705f-e754-44f2-9132-091b1438e3d1\") " pod="openshift-marketplace/community-operators-xxn82" Dec 03 14:58:41 crc kubenswrapper[4751]: I1203 14:58:41.575603 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dctvh\" (UniqueName: \"kubernetes.io/projected/063f705f-e754-44f2-9132-091b1438e3d1-kube-api-access-dctvh\") pod \"community-operators-xxn82\" (UID: \"063f705f-e754-44f2-9132-091b1438e3d1\") " pod="openshift-marketplace/community-operators-xxn82" Dec 03 14:58:41 crc kubenswrapper[4751]: I1203 14:58:41.575635 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/063f705f-e754-44f2-9132-091b1438e3d1-utilities\") pod \"community-operators-xxn82\" (UID: \"063f705f-e754-44f2-9132-091b1438e3d1\") " pod="openshift-marketplace/community-operators-xxn82" Dec 03 14:58:41 crc kubenswrapper[4751]: I1203 14:58:41.576375 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/063f705f-e754-44f2-9132-091b1438e3d1-utilities\") pod \"community-operators-xxn82\" (UID: \"063f705f-e754-44f2-9132-091b1438e3d1\") " pod="openshift-marketplace/community-operators-xxn82" Dec 03 14:58:41 crc kubenswrapper[4751]: I1203 14:58:41.576608 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/063f705f-e754-44f2-9132-091b1438e3d1-catalog-content\") pod \"community-operators-xxn82\" (UID: \"063f705f-e754-44f2-9132-091b1438e3d1\") " pod="openshift-marketplace/community-operators-xxn82" Dec 03 14:58:41 crc kubenswrapper[4751]: I1203 14:58:41.600788 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dctvh\" (UniqueName: \"kubernetes.io/projected/063f705f-e754-44f2-9132-091b1438e3d1-kube-api-access-dctvh\") pod \"community-operators-xxn82\" (UID: \"063f705f-e754-44f2-9132-091b1438e3d1\") " pod="openshift-marketplace/community-operators-xxn82" Dec 03 14:58:41 crc kubenswrapper[4751]: I1203 14:58:41.709011 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xxn82" Dec 03 14:58:42 crc kubenswrapper[4751]: I1203 14:58:42.294691 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xxn82"] Dec 03 14:58:42 crc kubenswrapper[4751]: I1203 14:58:42.487773 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xxn82" event={"ID":"063f705f-e754-44f2-9132-091b1438e3d1","Type":"ContainerStarted","Data":"05addc3f7c6cf1f8cd8b3864317d41e301ce29a44a49b0daafe76513e0817728"} Dec 03 14:58:43 crc kubenswrapper[4751]: I1203 14:58:43.325464 4751 scope.go:117] "RemoveContainer" containerID="af5d011ca6687515dae519cbee879a97f83824f387f5bdb764d620efd030477e" Dec 03 14:58:43 crc kubenswrapper[4751]: I1203 14:58:43.499717 4751 generic.go:334] "Generic (PLEG): container finished" podID="063f705f-e754-44f2-9132-091b1438e3d1" containerID="8947279a589233565b6f20e7f47065b0d0646b557850a6dacc348ed2fa7c850c" exitCode=0 Dec 03 14:58:43 crc kubenswrapper[4751]: I1203 14:58:43.499809 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xxn82" event={"ID":"063f705f-e754-44f2-9132-091b1438e3d1","Type":"ContainerDied","Data":"8947279a589233565b6f20e7f47065b0d0646b557850a6dacc348ed2fa7c850c"} Dec 03 14:58:44 crc kubenswrapper[4751]: I1203 14:58:44.511870 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xxn82" event={"ID":"063f705f-e754-44f2-9132-091b1438e3d1","Type":"ContainerStarted","Data":"fd054ea483f718b65b8ff91a7936653ba084187e187cd307b8d4195357c51f94"} Dec 03 14:58:44 crc kubenswrapper[4751]: I1203 14:58:44.514593 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerStarted","Data":"ba0487b3f47045f8ec369ac1580a04dfd70d8b12f7a6308ce9834ccd20eec3be"} Dec 
03 14:58:45 crc kubenswrapper[4751]: I1203 14:58:45.530554 4751 generic.go:334] "Generic (PLEG): container finished" podID="063f705f-e754-44f2-9132-091b1438e3d1" containerID="fd054ea483f718b65b8ff91a7936653ba084187e187cd307b8d4195357c51f94" exitCode=0 Dec 03 14:58:45 crc kubenswrapper[4751]: I1203 14:58:45.531775 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xxn82" event={"ID":"063f705f-e754-44f2-9132-091b1438e3d1","Type":"ContainerDied","Data":"fd054ea483f718b65b8ff91a7936653ba084187e187cd307b8d4195357c51f94"} Dec 03 14:58:46 crc kubenswrapper[4751]: I1203 14:58:46.543372 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xxn82" event={"ID":"063f705f-e754-44f2-9132-091b1438e3d1","Type":"ContainerStarted","Data":"596bdf950b70311f074936d9ba03ca6c32666db00f72e50105df90ac552308e9"} Dec 03 14:58:46 crc kubenswrapper[4751]: I1203 14:58:46.564135 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xxn82" podStartSLOduration=3.113604901 podStartE2EDuration="5.564116375s" podCreationTimestamp="2025-12-03 14:58:41 +0000 UTC" firstStartedPulling="2025-12-03 14:58:43.501704301 +0000 UTC m=+2730.490059518" lastFinishedPulling="2025-12-03 14:58:45.952215775 +0000 UTC m=+2732.940570992" observedRunningTime="2025-12-03 14:58:46.563471378 +0000 UTC m=+2733.551826605" watchObservedRunningTime="2025-12-03 14:58:46.564116375 +0000 UTC m=+2733.552471602" Dec 03 14:58:50 crc kubenswrapper[4751]: I1203 14:58:50.591851 4751 generic.go:334] "Generic (PLEG): container finished" podID="607ac64e-604b-407d-9939-b8f2ba0832c5" containerID="d51758b8ef533e026fcf43ac76dbdf719d93b779254ec21132acf499942d7833" exitCode=0 Dec 03 14:58:50 crc kubenswrapper[4751]: I1203 14:58:50.592003 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn" 
event={"ID":"607ac64e-604b-407d-9939-b8f2ba0832c5","Type":"ContainerDied","Data":"d51758b8ef533e026fcf43ac76dbdf719d93b779254ec21132acf499942d7833"} Dec 03 14:58:51 crc kubenswrapper[4751]: I1203 14:58:51.710157 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xxn82" Dec 03 14:58:51 crc kubenswrapper[4751]: I1203 14:58:51.710534 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xxn82" Dec 03 14:58:51 crc kubenswrapper[4751]: I1203 14:58:51.776054 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xxn82" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.193965 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.342892 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/607ac64e-604b-407d-9939-b8f2ba0832c5-libvirt-combined-ca-bundle\") pod \"607ac64e-604b-407d-9939-b8f2ba0832c5\" (UID: \"607ac64e-604b-407d-9939-b8f2ba0832c5\") " Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.343013 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/607ac64e-604b-407d-9939-b8f2ba0832c5-libvirt-secret-0\") pod \"607ac64e-604b-407d-9939-b8f2ba0832c5\" (UID: \"607ac64e-604b-407d-9939-b8f2ba0832c5\") " Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.343052 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/607ac64e-604b-407d-9939-b8f2ba0832c5-ssh-key\") pod \"607ac64e-604b-407d-9939-b8f2ba0832c5\" (UID: \"607ac64e-604b-407d-9939-b8f2ba0832c5\") " 
Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.343319 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/607ac64e-604b-407d-9939-b8f2ba0832c5-inventory\") pod \"607ac64e-604b-407d-9939-b8f2ba0832c5\" (UID: \"607ac64e-604b-407d-9939-b8f2ba0832c5\") " Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.343414 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjgt7\" (UniqueName: \"kubernetes.io/projected/607ac64e-604b-407d-9939-b8f2ba0832c5-kube-api-access-vjgt7\") pod \"607ac64e-604b-407d-9939-b8f2ba0832c5\" (UID: \"607ac64e-604b-407d-9939-b8f2ba0832c5\") " Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.348943 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/607ac64e-604b-407d-9939-b8f2ba0832c5-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "607ac64e-604b-407d-9939-b8f2ba0832c5" (UID: "607ac64e-604b-407d-9939-b8f2ba0832c5"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.349044 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/607ac64e-604b-407d-9939-b8f2ba0832c5-kube-api-access-vjgt7" (OuterVolumeSpecName: "kube-api-access-vjgt7") pod "607ac64e-604b-407d-9939-b8f2ba0832c5" (UID: "607ac64e-604b-407d-9939-b8f2ba0832c5"). InnerVolumeSpecName "kube-api-access-vjgt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.372643 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/607ac64e-604b-407d-9939-b8f2ba0832c5-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "607ac64e-604b-407d-9939-b8f2ba0832c5" (UID: "607ac64e-604b-407d-9939-b8f2ba0832c5"). 
InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.381907 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/607ac64e-604b-407d-9939-b8f2ba0832c5-inventory" (OuterVolumeSpecName: "inventory") pod "607ac64e-604b-407d-9939-b8f2ba0832c5" (UID: "607ac64e-604b-407d-9939-b8f2ba0832c5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.382909 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/607ac64e-604b-407d-9939-b8f2ba0832c5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "607ac64e-604b-407d-9939-b8f2ba0832c5" (UID: "607ac64e-604b-407d-9939-b8f2ba0832c5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.447108 4751 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/607ac64e-604b-407d-9939-b8f2ba0832c5-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.447147 4751 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/607ac64e-604b-407d-9939-b8f2ba0832c5-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.447161 4751 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/607ac64e-604b-407d-9939-b8f2ba0832c5-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.447173 4751 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/607ac64e-604b-407d-9939-b8f2ba0832c5-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 
14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.447186 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjgt7\" (UniqueName: \"kubernetes.io/projected/607ac64e-604b-407d-9939-b8f2ba0832c5-kube-api-access-vjgt7\") on node \"crc\" DevicePath \"\"" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.620343 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn" event={"ID":"607ac64e-604b-407d-9939-b8f2ba0832c5","Type":"ContainerDied","Data":"48680e94a3feeddc0462600d00e0ccd3a5cefd296b0d57edf0178c5e5e871b0d"} Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.620377 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.620398 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48680e94a3feeddc0462600d00e0ccd3a5cefd296b0d57edf0178c5e5e871b0d" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.673472 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xxn82" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.727250 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng"] Dec 03 14:58:52 crc kubenswrapper[4751]: E1203 14:58:52.728068 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="607ac64e-604b-407d-9939-b8f2ba0832c5" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.728084 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="607ac64e-604b-407d-9939-b8f2ba0832c5" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.728276 4751 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="607ac64e-604b-407d-9939-b8f2ba0832c5" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.729268 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.735367 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.735432 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.735513 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.735539 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.735642 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.735715 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.742526 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xf9cf" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.751079 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng"] Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.756561 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vgvng\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.756653 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vgvng\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.756712 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vgvng\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.756744 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xftht\" (UniqueName: \"kubernetes.io/projected/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-kube-api-access-xftht\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vgvng\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.756783 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vgvng\" (UID: 
\"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.756892 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vgvng\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.756945 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vgvng\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.757008 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vgvng\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.757302 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vgvng\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.759482 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-xxn82"] Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.859279 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vgvng\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.859399 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vgvng\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.859527 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vgvng\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.859565 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vgvng\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.859620 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vgvng\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.859667 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vgvng\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.859698 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xftht\" (UniqueName: \"kubernetes.io/projected/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-kube-api-access-xftht\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vgvng\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.859732 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vgvng\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.859814 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vgvng\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" Dec 03 
14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.860802 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vgvng\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.863412 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vgvng\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.865170 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vgvng\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.867588 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vgvng\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.873553 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-combined-ca-bundle\") 
pod \"nova-edpm-deployment-openstack-edpm-ipam-vgvng\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.878018 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vgvng\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.878237 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vgvng\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.878402 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vgvng\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" Dec 03 14:58:52 crc kubenswrapper[4751]: I1203 14:58:52.881110 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xftht\" (UniqueName: \"kubernetes.io/projected/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-kube-api-access-xftht\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vgvng\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" Dec 03 14:58:53 crc kubenswrapper[4751]: I1203 14:58:53.060942 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" Dec 03 14:58:53 crc kubenswrapper[4751]: I1203 14:58:53.644221 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng"] Dec 03 14:58:54 crc kubenswrapper[4751]: I1203 14:58:54.650135 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" event={"ID":"a2deeaad-edf9-4d9c-b116-9a31587b1b2a","Type":"ContainerStarted","Data":"3832d60587069e6f111f15b8a84ba0b61f594a5979240c0fd9ca70941628d310"} Dec 03 14:58:54 crc kubenswrapper[4751]: I1203 14:58:54.650504 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" event={"ID":"a2deeaad-edf9-4d9c-b116-9a31587b1b2a","Type":"ContainerStarted","Data":"b4719582d29bd43ce626e235062d7699e23ff3c24a4cd4a3c2c178c5c76aa93d"} Dec 03 14:58:54 crc kubenswrapper[4751]: I1203 14:58:54.650363 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xxn82" podUID="063f705f-e754-44f2-9132-091b1438e3d1" containerName="registry-server" containerID="cri-o://596bdf950b70311f074936d9ba03ca6c32666db00f72e50105df90ac552308e9" gracePeriod=2 Dec 03 14:58:54 crc kubenswrapper[4751]: I1203 14:58:54.678627 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" podStartSLOduration=2.218605266 podStartE2EDuration="2.678606811s" podCreationTimestamp="2025-12-03 14:58:52 +0000 UTC" firstStartedPulling="2025-12-03 14:58:53.654535848 +0000 UTC m=+2740.642891075" lastFinishedPulling="2025-12-03 14:58:54.114537403 +0000 UTC m=+2741.102892620" observedRunningTime="2025-12-03 14:58:54.672832796 +0000 UTC m=+2741.661188023" watchObservedRunningTime="2025-12-03 14:58:54.678606811 +0000 UTC m=+2741.666962038" Dec 03 14:58:55 crc kubenswrapper[4751]: I1203 
14:58:55.664285 4751 generic.go:334] "Generic (PLEG): container finished" podID="063f705f-e754-44f2-9132-091b1438e3d1" containerID="596bdf950b70311f074936d9ba03ca6c32666db00f72e50105df90ac552308e9" exitCode=0 Dec 03 14:58:55 crc kubenswrapper[4751]: I1203 14:58:55.664368 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xxn82" event={"ID":"063f705f-e754-44f2-9132-091b1438e3d1","Type":"ContainerDied","Data":"596bdf950b70311f074936d9ba03ca6c32666db00f72e50105df90ac552308e9"} Dec 03 14:58:55 crc kubenswrapper[4751]: I1203 14:58:55.868249 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xxn82" Dec 03 14:58:55 crc kubenswrapper[4751]: I1203 14:58:55.943923 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dctvh\" (UniqueName: \"kubernetes.io/projected/063f705f-e754-44f2-9132-091b1438e3d1-kube-api-access-dctvh\") pod \"063f705f-e754-44f2-9132-091b1438e3d1\" (UID: \"063f705f-e754-44f2-9132-091b1438e3d1\") " Dec 03 14:58:55 crc kubenswrapper[4751]: I1203 14:58:55.944084 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/063f705f-e754-44f2-9132-091b1438e3d1-utilities\") pod \"063f705f-e754-44f2-9132-091b1438e3d1\" (UID: \"063f705f-e754-44f2-9132-091b1438e3d1\") " Dec 03 14:58:55 crc kubenswrapper[4751]: I1203 14:58:55.944201 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/063f705f-e754-44f2-9132-091b1438e3d1-catalog-content\") pod \"063f705f-e754-44f2-9132-091b1438e3d1\" (UID: \"063f705f-e754-44f2-9132-091b1438e3d1\") " Dec 03 14:58:55 crc kubenswrapper[4751]: I1203 14:58:55.945034 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/063f705f-e754-44f2-9132-091b1438e3d1-utilities" (OuterVolumeSpecName: "utilities") pod "063f705f-e754-44f2-9132-091b1438e3d1" (UID: "063f705f-e754-44f2-9132-091b1438e3d1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:58:55 crc kubenswrapper[4751]: I1203 14:58:55.953642 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/063f705f-e754-44f2-9132-091b1438e3d1-kube-api-access-dctvh" (OuterVolumeSpecName: "kube-api-access-dctvh") pod "063f705f-e754-44f2-9132-091b1438e3d1" (UID: "063f705f-e754-44f2-9132-091b1438e3d1"). InnerVolumeSpecName "kube-api-access-dctvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:58:55 crc kubenswrapper[4751]: I1203 14:58:55.997034 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/063f705f-e754-44f2-9132-091b1438e3d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "063f705f-e754-44f2-9132-091b1438e3d1" (UID: "063f705f-e754-44f2-9132-091b1438e3d1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:58:56 crc kubenswrapper[4751]: I1203 14:58:56.045992 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/063f705f-e754-44f2-9132-091b1438e3d1-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:58:56 crc kubenswrapper[4751]: I1203 14:58:56.046341 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/063f705f-e754-44f2-9132-091b1438e3d1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:58:56 crc kubenswrapper[4751]: I1203 14:58:56.046356 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dctvh\" (UniqueName: \"kubernetes.io/projected/063f705f-e754-44f2-9132-091b1438e3d1-kube-api-access-dctvh\") on node \"crc\" DevicePath \"\"" Dec 03 14:58:56 crc kubenswrapper[4751]: I1203 14:58:56.678193 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xxn82" event={"ID":"063f705f-e754-44f2-9132-091b1438e3d1","Type":"ContainerDied","Data":"05addc3f7c6cf1f8cd8b3864317d41e301ce29a44a49b0daafe76513e0817728"} Dec 03 14:58:56 crc kubenswrapper[4751]: I1203 14:58:56.678240 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xxn82" Dec 03 14:58:56 crc kubenswrapper[4751]: I1203 14:58:56.678250 4751 scope.go:117] "RemoveContainer" containerID="596bdf950b70311f074936d9ba03ca6c32666db00f72e50105df90ac552308e9" Dec 03 14:58:56 crc kubenswrapper[4751]: I1203 14:58:56.718163 4751 scope.go:117] "RemoveContainer" containerID="fd054ea483f718b65b8ff91a7936653ba084187e187cd307b8d4195357c51f94" Dec 03 14:58:56 crc kubenswrapper[4751]: I1203 14:58:56.720365 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xxn82"] Dec 03 14:58:56 crc kubenswrapper[4751]: I1203 14:58:56.731314 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xxn82"] Dec 03 14:58:56 crc kubenswrapper[4751]: I1203 14:58:56.743267 4751 scope.go:117] "RemoveContainer" containerID="8947279a589233565b6f20e7f47065b0d0646b557850a6dacc348ed2fa7c850c" Dec 03 14:58:57 crc kubenswrapper[4751]: I1203 14:58:57.330369 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="063f705f-e754-44f2-9132-091b1438e3d1" path="/var/lib/kubelet/pods/063f705f-e754-44f2-9132-091b1438e3d1/volumes" Dec 03 14:59:30 crc kubenswrapper[4751]: I1203 14:59:30.384952 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7fk2c"] Dec 03 14:59:30 crc kubenswrapper[4751]: E1203 14:59:30.385934 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="063f705f-e754-44f2-9132-091b1438e3d1" containerName="registry-server" Dec 03 14:59:30 crc kubenswrapper[4751]: I1203 14:59:30.385947 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="063f705f-e754-44f2-9132-091b1438e3d1" containerName="registry-server" Dec 03 14:59:30 crc kubenswrapper[4751]: E1203 14:59:30.385962 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="063f705f-e754-44f2-9132-091b1438e3d1" containerName="extract-content" Dec 03 14:59:30 
crc kubenswrapper[4751]: I1203 14:59:30.385968 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="063f705f-e754-44f2-9132-091b1438e3d1" containerName="extract-content" Dec 03 14:59:30 crc kubenswrapper[4751]: E1203 14:59:30.385990 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="063f705f-e754-44f2-9132-091b1438e3d1" containerName="extract-utilities" Dec 03 14:59:30 crc kubenswrapper[4751]: I1203 14:59:30.385996 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="063f705f-e754-44f2-9132-091b1438e3d1" containerName="extract-utilities" Dec 03 14:59:30 crc kubenswrapper[4751]: I1203 14:59:30.386195 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="063f705f-e754-44f2-9132-091b1438e3d1" containerName="registry-server" Dec 03 14:59:30 crc kubenswrapper[4751]: I1203 14:59:30.421486 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7fk2c"] Dec 03 14:59:30 crc kubenswrapper[4751]: I1203 14:59:30.423162 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7fk2c" Dec 03 14:59:30 crc kubenswrapper[4751]: I1203 14:59:30.449825 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk4gr\" (UniqueName: \"kubernetes.io/projected/ea00c8e0-ed65-4289-a4bc-544ce2a781f0-kube-api-access-wk4gr\") pod \"redhat-operators-7fk2c\" (UID: \"ea00c8e0-ed65-4289-a4bc-544ce2a781f0\") " pod="openshift-marketplace/redhat-operators-7fk2c" Dec 03 14:59:30 crc kubenswrapper[4751]: I1203 14:59:30.449883 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea00c8e0-ed65-4289-a4bc-544ce2a781f0-utilities\") pod \"redhat-operators-7fk2c\" (UID: \"ea00c8e0-ed65-4289-a4bc-544ce2a781f0\") " pod="openshift-marketplace/redhat-operators-7fk2c" Dec 03 14:59:30 crc kubenswrapper[4751]: I1203 14:59:30.450086 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea00c8e0-ed65-4289-a4bc-544ce2a781f0-catalog-content\") pod \"redhat-operators-7fk2c\" (UID: \"ea00c8e0-ed65-4289-a4bc-544ce2a781f0\") " pod="openshift-marketplace/redhat-operators-7fk2c" Dec 03 14:59:30 crc kubenswrapper[4751]: I1203 14:59:30.550724 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk4gr\" (UniqueName: \"kubernetes.io/projected/ea00c8e0-ed65-4289-a4bc-544ce2a781f0-kube-api-access-wk4gr\") pod \"redhat-operators-7fk2c\" (UID: \"ea00c8e0-ed65-4289-a4bc-544ce2a781f0\") " pod="openshift-marketplace/redhat-operators-7fk2c" Dec 03 14:59:30 crc kubenswrapper[4751]: I1203 14:59:30.550789 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea00c8e0-ed65-4289-a4bc-544ce2a781f0-utilities\") pod \"redhat-operators-7fk2c\" (UID: 
\"ea00c8e0-ed65-4289-a4bc-544ce2a781f0\") " pod="openshift-marketplace/redhat-operators-7fk2c" Dec 03 14:59:30 crc kubenswrapper[4751]: I1203 14:59:30.551484 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea00c8e0-ed65-4289-a4bc-544ce2a781f0-utilities\") pod \"redhat-operators-7fk2c\" (UID: \"ea00c8e0-ed65-4289-a4bc-544ce2a781f0\") " pod="openshift-marketplace/redhat-operators-7fk2c" Dec 03 14:59:30 crc kubenswrapper[4751]: I1203 14:59:30.551763 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea00c8e0-ed65-4289-a4bc-544ce2a781f0-catalog-content\") pod \"redhat-operators-7fk2c\" (UID: \"ea00c8e0-ed65-4289-a4bc-544ce2a781f0\") " pod="openshift-marketplace/redhat-operators-7fk2c" Dec 03 14:59:30 crc kubenswrapper[4751]: I1203 14:59:30.552078 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea00c8e0-ed65-4289-a4bc-544ce2a781f0-catalog-content\") pod \"redhat-operators-7fk2c\" (UID: \"ea00c8e0-ed65-4289-a4bc-544ce2a781f0\") " pod="openshift-marketplace/redhat-operators-7fk2c" Dec 03 14:59:30 crc kubenswrapper[4751]: I1203 14:59:30.575572 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk4gr\" (UniqueName: \"kubernetes.io/projected/ea00c8e0-ed65-4289-a4bc-544ce2a781f0-kube-api-access-wk4gr\") pod \"redhat-operators-7fk2c\" (UID: \"ea00c8e0-ed65-4289-a4bc-544ce2a781f0\") " pod="openshift-marketplace/redhat-operators-7fk2c" Dec 03 14:59:30 crc kubenswrapper[4751]: I1203 14:59:30.770888 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7fk2c" Dec 03 14:59:31 crc kubenswrapper[4751]: I1203 14:59:31.235260 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7fk2c"] Dec 03 14:59:32 crc kubenswrapper[4751]: I1203 14:59:32.085624 4751 generic.go:334] "Generic (PLEG): container finished" podID="ea00c8e0-ed65-4289-a4bc-544ce2a781f0" containerID="a518b767dace011e13475dd4959b688f97c317dfe111e325bf7a6cfd639b5ff4" exitCode=0 Dec 03 14:59:32 crc kubenswrapper[4751]: I1203 14:59:32.085716 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fk2c" event={"ID":"ea00c8e0-ed65-4289-a4bc-544ce2a781f0","Type":"ContainerDied","Data":"a518b767dace011e13475dd4959b688f97c317dfe111e325bf7a6cfd639b5ff4"} Dec 03 14:59:32 crc kubenswrapper[4751]: I1203 14:59:32.085944 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fk2c" event={"ID":"ea00c8e0-ed65-4289-a4bc-544ce2a781f0","Type":"ContainerStarted","Data":"7b258271d6666e3336c992845d6fa758a97dc6ac2ec9f33787e5f7a0a3082a32"} Dec 03 14:59:34 crc kubenswrapper[4751]: I1203 14:59:34.109301 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fk2c" event={"ID":"ea00c8e0-ed65-4289-a4bc-544ce2a781f0","Type":"ContainerStarted","Data":"351718b64abe489410574db6034d53adf6c71bafa1d2261a37c4b0ccd4de5154"} Dec 03 14:59:36 crc kubenswrapper[4751]: I1203 14:59:36.145382 4751 generic.go:334] "Generic (PLEG): container finished" podID="ea00c8e0-ed65-4289-a4bc-544ce2a781f0" containerID="351718b64abe489410574db6034d53adf6c71bafa1d2261a37c4b0ccd4de5154" exitCode=0 Dec 03 14:59:36 crc kubenswrapper[4751]: I1203 14:59:36.145459 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fk2c" 
event={"ID":"ea00c8e0-ed65-4289-a4bc-544ce2a781f0","Type":"ContainerDied","Data":"351718b64abe489410574db6034d53adf6c71bafa1d2261a37c4b0ccd4de5154"} Dec 03 14:59:37 crc kubenswrapper[4751]: I1203 14:59:37.165277 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fk2c" event={"ID":"ea00c8e0-ed65-4289-a4bc-544ce2a781f0","Type":"ContainerStarted","Data":"5df9ecba654207d38c35f61927433afeb8f274c92118c998367cba30466dd247"} Dec 03 14:59:37 crc kubenswrapper[4751]: I1203 14:59:37.196097 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7fk2c" podStartSLOduration=2.759359772 podStartE2EDuration="7.196064707s" podCreationTimestamp="2025-12-03 14:59:30 +0000 UTC" firstStartedPulling="2025-12-03 14:59:32.087869421 +0000 UTC m=+2779.076224638" lastFinishedPulling="2025-12-03 14:59:36.524574336 +0000 UTC m=+2783.512929573" observedRunningTime="2025-12-03 14:59:37.190165499 +0000 UTC m=+2784.178520736" watchObservedRunningTime="2025-12-03 14:59:37.196064707 +0000 UTC m=+2784.184419964" Dec 03 14:59:40 crc kubenswrapper[4751]: I1203 14:59:40.771803 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7fk2c" Dec 03 14:59:40 crc kubenswrapper[4751]: I1203 14:59:40.772421 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7fk2c" Dec 03 14:59:41 crc kubenswrapper[4751]: I1203 14:59:41.822225 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7fk2c" podUID="ea00c8e0-ed65-4289-a4bc-544ce2a781f0" containerName="registry-server" probeResult="failure" output=< Dec 03 14:59:41 crc kubenswrapper[4751]: timeout: failed to connect service ":50051" within 1s Dec 03 14:59:41 crc kubenswrapper[4751]: > Dec 03 14:59:50 crc kubenswrapper[4751]: I1203 14:59:50.854441 4751 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7fk2c" Dec 03 14:59:50 crc kubenswrapper[4751]: I1203 14:59:50.934449 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7fk2c" Dec 03 14:59:51 crc kubenswrapper[4751]: I1203 14:59:51.099636 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7fk2c"] Dec 03 14:59:52 crc kubenswrapper[4751]: I1203 14:59:52.315444 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7fk2c" podUID="ea00c8e0-ed65-4289-a4bc-544ce2a781f0" containerName="registry-server" containerID="cri-o://5df9ecba654207d38c35f61927433afeb8f274c92118c998367cba30466dd247" gracePeriod=2 Dec 03 14:59:52 crc kubenswrapper[4751]: I1203 14:59:52.926545 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7fk2c" Dec 03 14:59:53 crc kubenswrapper[4751]: I1203 14:59:53.003831 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea00c8e0-ed65-4289-a4bc-544ce2a781f0-catalog-content\") pod \"ea00c8e0-ed65-4289-a4bc-544ce2a781f0\" (UID: \"ea00c8e0-ed65-4289-a4bc-544ce2a781f0\") " Dec 03 14:59:53 crc kubenswrapper[4751]: I1203 14:59:53.004321 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk4gr\" (UniqueName: \"kubernetes.io/projected/ea00c8e0-ed65-4289-a4bc-544ce2a781f0-kube-api-access-wk4gr\") pod \"ea00c8e0-ed65-4289-a4bc-544ce2a781f0\" (UID: \"ea00c8e0-ed65-4289-a4bc-544ce2a781f0\") " Dec 03 14:59:53 crc kubenswrapper[4751]: I1203 14:59:53.004694 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea00c8e0-ed65-4289-a4bc-544ce2a781f0-utilities\") pod 
\"ea00c8e0-ed65-4289-a4bc-544ce2a781f0\" (UID: \"ea00c8e0-ed65-4289-a4bc-544ce2a781f0\") " Dec 03 14:59:53 crc kubenswrapper[4751]: I1203 14:59:53.006063 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea00c8e0-ed65-4289-a4bc-544ce2a781f0-utilities" (OuterVolumeSpecName: "utilities") pod "ea00c8e0-ed65-4289-a4bc-544ce2a781f0" (UID: "ea00c8e0-ed65-4289-a4bc-544ce2a781f0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:59:53 crc kubenswrapper[4751]: I1203 14:59:53.010567 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea00c8e0-ed65-4289-a4bc-544ce2a781f0-kube-api-access-wk4gr" (OuterVolumeSpecName: "kube-api-access-wk4gr") pod "ea00c8e0-ed65-4289-a4bc-544ce2a781f0" (UID: "ea00c8e0-ed65-4289-a4bc-544ce2a781f0"). InnerVolumeSpecName "kube-api-access-wk4gr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:59:53 crc kubenswrapper[4751]: I1203 14:59:53.108080 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea00c8e0-ed65-4289-a4bc-544ce2a781f0-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:59:53 crc kubenswrapper[4751]: I1203 14:59:53.108118 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk4gr\" (UniqueName: \"kubernetes.io/projected/ea00c8e0-ed65-4289-a4bc-544ce2a781f0-kube-api-access-wk4gr\") on node \"crc\" DevicePath \"\"" Dec 03 14:59:53 crc kubenswrapper[4751]: I1203 14:59:53.134625 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea00c8e0-ed65-4289-a4bc-544ce2a781f0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea00c8e0-ed65-4289-a4bc-544ce2a781f0" (UID: "ea00c8e0-ed65-4289-a4bc-544ce2a781f0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:59:53 crc kubenswrapper[4751]: I1203 14:59:53.210484 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea00c8e0-ed65-4289-a4bc-544ce2a781f0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:59:53 crc kubenswrapper[4751]: I1203 14:59:53.326863 4751 generic.go:334] "Generic (PLEG): container finished" podID="ea00c8e0-ed65-4289-a4bc-544ce2a781f0" containerID="5df9ecba654207d38c35f61927433afeb8f274c92118c998367cba30466dd247" exitCode=0 Dec 03 14:59:53 crc kubenswrapper[4751]: I1203 14:59:53.326967 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7fk2c" Dec 03 14:59:53 crc kubenswrapper[4751]: I1203 14:59:53.331567 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fk2c" event={"ID":"ea00c8e0-ed65-4289-a4bc-544ce2a781f0","Type":"ContainerDied","Data":"5df9ecba654207d38c35f61927433afeb8f274c92118c998367cba30466dd247"} Dec 03 14:59:53 crc kubenswrapper[4751]: I1203 14:59:53.331898 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fk2c" event={"ID":"ea00c8e0-ed65-4289-a4bc-544ce2a781f0","Type":"ContainerDied","Data":"7b258271d6666e3336c992845d6fa758a97dc6ac2ec9f33787e5f7a0a3082a32"} Dec 03 14:59:53 crc kubenswrapper[4751]: I1203 14:59:53.332144 4751 scope.go:117] "RemoveContainer" containerID="5df9ecba654207d38c35f61927433afeb8f274c92118c998367cba30466dd247" Dec 03 14:59:53 crc kubenswrapper[4751]: I1203 14:59:53.370293 4751 scope.go:117] "RemoveContainer" containerID="351718b64abe489410574db6034d53adf6c71bafa1d2261a37c4b0ccd4de5154" Dec 03 14:59:53 crc kubenswrapper[4751]: I1203 14:59:53.375889 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7fk2c"] Dec 03 14:59:53 crc kubenswrapper[4751]: I1203 
14:59:53.386432 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7fk2c"] Dec 03 14:59:53 crc kubenswrapper[4751]: I1203 14:59:53.398888 4751 scope.go:117] "RemoveContainer" containerID="a518b767dace011e13475dd4959b688f97c317dfe111e325bf7a6cfd639b5ff4" Dec 03 14:59:53 crc kubenswrapper[4751]: I1203 14:59:53.467275 4751 scope.go:117] "RemoveContainer" containerID="5df9ecba654207d38c35f61927433afeb8f274c92118c998367cba30466dd247" Dec 03 14:59:53 crc kubenswrapper[4751]: E1203 14:59:53.467707 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5df9ecba654207d38c35f61927433afeb8f274c92118c998367cba30466dd247\": container with ID starting with 5df9ecba654207d38c35f61927433afeb8f274c92118c998367cba30466dd247 not found: ID does not exist" containerID="5df9ecba654207d38c35f61927433afeb8f274c92118c998367cba30466dd247" Dec 03 14:59:53 crc kubenswrapper[4751]: I1203 14:59:53.467748 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5df9ecba654207d38c35f61927433afeb8f274c92118c998367cba30466dd247"} err="failed to get container status \"5df9ecba654207d38c35f61927433afeb8f274c92118c998367cba30466dd247\": rpc error: code = NotFound desc = could not find container \"5df9ecba654207d38c35f61927433afeb8f274c92118c998367cba30466dd247\": container with ID starting with 5df9ecba654207d38c35f61927433afeb8f274c92118c998367cba30466dd247 not found: ID does not exist" Dec 03 14:59:53 crc kubenswrapper[4751]: I1203 14:59:53.467778 4751 scope.go:117] "RemoveContainer" containerID="351718b64abe489410574db6034d53adf6c71bafa1d2261a37c4b0ccd4de5154" Dec 03 14:59:53 crc kubenswrapper[4751]: E1203 14:59:53.468277 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"351718b64abe489410574db6034d53adf6c71bafa1d2261a37c4b0ccd4de5154\": container with ID 
starting with 351718b64abe489410574db6034d53adf6c71bafa1d2261a37c4b0ccd4de5154 not found: ID does not exist" containerID="351718b64abe489410574db6034d53adf6c71bafa1d2261a37c4b0ccd4de5154" Dec 03 14:59:53 crc kubenswrapper[4751]: I1203 14:59:53.468319 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"351718b64abe489410574db6034d53adf6c71bafa1d2261a37c4b0ccd4de5154"} err="failed to get container status \"351718b64abe489410574db6034d53adf6c71bafa1d2261a37c4b0ccd4de5154\": rpc error: code = NotFound desc = could not find container \"351718b64abe489410574db6034d53adf6c71bafa1d2261a37c4b0ccd4de5154\": container with ID starting with 351718b64abe489410574db6034d53adf6c71bafa1d2261a37c4b0ccd4de5154 not found: ID does not exist" Dec 03 14:59:53 crc kubenswrapper[4751]: I1203 14:59:53.468365 4751 scope.go:117] "RemoveContainer" containerID="a518b767dace011e13475dd4959b688f97c317dfe111e325bf7a6cfd639b5ff4" Dec 03 14:59:53 crc kubenswrapper[4751]: E1203 14:59:53.468655 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a518b767dace011e13475dd4959b688f97c317dfe111e325bf7a6cfd639b5ff4\": container with ID starting with a518b767dace011e13475dd4959b688f97c317dfe111e325bf7a6cfd639b5ff4 not found: ID does not exist" containerID="a518b767dace011e13475dd4959b688f97c317dfe111e325bf7a6cfd639b5ff4" Dec 03 14:59:53 crc kubenswrapper[4751]: I1203 14:59:53.468685 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a518b767dace011e13475dd4959b688f97c317dfe111e325bf7a6cfd639b5ff4"} err="failed to get container status \"a518b767dace011e13475dd4959b688f97c317dfe111e325bf7a6cfd639b5ff4\": rpc error: code = NotFound desc = could not find container \"a518b767dace011e13475dd4959b688f97c317dfe111e325bf7a6cfd639b5ff4\": container with ID starting with a518b767dace011e13475dd4959b688f97c317dfe111e325bf7a6cfd639b5ff4 not found: 
ID does not exist" Dec 03 14:59:55 crc kubenswrapper[4751]: I1203 14:59:55.331200 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea00c8e0-ed65-4289-a4bc-544ce2a781f0" path="/var/lib/kubelet/pods/ea00c8e0-ed65-4289-a4bc-544ce2a781f0/volumes" Dec 03 15:00:00 crc kubenswrapper[4751]: I1203 15:00:00.153458 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412900-7dtl6"] Dec 03 15:00:00 crc kubenswrapper[4751]: E1203 15:00:00.154219 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea00c8e0-ed65-4289-a4bc-544ce2a781f0" containerName="extract-content" Dec 03 15:00:00 crc kubenswrapper[4751]: I1203 15:00:00.154231 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea00c8e0-ed65-4289-a4bc-544ce2a781f0" containerName="extract-content" Dec 03 15:00:00 crc kubenswrapper[4751]: E1203 15:00:00.154247 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea00c8e0-ed65-4289-a4bc-544ce2a781f0" containerName="extract-utilities" Dec 03 15:00:00 crc kubenswrapper[4751]: I1203 15:00:00.154253 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea00c8e0-ed65-4289-a4bc-544ce2a781f0" containerName="extract-utilities" Dec 03 15:00:00 crc kubenswrapper[4751]: E1203 15:00:00.154283 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea00c8e0-ed65-4289-a4bc-544ce2a781f0" containerName="registry-server" Dec 03 15:00:00 crc kubenswrapper[4751]: I1203 15:00:00.154290 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea00c8e0-ed65-4289-a4bc-544ce2a781f0" containerName="registry-server" Dec 03 15:00:00 crc kubenswrapper[4751]: I1203 15:00:00.154502 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea00c8e0-ed65-4289-a4bc-544ce2a781f0" containerName="registry-server" Dec 03 15:00:00 crc kubenswrapper[4751]: I1203 15:00:00.155185 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412900-7dtl6" Dec 03 15:00:00 crc kubenswrapper[4751]: I1203 15:00:00.157713 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 15:00:00 crc kubenswrapper[4751]: I1203 15:00:00.158932 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 15:00:00 crc kubenswrapper[4751]: I1203 15:00:00.215431 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412900-7dtl6"] Dec 03 15:00:00 crc kubenswrapper[4751]: I1203 15:00:00.266391 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc9232a8-20a7-4d36-ac50-f47e79b84215-secret-volume\") pod \"collect-profiles-29412900-7dtl6\" (UID: \"cc9232a8-20a7-4d36-ac50-f47e79b84215\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412900-7dtl6" Dec 03 15:00:00 crc kubenswrapper[4751]: I1203 15:00:00.266608 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp65h\" (UniqueName: \"kubernetes.io/projected/cc9232a8-20a7-4d36-ac50-f47e79b84215-kube-api-access-bp65h\") pod \"collect-profiles-29412900-7dtl6\" (UID: \"cc9232a8-20a7-4d36-ac50-f47e79b84215\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412900-7dtl6" Dec 03 15:00:00 crc kubenswrapper[4751]: I1203 15:00:00.267031 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc9232a8-20a7-4d36-ac50-f47e79b84215-config-volume\") pod \"collect-profiles-29412900-7dtl6\" (UID: \"cc9232a8-20a7-4d36-ac50-f47e79b84215\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412900-7dtl6" Dec 03 15:00:00 crc kubenswrapper[4751]: I1203 15:00:00.369513 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc9232a8-20a7-4d36-ac50-f47e79b84215-config-volume\") pod \"collect-profiles-29412900-7dtl6\" (UID: \"cc9232a8-20a7-4d36-ac50-f47e79b84215\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412900-7dtl6" Dec 03 15:00:00 crc kubenswrapper[4751]: I1203 15:00:00.369617 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc9232a8-20a7-4d36-ac50-f47e79b84215-secret-volume\") pod \"collect-profiles-29412900-7dtl6\" (UID: \"cc9232a8-20a7-4d36-ac50-f47e79b84215\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412900-7dtl6" Dec 03 15:00:00 crc kubenswrapper[4751]: I1203 15:00:00.370736 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc9232a8-20a7-4d36-ac50-f47e79b84215-config-volume\") pod \"collect-profiles-29412900-7dtl6\" (UID: \"cc9232a8-20a7-4d36-ac50-f47e79b84215\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412900-7dtl6" Dec 03 15:00:00 crc kubenswrapper[4751]: I1203 15:00:00.370914 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp65h\" (UniqueName: \"kubernetes.io/projected/cc9232a8-20a7-4d36-ac50-f47e79b84215-kube-api-access-bp65h\") pod \"collect-profiles-29412900-7dtl6\" (UID: \"cc9232a8-20a7-4d36-ac50-f47e79b84215\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412900-7dtl6" Dec 03 15:00:00 crc kubenswrapper[4751]: I1203 15:00:00.376189 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/cc9232a8-20a7-4d36-ac50-f47e79b84215-secret-volume\") pod \"collect-profiles-29412900-7dtl6\" (UID: \"cc9232a8-20a7-4d36-ac50-f47e79b84215\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412900-7dtl6" Dec 03 15:00:00 crc kubenswrapper[4751]: I1203 15:00:00.385715 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp65h\" (UniqueName: \"kubernetes.io/projected/cc9232a8-20a7-4d36-ac50-f47e79b84215-kube-api-access-bp65h\") pod \"collect-profiles-29412900-7dtl6\" (UID: \"cc9232a8-20a7-4d36-ac50-f47e79b84215\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412900-7dtl6" Dec 03 15:00:00 crc kubenswrapper[4751]: I1203 15:00:00.526000 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412900-7dtl6" Dec 03 15:00:00 crc kubenswrapper[4751]: I1203 15:00:00.952015 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412900-7dtl6"] Dec 03 15:00:00 crc kubenswrapper[4751]: W1203 15:00:00.961634 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc9232a8_20a7_4d36_ac50_f47e79b84215.slice/crio-74f8718a82a444b0adb3207fd81a17e752e2b06b3e14d318a3caf2a65b60ef7f WatchSource:0}: Error finding container 74f8718a82a444b0adb3207fd81a17e752e2b06b3e14d318a3caf2a65b60ef7f: Status 404 returned error can't find the container with id 74f8718a82a444b0adb3207fd81a17e752e2b06b3e14d318a3caf2a65b60ef7f Dec 03 15:00:01 crc kubenswrapper[4751]: I1203 15:00:01.445792 4751 generic.go:334] "Generic (PLEG): container finished" podID="cc9232a8-20a7-4d36-ac50-f47e79b84215" containerID="48ef1d2b2c02193dc015f1278fc6ea7bd2403b93fe6b1f6febe39f72b6399d03" exitCode=0 Dec 03 15:00:01 crc kubenswrapper[4751]: I1203 15:00:01.446077 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412900-7dtl6" event={"ID":"cc9232a8-20a7-4d36-ac50-f47e79b84215","Type":"ContainerDied","Data":"48ef1d2b2c02193dc015f1278fc6ea7bd2403b93fe6b1f6febe39f72b6399d03"} Dec 03 15:00:01 crc kubenswrapper[4751]: I1203 15:00:01.446104 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412900-7dtl6" event={"ID":"cc9232a8-20a7-4d36-ac50-f47e79b84215","Type":"ContainerStarted","Data":"74f8718a82a444b0adb3207fd81a17e752e2b06b3e14d318a3caf2a65b60ef7f"} Dec 03 15:00:02 crc kubenswrapper[4751]: I1203 15:00:02.936842 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412900-7dtl6" Dec 03 15:00:03 crc kubenswrapper[4751]: I1203 15:00:03.135969 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp65h\" (UniqueName: \"kubernetes.io/projected/cc9232a8-20a7-4d36-ac50-f47e79b84215-kube-api-access-bp65h\") pod \"cc9232a8-20a7-4d36-ac50-f47e79b84215\" (UID: \"cc9232a8-20a7-4d36-ac50-f47e79b84215\") " Dec 03 15:00:03 crc kubenswrapper[4751]: I1203 15:00:03.136275 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc9232a8-20a7-4d36-ac50-f47e79b84215-secret-volume\") pod \"cc9232a8-20a7-4d36-ac50-f47e79b84215\" (UID: \"cc9232a8-20a7-4d36-ac50-f47e79b84215\") " Dec 03 15:00:03 crc kubenswrapper[4751]: I1203 15:00:03.136380 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc9232a8-20a7-4d36-ac50-f47e79b84215-config-volume\") pod \"cc9232a8-20a7-4d36-ac50-f47e79b84215\" (UID: \"cc9232a8-20a7-4d36-ac50-f47e79b84215\") " Dec 03 15:00:03 crc kubenswrapper[4751]: I1203 15:00:03.136924 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/cc9232a8-20a7-4d36-ac50-f47e79b84215-config-volume" (OuterVolumeSpecName: "config-volume") pod "cc9232a8-20a7-4d36-ac50-f47e79b84215" (UID: "cc9232a8-20a7-4d36-ac50-f47e79b84215"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 15:00:03 crc kubenswrapper[4751]: I1203 15:00:03.137480 4751 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc9232a8-20a7-4d36-ac50-f47e79b84215-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 15:00:03 crc kubenswrapper[4751]: I1203 15:00:03.146486 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc9232a8-20a7-4d36-ac50-f47e79b84215-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cc9232a8-20a7-4d36-ac50-f47e79b84215" (UID: "cc9232a8-20a7-4d36-ac50-f47e79b84215"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 15:00:03 crc kubenswrapper[4751]: I1203 15:00:03.147203 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc9232a8-20a7-4d36-ac50-f47e79b84215-kube-api-access-bp65h" (OuterVolumeSpecName: "kube-api-access-bp65h") pod "cc9232a8-20a7-4d36-ac50-f47e79b84215" (UID: "cc9232a8-20a7-4d36-ac50-f47e79b84215"). InnerVolumeSpecName "kube-api-access-bp65h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:00:03 crc kubenswrapper[4751]: I1203 15:00:03.239565 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp65h\" (UniqueName: \"kubernetes.io/projected/cc9232a8-20a7-4d36-ac50-f47e79b84215-kube-api-access-bp65h\") on node \"crc\" DevicePath \"\"" Dec 03 15:00:03 crc kubenswrapper[4751]: I1203 15:00:03.239603 4751 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc9232a8-20a7-4d36-ac50-f47e79b84215-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 15:00:03 crc kubenswrapper[4751]: I1203 15:00:03.471865 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412900-7dtl6" event={"ID":"cc9232a8-20a7-4d36-ac50-f47e79b84215","Type":"ContainerDied","Data":"74f8718a82a444b0adb3207fd81a17e752e2b06b3e14d318a3caf2a65b60ef7f"} Dec 03 15:00:03 crc kubenswrapper[4751]: I1203 15:00:03.471936 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74f8718a82a444b0adb3207fd81a17e752e2b06b3e14d318a3caf2a65b60ef7f" Dec 03 15:00:03 crc kubenswrapper[4751]: I1203 15:00:03.471952 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412900-7dtl6" Dec 03 15:00:04 crc kubenswrapper[4751]: I1203 15:00:04.019586 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412855-vqk8g"] Dec 03 15:00:04 crc kubenswrapper[4751]: I1203 15:00:04.029680 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412855-vqk8g"] Dec 03 15:00:05 crc kubenswrapper[4751]: I1203 15:00:05.332035 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb81b718-8bc4-4c3e-9ec6-472c62d377a2" path="/var/lib/kubelet/pods/eb81b718-8bc4-4c3e-9ec6-472c62d377a2/volumes" Dec 03 15:00:28 crc kubenswrapper[4751]: I1203 15:00:28.850448 4751 scope.go:117] "RemoveContainer" containerID="390fd1508d98eae2ac14c39e28cc442f7ec0dcd6a68471b090edf84b59cfb904" Dec 03 15:01:00 crc kubenswrapper[4751]: I1203 15:01:00.161571 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29412901-h5p7l"] Dec 03 15:01:00 crc kubenswrapper[4751]: E1203 15:01:00.162577 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc9232a8-20a7-4d36-ac50-f47e79b84215" containerName="collect-profiles" Dec 03 15:01:00 crc kubenswrapper[4751]: I1203 15:01:00.162595 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc9232a8-20a7-4d36-ac50-f47e79b84215" containerName="collect-profiles" Dec 03 15:01:00 crc kubenswrapper[4751]: I1203 15:01:00.162844 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc9232a8-20a7-4d36-ac50-f47e79b84215" containerName="collect-profiles" Dec 03 15:01:00 crc kubenswrapper[4751]: I1203 15:01:00.163806 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29412901-h5p7l" Dec 03 15:01:00 crc kubenswrapper[4751]: I1203 15:01:00.176244 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29412901-h5p7l"] Dec 03 15:01:00 crc kubenswrapper[4751]: I1203 15:01:00.255404 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bab120ba-b67e-46bf-9d23-359d3119b904-config-data\") pod \"keystone-cron-29412901-h5p7l\" (UID: \"bab120ba-b67e-46bf-9d23-359d3119b904\") " pod="openstack/keystone-cron-29412901-h5p7l" Dec 03 15:01:00 crc kubenswrapper[4751]: I1203 15:01:00.255740 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bab120ba-b67e-46bf-9d23-359d3119b904-fernet-keys\") pod \"keystone-cron-29412901-h5p7l\" (UID: \"bab120ba-b67e-46bf-9d23-359d3119b904\") " pod="openstack/keystone-cron-29412901-h5p7l" Dec 03 15:01:00 crc kubenswrapper[4751]: I1203 15:01:00.256020 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bab120ba-b67e-46bf-9d23-359d3119b904-combined-ca-bundle\") pod \"keystone-cron-29412901-h5p7l\" (UID: \"bab120ba-b67e-46bf-9d23-359d3119b904\") " pod="openstack/keystone-cron-29412901-h5p7l" Dec 03 15:01:00 crc kubenswrapper[4751]: I1203 15:01:00.256087 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9bkg\" (UniqueName: \"kubernetes.io/projected/bab120ba-b67e-46bf-9d23-359d3119b904-kube-api-access-f9bkg\") pod \"keystone-cron-29412901-h5p7l\" (UID: \"bab120ba-b67e-46bf-9d23-359d3119b904\") " pod="openstack/keystone-cron-29412901-h5p7l" Dec 03 15:01:00 crc kubenswrapper[4751]: I1203 15:01:00.357615 4751 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bab120ba-b67e-46bf-9d23-359d3119b904-fernet-keys\") pod \"keystone-cron-29412901-h5p7l\" (UID: \"bab120ba-b67e-46bf-9d23-359d3119b904\") " pod="openstack/keystone-cron-29412901-h5p7l" Dec 03 15:01:00 crc kubenswrapper[4751]: I1203 15:01:00.357758 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bab120ba-b67e-46bf-9d23-359d3119b904-combined-ca-bundle\") pod \"keystone-cron-29412901-h5p7l\" (UID: \"bab120ba-b67e-46bf-9d23-359d3119b904\") " pod="openstack/keystone-cron-29412901-h5p7l" Dec 03 15:01:00 crc kubenswrapper[4751]: I1203 15:01:00.357830 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9bkg\" (UniqueName: \"kubernetes.io/projected/bab120ba-b67e-46bf-9d23-359d3119b904-kube-api-access-f9bkg\") pod \"keystone-cron-29412901-h5p7l\" (UID: \"bab120ba-b67e-46bf-9d23-359d3119b904\") " pod="openstack/keystone-cron-29412901-h5p7l" Dec 03 15:01:00 crc kubenswrapper[4751]: I1203 15:01:00.357896 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bab120ba-b67e-46bf-9d23-359d3119b904-config-data\") pod \"keystone-cron-29412901-h5p7l\" (UID: \"bab120ba-b67e-46bf-9d23-359d3119b904\") " pod="openstack/keystone-cron-29412901-h5p7l" Dec 03 15:01:00 crc kubenswrapper[4751]: I1203 15:01:00.364526 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bab120ba-b67e-46bf-9d23-359d3119b904-fernet-keys\") pod \"keystone-cron-29412901-h5p7l\" (UID: \"bab120ba-b67e-46bf-9d23-359d3119b904\") " pod="openstack/keystone-cron-29412901-h5p7l" Dec 03 15:01:00 crc kubenswrapper[4751]: I1203 15:01:00.364814 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bab120ba-b67e-46bf-9d23-359d3119b904-combined-ca-bundle\") pod \"keystone-cron-29412901-h5p7l\" (UID: \"bab120ba-b67e-46bf-9d23-359d3119b904\") " pod="openstack/keystone-cron-29412901-h5p7l" Dec 03 15:01:00 crc kubenswrapper[4751]: I1203 15:01:00.365194 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bab120ba-b67e-46bf-9d23-359d3119b904-config-data\") pod \"keystone-cron-29412901-h5p7l\" (UID: \"bab120ba-b67e-46bf-9d23-359d3119b904\") " pod="openstack/keystone-cron-29412901-h5p7l" Dec 03 15:01:00 crc kubenswrapper[4751]: I1203 15:01:00.378999 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9bkg\" (UniqueName: \"kubernetes.io/projected/bab120ba-b67e-46bf-9d23-359d3119b904-kube-api-access-f9bkg\") pod \"keystone-cron-29412901-h5p7l\" (UID: \"bab120ba-b67e-46bf-9d23-359d3119b904\") " pod="openstack/keystone-cron-29412901-h5p7l" Dec 03 15:01:00 crc kubenswrapper[4751]: I1203 15:01:00.486384 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29412901-h5p7l" Dec 03 15:01:01 crc kubenswrapper[4751]: I1203 15:01:01.022875 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29412901-h5p7l"] Dec 03 15:01:01 crc kubenswrapper[4751]: I1203 15:01:01.101531 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412901-h5p7l" event={"ID":"bab120ba-b67e-46bf-9d23-359d3119b904","Type":"ContainerStarted","Data":"f737916d8259c95a67db5acc70423db6e8cee960719b77c0247359a281c18135"} Dec 03 15:01:02 crc kubenswrapper[4751]: I1203 15:01:02.113202 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412901-h5p7l" event={"ID":"bab120ba-b67e-46bf-9d23-359d3119b904","Type":"ContainerStarted","Data":"9fa7bee075ff561691e6a75f24b481406fb15b5e7e2fa4be4312533afc4ad416"} Dec 03 15:01:02 crc kubenswrapper[4751]: I1203 15:01:02.143689 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29412901-h5p7l" podStartSLOduration=2.143668188 podStartE2EDuration="2.143668188s" podCreationTimestamp="2025-12-03 15:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 15:01:02.135681716 +0000 UTC m=+2869.124036973" watchObservedRunningTime="2025-12-03 15:01:02.143668188 +0000 UTC m=+2869.132023405" Dec 03 15:01:04 crc kubenswrapper[4751]: I1203 15:01:04.138272 4751 generic.go:334] "Generic (PLEG): container finished" podID="bab120ba-b67e-46bf-9d23-359d3119b904" containerID="9fa7bee075ff561691e6a75f24b481406fb15b5e7e2fa4be4312533afc4ad416" exitCode=0 Dec 03 15:01:04 crc kubenswrapper[4751]: I1203 15:01:04.138360 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412901-h5p7l" 
event={"ID":"bab120ba-b67e-46bf-9d23-359d3119b904","Type":"ContainerDied","Data":"9fa7bee075ff561691e6a75f24b481406fb15b5e7e2fa4be4312533afc4ad416"} Dec 03 15:01:05 crc kubenswrapper[4751]: I1203 15:01:05.556932 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29412901-h5p7l" Dec 03 15:01:05 crc kubenswrapper[4751]: I1203 15:01:05.684861 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bab120ba-b67e-46bf-9d23-359d3119b904-combined-ca-bundle\") pod \"bab120ba-b67e-46bf-9d23-359d3119b904\" (UID: \"bab120ba-b67e-46bf-9d23-359d3119b904\") " Dec 03 15:01:05 crc kubenswrapper[4751]: I1203 15:01:05.684932 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9bkg\" (UniqueName: \"kubernetes.io/projected/bab120ba-b67e-46bf-9d23-359d3119b904-kube-api-access-f9bkg\") pod \"bab120ba-b67e-46bf-9d23-359d3119b904\" (UID: \"bab120ba-b67e-46bf-9d23-359d3119b904\") " Dec 03 15:01:05 crc kubenswrapper[4751]: I1203 15:01:05.685019 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bab120ba-b67e-46bf-9d23-359d3119b904-fernet-keys\") pod \"bab120ba-b67e-46bf-9d23-359d3119b904\" (UID: \"bab120ba-b67e-46bf-9d23-359d3119b904\") " Dec 03 15:01:05 crc kubenswrapper[4751]: I1203 15:01:05.685061 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bab120ba-b67e-46bf-9d23-359d3119b904-config-data\") pod \"bab120ba-b67e-46bf-9d23-359d3119b904\" (UID: \"bab120ba-b67e-46bf-9d23-359d3119b904\") " Dec 03 15:01:05 crc kubenswrapper[4751]: I1203 15:01:05.695704 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bab120ba-b67e-46bf-9d23-359d3119b904-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "bab120ba-b67e-46bf-9d23-359d3119b904" (UID: "bab120ba-b67e-46bf-9d23-359d3119b904"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 15:01:05 crc kubenswrapper[4751]: I1203 15:01:05.696929 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bab120ba-b67e-46bf-9d23-359d3119b904-kube-api-access-f9bkg" (OuterVolumeSpecName: "kube-api-access-f9bkg") pod "bab120ba-b67e-46bf-9d23-359d3119b904" (UID: "bab120ba-b67e-46bf-9d23-359d3119b904"). InnerVolumeSpecName "kube-api-access-f9bkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:01:05 crc kubenswrapper[4751]: I1203 15:01:05.722676 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bab120ba-b67e-46bf-9d23-359d3119b904-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bab120ba-b67e-46bf-9d23-359d3119b904" (UID: "bab120ba-b67e-46bf-9d23-359d3119b904"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 15:01:05 crc kubenswrapper[4751]: I1203 15:01:05.740344 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bab120ba-b67e-46bf-9d23-359d3119b904-config-data" (OuterVolumeSpecName: "config-data") pod "bab120ba-b67e-46bf-9d23-359d3119b904" (UID: "bab120ba-b67e-46bf-9d23-359d3119b904"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 15:01:05 crc kubenswrapper[4751]: I1203 15:01:05.787852 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bab120ba-b67e-46bf-9d23-359d3119b904-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 15:01:05 crc kubenswrapper[4751]: I1203 15:01:05.788119 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9bkg\" (UniqueName: \"kubernetes.io/projected/bab120ba-b67e-46bf-9d23-359d3119b904-kube-api-access-f9bkg\") on node \"crc\" DevicePath \"\"" Dec 03 15:01:05 crc kubenswrapper[4751]: I1203 15:01:05.788132 4751 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bab120ba-b67e-46bf-9d23-359d3119b904-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 15:01:05 crc kubenswrapper[4751]: I1203 15:01:05.788140 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bab120ba-b67e-46bf-9d23-359d3119b904-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 15:01:05 crc kubenswrapper[4751]: I1203 15:01:05.820481 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 15:01:05 crc kubenswrapper[4751]: I1203 15:01:05.820805 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 15:01:06 crc kubenswrapper[4751]: I1203 15:01:06.177941 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-cron-29412901-h5p7l" event={"ID":"bab120ba-b67e-46bf-9d23-359d3119b904","Type":"ContainerDied","Data":"f737916d8259c95a67db5acc70423db6e8cee960719b77c0247359a281c18135"} Dec 03 15:01:06 crc kubenswrapper[4751]: I1203 15:01:06.177989 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f737916d8259c95a67db5acc70423db6e8cee960719b77c0247359a281c18135" Dec 03 15:01:06 crc kubenswrapper[4751]: I1203 15:01:06.178066 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29412901-h5p7l" Dec 03 15:01:35 crc kubenswrapper[4751]: I1203 15:01:35.819880 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 15:01:35 crc kubenswrapper[4751]: I1203 15:01:35.820403 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 15:01:49 crc kubenswrapper[4751]: I1203 15:01:49.586093 4751 generic.go:334] "Generic (PLEG): container finished" podID="a2deeaad-edf9-4d9c-b116-9a31587b1b2a" containerID="3832d60587069e6f111f15b8a84ba0b61f594a5979240c0fd9ca70941628d310" exitCode=0 Dec 03 15:01:49 crc kubenswrapper[4751]: I1203 15:01:49.586176 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" event={"ID":"a2deeaad-edf9-4d9c-b116-9a31587b1b2a","Type":"ContainerDied","Data":"3832d60587069e6f111f15b8a84ba0b61f594a5979240c0fd9ca70941628d310"} Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.059182 
4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.169656 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-ssh-key\") pod \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.169787 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-inventory\") pod \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.169838 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-migration-ssh-key-1\") pod \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.169905 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-cell1-compute-config-1\") pod \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.169961 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-migration-ssh-key-0\") pod \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 
15:01:51.170043 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-cell1-compute-config-0\") pod \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.170134 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xftht\" (UniqueName: \"kubernetes.io/projected/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-kube-api-access-xftht\") pod \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.170206 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-extra-config-0\") pod \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.170240 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-combined-ca-bundle\") pod \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\" (UID: \"a2deeaad-edf9-4d9c-b116-9a31587b1b2a\") " Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.197851 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-kube-api-access-xftht" (OuterVolumeSpecName: "kube-api-access-xftht") pod "a2deeaad-edf9-4d9c-b116-9a31587b1b2a" (UID: "a2deeaad-edf9-4d9c-b116-9a31587b1b2a"). InnerVolumeSpecName "kube-api-access-xftht". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.199530 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "a2deeaad-edf9-4d9c-b116-9a31587b1b2a" (UID: "a2deeaad-edf9-4d9c-b116-9a31587b1b2a"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.209805 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-inventory" (OuterVolumeSpecName: "inventory") pod "a2deeaad-edf9-4d9c-b116-9a31587b1b2a" (UID: "a2deeaad-edf9-4d9c-b116-9a31587b1b2a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.210577 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "a2deeaad-edf9-4d9c-b116-9a31587b1b2a" (UID: "a2deeaad-edf9-4d9c-b116-9a31587b1b2a"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.216168 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "a2deeaad-edf9-4d9c-b116-9a31587b1b2a" (UID: "a2deeaad-edf9-4d9c-b116-9a31587b1b2a"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.216527 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "a2deeaad-edf9-4d9c-b116-9a31587b1b2a" (UID: "a2deeaad-edf9-4d9c-b116-9a31587b1b2a"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.223668 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "a2deeaad-edf9-4d9c-b116-9a31587b1b2a" (UID: "a2deeaad-edf9-4d9c-b116-9a31587b1b2a"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.225874 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a2deeaad-edf9-4d9c-b116-9a31587b1b2a" (UID: "a2deeaad-edf9-4d9c-b116-9a31587b1b2a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.237824 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "a2deeaad-edf9-4d9c-b116-9a31587b1b2a" (UID: "a2deeaad-edf9-4d9c-b116-9a31587b1b2a"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.273230 4751 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.276230 4751 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.276807 4751 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.278143 4751 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.278216 4751 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.278278 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xftht\" (UniqueName: \"kubernetes.io/projected/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-kube-api-access-xftht\") on node \"crc\" DevicePath \"\"" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.278358 4751 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-extra-config-0\") on node 
\"crc\" DevicePath \"\"" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.278428 4751 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.278494 4751 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a2deeaad-edf9-4d9c-b116-9a31587b1b2a-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.616620 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" event={"ID":"a2deeaad-edf9-4d9c-b116-9a31587b1b2a","Type":"ContainerDied","Data":"b4719582d29bd43ce626e235062d7699e23ff3c24a4cd4a3c2c178c5c76aa93d"} Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.616665 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4719582d29bd43ce626e235062d7699e23ff3c24a4cd4a3c2c178c5c76aa93d" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.616734 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vgvng" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.709403 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l"] Dec 03 15:01:51 crc kubenswrapper[4751]: E1203 15:01:51.710065 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bab120ba-b67e-46bf-9d23-359d3119b904" containerName="keystone-cron" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.710161 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="bab120ba-b67e-46bf-9d23-359d3119b904" containerName="keystone-cron" Dec 03 15:01:51 crc kubenswrapper[4751]: E1203 15:01:51.710251 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2deeaad-edf9-4d9c-b116-9a31587b1b2a" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.710316 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2deeaad-edf9-4d9c-b116-9a31587b1b2a" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.710660 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2deeaad-edf9-4d9c-b116-9a31587b1b2a" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.710747 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="bab120ba-b67e-46bf-9d23-359d3119b904" containerName="keystone-cron" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.711495 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.713428 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.714228 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.714527 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.714572 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xf9cf" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.716195 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.720843 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l"] Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.789574 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l\" (UID: \"052552e4-436a-4f3e-a7cd-cacb72ff4f16\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.789657 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l\" (UID: 
\"052552e4-436a-4f3e-a7cd-cacb72ff4f16\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.789677 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws7fq\" (UniqueName: \"kubernetes.io/projected/052552e4-436a-4f3e-a7cd-cacb72ff4f16-kube-api-access-ws7fq\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l\" (UID: \"052552e4-436a-4f3e-a7cd-cacb72ff4f16\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.789702 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l\" (UID: \"052552e4-436a-4f3e-a7cd-cacb72ff4f16\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.789748 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l\" (UID: \"052552e4-436a-4f3e-a7cd-cacb72ff4f16\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.789790 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l\" (UID: \"052552e4-436a-4f3e-a7cd-cacb72ff4f16\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l" Dec 03 
15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.789861 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l\" (UID: \"052552e4-436a-4f3e-a7cd-cacb72ff4f16\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.891812 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l\" (UID: \"052552e4-436a-4f3e-a7cd-cacb72ff4f16\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.891882 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l\" (UID: \"052552e4-436a-4f3e-a7cd-cacb72ff4f16\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.891932 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l\" (UID: \"052552e4-436a-4f3e-a7cd-cacb72ff4f16\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.891953 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ws7fq\" (UniqueName: \"kubernetes.io/projected/052552e4-436a-4f3e-a7cd-cacb72ff4f16-kube-api-access-ws7fq\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l\" (UID: \"052552e4-436a-4f3e-a7cd-cacb72ff4f16\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.891975 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l\" (UID: \"052552e4-436a-4f3e-a7cd-cacb72ff4f16\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.892022 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l\" (UID: \"052552e4-436a-4f3e-a7cd-cacb72ff4f16\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.892061 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l\" (UID: \"052552e4-436a-4f3e-a7cd-cacb72ff4f16\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.895990 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l\" 
(UID: \"052552e4-436a-4f3e-a7cd-cacb72ff4f16\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.896181 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l\" (UID: \"052552e4-436a-4f3e-a7cd-cacb72ff4f16\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.896245 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l\" (UID: \"052552e4-436a-4f3e-a7cd-cacb72ff4f16\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.896604 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l\" (UID: \"052552e4-436a-4f3e-a7cd-cacb72ff4f16\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.897029 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l\" (UID: \"052552e4-436a-4f3e-a7cd-cacb72ff4f16\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.897128 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l\" (UID: \"052552e4-436a-4f3e-a7cd-cacb72ff4f16\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l" Dec 03 15:01:51 crc kubenswrapper[4751]: I1203 15:01:51.912533 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws7fq\" (UniqueName: \"kubernetes.io/projected/052552e4-436a-4f3e-a7cd-cacb72ff4f16-kube-api-access-ws7fq\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l\" (UID: \"052552e4-436a-4f3e-a7cd-cacb72ff4f16\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l" Dec 03 15:01:52 crc kubenswrapper[4751]: I1203 15:01:52.026984 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l" Dec 03 15:01:52 crc kubenswrapper[4751]: I1203 15:01:52.664430 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l"] Dec 03 15:01:53 crc kubenswrapper[4751]: I1203 15:01:53.638894 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l" event={"ID":"052552e4-436a-4f3e-a7cd-cacb72ff4f16","Type":"ContainerStarted","Data":"f1e3d0ba75f3a4ce1d4591f988e4c4f35cf1c7b7dc898426c24cea2c0657a748"} Dec 03 15:01:53 crc kubenswrapper[4751]: I1203 15:01:53.639534 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l" event={"ID":"052552e4-436a-4f3e-a7cd-cacb72ff4f16","Type":"ContainerStarted","Data":"17edb9045f47b7aa1729796d7ca454284b81b5381710a8da3257398ba53b7de5"} Dec 03 15:01:53 crc kubenswrapper[4751]: I1203 15:01:53.663202 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l" podStartSLOduration=2.235319436 podStartE2EDuration="2.663183117s" podCreationTimestamp="2025-12-03 15:01:51 +0000 UTC" firstStartedPulling="2025-12-03 15:01:52.656914007 +0000 UTC m=+2919.645269224" lastFinishedPulling="2025-12-03 15:01:53.084777698 +0000 UTC m=+2920.073132905" observedRunningTime="2025-12-03 15:01:53.653394208 +0000 UTC m=+2920.641749435" watchObservedRunningTime="2025-12-03 15:01:53.663183117 +0000 UTC m=+2920.651538334" Dec 03 15:02:05 crc kubenswrapper[4751]: I1203 15:02:05.819560 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 15:02:05 crc kubenswrapper[4751]: I1203 15:02:05.820167 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 15:02:05 crc kubenswrapper[4751]: I1203 15:02:05.820222 4751 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-djf67" Dec 03 15:02:05 crc kubenswrapper[4751]: I1203 15:02:05.821106 4751 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ba0487b3f47045f8ec369ac1580a04dfd70d8b12f7a6308ce9834ccd20eec3be"} pod="openshift-machine-config-operator/machine-config-daemon-djf67" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 15:02:05 crc kubenswrapper[4751]: I1203 15:02:05.821156 4751 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" containerID="cri-o://ba0487b3f47045f8ec369ac1580a04dfd70d8b12f7a6308ce9834ccd20eec3be" gracePeriod=600 Dec 03 15:02:06 crc kubenswrapper[4751]: I1203 15:02:06.785410 4751 generic.go:334] "Generic (PLEG): container finished" podID="385620eb-744d-423e-b02b-1274f3075689" containerID="ba0487b3f47045f8ec369ac1580a04dfd70d8b12f7a6308ce9834ccd20eec3be" exitCode=0 Dec 03 15:02:06 crc kubenswrapper[4751]: I1203 15:02:06.785932 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerDied","Data":"ba0487b3f47045f8ec369ac1580a04dfd70d8b12f7a6308ce9834ccd20eec3be"} Dec 03 15:02:06 crc kubenswrapper[4751]: I1203 15:02:06.785957 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerStarted","Data":"72aea6609420a53011228084555007c4999bb4a58225ec17805343a049fdc2f3"} Dec 03 15:02:06 crc kubenswrapper[4751]: I1203 15:02:06.785974 4751 scope.go:117] "RemoveContainer" containerID="af5d011ca6687515dae519cbee879a97f83824f387f5bdb764d620efd030477e" Dec 03 15:04:19 crc kubenswrapper[4751]: I1203 15:04:19.394707 4751 generic.go:334] "Generic (PLEG): container finished" podID="052552e4-436a-4f3e-a7cd-cacb72ff4f16" containerID="f1e3d0ba75f3a4ce1d4591f988e4c4f35cf1c7b7dc898426c24cea2c0657a748" exitCode=0 Dec 03 15:04:19 crc kubenswrapper[4751]: I1203 15:04:19.394782 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l" event={"ID":"052552e4-436a-4f3e-a7cd-cacb72ff4f16","Type":"ContainerDied","Data":"f1e3d0ba75f3a4ce1d4591f988e4c4f35cf1c7b7dc898426c24cea2c0657a748"} Dec 
03 15:04:21 crc kubenswrapper[4751]: I1203 15:04:21.037435 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l" Dec 03 15:04:21 crc kubenswrapper[4751]: I1203 15:04:21.216775 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws7fq\" (UniqueName: \"kubernetes.io/projected/052552e4-436a-4f3e-a7cd-cacb72ff4f16-kube-api-access-ws7fq\") pod \"052552e4-436a-4f3e-a7cd-cacb72ff4f16\" (UID: \"052552e4-436a-4f3e-a7cd-cacb72ff4f16\") " Dec 03 15:04:21 crc kubenswrapper[4751]: I1203 15:04:21.217170 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-ceilometer-compute-config-data-2\") pod \"052552e4-436a-4f3e-a7cd-cacb72ff4f16\" (UID: \"052552e4-436a-4f3e-a7cd-cacb72ff4f16\") " Dec 03 15:04:21 crc kubenswrapper[4751]: I1203 15:04:21.217196 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-ceilometer-compute-config-data-1\") pod \"052552e4-436a-4f3e-a7cd-cacb72ff4f16\" (UID: \"052552e4-436a-4f3e-a7cd-cacb72ff4f16\") " Dec 03 15:04:21 crc kubenswrapper[4751]: I1203 15:04:21.217237 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-ssh-key\") pod \"052552e4-436a-4f3e-a7cd-cacb72ff4f16\" (UID: \"052552e4-436a-4f3e-a7cd-cacb72ff4f16\") " Dec 03 15:04:21 crc kubenswrapper[4751]: I1203 15:04:21.217258 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-ceilometer-compute-config-data-0\") pod 
\"052552e4-436a-4f3e-a7cd-cacb72ff4f16\" (UID: \"052552e4-436a-4f3e-a7cd-cacb72ff4f16\") " Dec 03 15:04:21 crc kubenswrapper[4751]: I1203 15:04:21.217349 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-inventory\") pod \"052552e4-436a-4f3e-a7cd-cacb72ff4f16\" (UID: \"052552e4-436a-4f3e-a7cd-cacb72ff4f16\") " Dec 03 15:04:21 crc kubenswrapper[4751]: I1203 15:04:21.217437 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-telemetry-combined-ca-bundle\") pod \"052552e4-436a-4f3e-a7cd-cacb72ff4f16\" (UID: \"052552e4-436a-4f3e-a7cd-cacb72ff4f16\") " Dec 03 15:04:21 crc kubenswrapper[4751]: I1203 15:04:21.223240 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/052552e4-436a-4f3e-a7cd-cacb72ff4f16-kube-api-access-ws7fq" (OuterVolumeSpecName: "kube-api-access-ws7fq") pod "052552e4-436a-4f3e-a7cd-cacb72ff4f16" (UID: "052552e4-436a-4f3e-a7cd-cacb72ff4f16"). InnerVolumeSpecName "kube-api-access-ws7fq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:04:21 crc kubenswrapper[4751]: I1203 15:04:21.223411 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "052552e4-436a-4f3e-a7cd-cacb72ff4f16" (UID: "052552e4-436a-4f3e-a7cd-cacb72ff4f16"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 15:04:21 crc kubenswrapper[4751]: I1203 15:04:21.247738 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "052552e4-436a-4f3e-a7cd-cacb72ff4f16" (UID: "052552e4-436a-4f3e-a7cd-cacb72ff4f16"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 15:04:21 crc kubenswrapper[4751]: I1203 15:04:21.251361 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "052552e4-436a-4f3e-a7cd-cacb72ff4f16" (UID: "052552e4-436a-4f3e-a7cd-cacb72ff4f16"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 15:04:21 crc kubenswrapper[4751]: I1203 15:04:21.262805 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "052552e4-436a-4f3e-a7cd-cacb72ff4f16" (UID: "052552e4-436a-4f3e-a7cd-cacb72ff4f16"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 15:04:21 crc kubenswrapper[4751]: I1203 15:04:21.262920 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "052552e4-436a-4f3e-a7cd-cacb72ff4f16" (UID: "052552e4-436a-4f3e-a7cd-cacb72ff4f16"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 15:04:21 crc kubenswrapper[4751]: I1203 15:04:21.265838 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-inventory" (OuterVolumeSpecName: "inventory") pod "052552e4-436a-4f3e-a7cd-cacb72ff4f16" (UID: "052552e4-436a-4f3e-a7cd-cacb72ff4f16"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 15:04:21 crc kubenswrapper[4751]: I1203 15:04:21.320619 4751 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 15:04:21 crc kubenswrapper[4751]: I1203 15:04:21.320666 4751 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 15:04:21 crc kubenswrapper[4751]: I1203 15:04:21.320678 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws7fq\" (UniqueName: \"kubernetes.io/projected/052552e4-436a-4f3e-a7cd-cacb72ff4f16-kube-api-access-ws7fq\") on node \"crc\" DevicePath \"\"" Dec 03 15:04:21 crc kubenswrapper[4751]: I1203 15:04:21.320686 4751 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 03 15:04:21 crc kubenswrapper[4751]: I1203 15:04:21.320696 4751 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 03 15:04:21 crc kubenswrapper[4751]: I1203 15:04:21.320725 4751 reconciler_common.go:293] 
"Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 15:04:21 crc kubenswrapper[4751]: I1203 15:04:21.320734 4751 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/052552e4-436a-4f3e-a7cd-cacb72ff4f16-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 03 15:04:21 crc kubenswrapper[4751]: I1203 15:04:21.416132 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l" event={"ID":"052552e4-436a-4f3e-a7cd-cacb72ff4f16","Type":"ContainerDied","Data":"17edb9045f47b7aa1729796d7ca454284b81b5381710a8da3257398ba53b7de5"} Dec 03 15:04:21 crc kubenswrapper[4751]: I1203 15:04:21.416183 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17edb9045f47b7aa1729796d7ca454284b81b5381710a8da3257398ba53b7de5" Dec 03 15:04:21 crc kubenswrapper[4751]: I1203 15:04:21.416235 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l" Dec 03 15:04:36 crc kubenswrapper[4751]: I1203 15:04:35.819721 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 15:04:36 crc kubenswrapper[4751]: I1203 15:04:35.820458 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.723524 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 03 15:05:03 crc kubenswrapper[4751]: E1203 15:05:03.724636 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="052552e4-436a-4f3e-a7cd-cacb72ff4f16" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.724659 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="052552e4-436a-4f3e-a7cd-cacb72ff4f16" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.724948 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="052552e4-436a-4f3e-a7cd-cacb72ff4f16" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.725918 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.729775 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.729884 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.730432 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-vsb8b" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.730748 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.736121 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.749641 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa32bbce-059c-46e3-a8d7-f737d93e394e-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " pod="openstack/tempest-tests-tempest" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.749710 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa32bbce-059c-46e3-a8d7-f737d93e394e-config-data\") pod \"tempest-tests-tempest\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " pod="openstack/tempest-tests-tempest" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.749801 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aa32bbce-059c-46e3-a8d7-f737d93e394e-openstack-config-secret\") pod 
\"tempest-tests-tempest\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " pod="openstack/tempest-tests-tempest" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.749831 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/aa32bbce-059c-46e3-a8d7-f737d93e394e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " pod="openstack/tempest-tests-tempest" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.749942 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aa32bbce-059c-46e3-a8d7-f737d93e394e-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " pod="openstack/tempest-tests-tempest" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.749971 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/aa32bbce-059c-46e3-a8d7-f737d93e394e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " pod="openstack/tempest-tests-tempest" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.750038 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " pod="openstack/tempest-tests-tempest" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.750356 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl65m\" (UniqueName: 
\"kubernetes.io/projected/aa32bbce-059c-46e3-a8d7-f737d93e394e-kube-api-access-gl65m\") pod \"tempest-tests-tempest\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " pod="openstack/tempest-tests-tempest" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.750477 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/aa32bbce-059c-46e3-a8d7-f737d93e394e-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " pod="openstack/tempest-tests-tempest" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.853526 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aa32bbce-059c-46e3-a8d7-f737d93e394e-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " pod="openstack/tempest-tests-tempest" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.853594 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/aa32bbce-059c-46e3-a8d7-f737d93e394e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " pod="openstack/tempest-tests-tempest" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.854390 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/aa32bbce-059c-46e3-a8d7-f737d93e394e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " pod="openstack/tempest-tests-tempest" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.854664 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/aa32bbce-059c-46e3-a8d7-f737d93e394e-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " pod="openstack/tempest-tests-tempest" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.855957 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " pod="openstack/tempest-tests-tempest" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.856091 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl65m\" (UniqueName: \"kubernetes.io/projected/aa32bbce-059c-46e3-a8d7-f737d93e394e-kube-api-access-gl65m\") pod \"tempest-tests-tempest\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " pod="openstack/tempest-tests-tempest" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.856386 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/tempest-tests-tempest" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.856471 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/aa32bbce-059c-46e3-a8d7-f737d93e394e-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " pod="openstack/tempest-tests-tempest" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.857290 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa32bbce-059c-46e3-a8d7-f737d93e394e-ssh-key\") pod \"tempest-tests-tempest\" (UID: 
\"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " pod="openstack/tempest-tests-tempest" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.857381 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa32bbce-059c-46e3-a8d7-f737d93e394e-config-data\") pod \"tempest-tests-tempest\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " pod="openstack/tempest-tests-tempest" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.857545 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aa32bbce-059c-46e3-a8d7-f737d93e394e-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " pod="openstack/tempest-tests-tempest" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.857954 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/aa32bbce-059c-46e3-a8d7-f737d93e394e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " pod="openstack/tempest-tests-tempest" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.858466 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/aa32bbce-059c-46e3-a8d7-f737d93e394e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " pod="openstack/tempest-tests-tempest" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.859230 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa32bbce-059c-46e3-a8d7-f737d93e394e-config-data\") pod \"tempest-tests-tempest\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " 
pod="openstack/tempest-tests-tempest" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.863872 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa32bbce-059c-46e3-a8d7-f737d93e394e-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " pod="openstack/tempest-tests-tempest" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.863957 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aa32bbce-059c-46e3-a8d7-f737d93e394e-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " pod="openstack/tempest-tests-tempest" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.864040 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/aa32bbce-059c-46e3-a8d7-f737d93e394e-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " pod="openstack/tempest-tests-tempest" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.874669 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl65m\" (UniqueName: \"kubernetes.io/projected/aa32bbce-059c-46e3-a8d7-f737d93e394e-kube-api-access-gl65m\") pod \"tempest-tests-tempest\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " pod="openstack/tempest-tests-tempest" Dec 03 15:05:03 crc kubenswrapper[4751]: I1203 15:05:03.892769 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " pod="openstack/tempest-tests-tempest" Dec 03 15:05:04 crc kubenswrapper[4751]: I1203 15:05:04.058139 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 03 15:05:04 crc kubenswrapper[4751]: I1203 15:05:04.551555 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 03 15:05:04 crc kubenswrapper[4751]: I1203 15:05:04.566778 4751 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 15:05:04 crc kubenswrapper[4751]: I1203 15:05:04.879300 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"aa32bbce-059c-46e3-a8d7-f737d93e394e","Type":"ContainerStarted","Data":"ad016f93387d4cef09ffab642fa4a34ce3dc40f4d54d3e50d68db4b935093c23"} Dec 03 15:05:05 crc kubenswrapper[4751]: I1203 15:05:05.819946 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 15:05:05 crc kubenswrapper[4751]: I1203 15:05:05.820286 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 15:05:35 crc kubenswrapper[4751]: I1203 15:05:35.819855 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 15:05:35 crc kubenswrapper[4751]: I1203 15:05:35.820547 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" 
podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 15:05:35 crc kubenswrapper[4751]: I1203 15:05:35.820628 4751 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-djf67" Dec 03 15:05:35 crc kubenswrapper[4751]: I1203 15:05:35.821547 4751 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"72aea6609420a53011228084555007c4999bb4a58225ec17805343a049fdc2f3"} pod="openshift-machine-config-operator/machine-config-daemon-djf67" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 15:05:35 crc kubenswrapper[4751]: I1203 15:05:35.821606 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" containerID="cri-o://72aea6609420a53011228084555007c4999bb4a58225ec17805343a049fdc2f3" gracePeriod=600 Dec 03 15:05:36 crc kubenswrapper[4751]: I1203 15:05:36.238291 4751 generic.go:334] "Generic (PLEG): container finished" podID="385620eb-744d-423e-b02b-1274f3075689" containerID="72aea6609420a53011228084555007c4999bb4a58225ec17805343a049fdc2f3" exitCode=0 Dec 03 15:05:36 crc kubenswrapper[4751]: I1203 15:05:36.238367 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerDied","Data":"72aea6609420a53011228084555007c4999bb4a58225ec17805343a049fdc2f3"} Dec 03 15:05:36 crc kubenswrapper[4751]: I1203 15:05:36.238423 4751 scope.go:117] "RemoveContainer" 
containerID="ba0487b3f47045f8ec369ac1580a04dfd70d8b12f7a6308ce9834ccd20eec3be" Dec 03 15:05:38 crc kubenswrapper[4751]: E1203 15:05:38.542490 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:05:38 crc kubenswrapper[4751]: E1203 15:05:38.611232 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 03 15:05:38 crc kubenswrapper[4751]: E1203 15:05:38.611410 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPa
th:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gl65m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
tempest-tests-tempest_openstack(aa32bbce-059c-46e3-a8d7-f737d93e394e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 15:05:38 crc kubenswrapper[4751]: E1203 15:05:38.612619 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="aa32bbce-059c-46e3-a8d7-f737d93e394e" Dec 03 15:05:39 crc kubenswrapper[4751]: I1203 15:05:39.283124 4751 scope.go:117] "RemoveContainer" containerID="72aea6609420a53011228084555007c4999bb4a58225ec17805343a049fdc2f3" Dec 03 15:05:39 crc kubenswrapper[4751]: E1203 15:05:39.284191 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:05:39 crc kubenswrapper[4751]: E1203 15:05:39.286089 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="aa32bbce-059c-46e3-a8d7-f737d93e394e" Dec 03 15:05:51 crc kubenswrapper[4751]: I1203 15:05:51.313645 4751 scope.go:117] "RemoveContainer" containerID="72aea6609420a53011228084555007c4999bb4a58225ec17805343a049fdc2f3" Dec 03 15:05:51 crc kubenswrapper[4751]: E1203 15:05:51.314423 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:05:54 crc kubenswrapper[4751]: I1203 15:05:54.802421 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 03 15:05:56 crc kubenswrapper[4751]: I1203 15:05:56.440233 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"aa32bbce-059c-46e3-a8d7-f737d93e394e","Type":"ContainerStarted","Data":"ba875bd06540e505d8682a589a909ad6c1dbb922fa7628c68cd5aeb250776596"} Dec 03 15:05:56 crc kubenswrapper[4751]: I1203 15:05:56.468437 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.236130363 podStartE2EDuration="54.46841338s" podCreationTimestamp="2025-12-03 15:05:02 +0000 UTC" firstStartedPulling="2025-12-03 15:05:04.56655529 +0000 UTC m=+3111.554910507" lastFinishedPulling="2025-12-03 15:05:54.798838307 +0000 UTC m=+3161.787193524" observedRunningTime="2025-12-03 15:05:56.456568936 +0000 UTC m=+3163.444924163" watchObservedRunningTime="2025-12-03 15:05:56.46841338 +0000 UTC m=+3163.456768627" Dec 03 15:06:04 crc kubenswrapper[4751]: I1203 15:06:04.314567 4751 scope.go:117] "RemoveContainer" containerID="72aea6609420a53011228084555007c4999bb4a58225ec17805343a049fdc2f3" Dec 03 15:06:04 crc kubenswrapper[4751]: E1203 15:06:04.315364 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:06:17 crc kubenswrapper[4751]: I1203 15:06:17.316778 4751 scope.go:117] "RemoveContainer" containerID="72aea6609420a53011228084555007c4999bb4a58225ec17805343a049fdc2f3" Dec 03 15:06:17 crc kubenswrapper[4751]: E1203 15:06:17.318012 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:06:28 crc kubenswrapper[4751]: I1203 15:06:28.315835 4751 scope.go:117] "RemoveContainer" containerID="72aea6609420a53011228084555007c4999bb4a58225ec17805343a049fdc2f3" Dec 03 15:06:28 crc kubenswrapper[4751]: E1203 15:06:28.316642 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:06:42 crc kubenswrapper[4751]: I1203 15:06:42.315084 4751 scope.go:117] "RemoveContainer" containerID="72aea6609420a53011228084555007c4999bb4a58225ec17805343a049fdc2f3" Dec 03 15:06:42 crc kubenswrapper[4751]: E1203 15:06:42.315833 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:06:48 crc kubenswrapper[4751]: I1203 15:06:48.704179 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pfpxk"] Dec 03 15:06:48 crc kubenswrapper[4751]: I1203 15:06:48.708712 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pfpxk" Dec 03 15:06:48 crc kubenswrapper[4751]: I1203 15:06:48.714292 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pfpxk"] Dec 03 15:06:48 crc kubenswrapper[4751]: I1203 15:06:48.815983 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqld6\" (UniqueName: \"kubernetes.io/projected/df1f26eb-de03-4959-8b97-89d635b44b0d-kube-api-access-tqld6\") pod \"redhat-marketplace-pfpxk\" (UID: \"df1f26eb-de03-4959-8b97-89d635b44b0d\") " pod="openshift-marketplace/redhat-marketplace-pfpxk" Dec 03 15:06:48 crc kubenswrapper[4751]: I1203 15:06:48.816692 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df1f26eb-de03-4959-8b97-89d635b44b0d-catalog-content\") pod \"redhat-marketplace-pfpxk\" (UID: \"df1f26eb-de03-4959-8b97-89d635b44b0d\") " pod="openshift-marketplace/redhat-marketplace-pfpxk" Dec 03 15:06:48 crc kubenswrapper[4751]: I1203 15:06:48.817036 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df1f26eb-de03-4959-8b97-89d635b44b0d-utilities\") pod \"redhat-marketplace-pfpxk\" (UID: \"df1f26eb-de03-4959-8b97-89d635b44b0d\") " pod="openshift-marketplace/redhat-marketplace-pfpxk" Dec 
03 15:06:48 crc kubenswrapper[4751]: I1203 15:06:48.920130 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df1f26eb-de03-4959-8b97-89d635b44b0d-catalog-content\") pod \"redhat-marketplace-pfpxk\" (UID: \"df1f26eb-de03-4959-8b97-89d635b44b0d\") " pod="openshift-marketplace/redhat-marketplace-pfpxk" Dec 03 15:06:48 crc kubenswrapper[4751]: I1203 15:06:48.920519 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df1f26eb-de03-4959-8b97-89d635b44b0d-utilities\") pod \"redhat-marketplace-pfpxk\" (UID: \"df1f26eb-de03-4959-8b97-89d635b44b0d\") " pod="openshift-marketplace/redhat-marketplace-pfpxk" Dec 03 15:06:48 crc kubenswrapper[4751]: I1203 15:06:48.920791 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqld6\" (UniqueName: \"kubernetes.io/projected/df1f26eb-de03-4959-8b97-89d635b44b0d-kube-api-access-tqld6\") pod \"redhat-marketplace-pfpxk\" (UID: \"df1f26eb-de03-4959-8b97-89d635b44b0d\") " pod="openshift-marketplace/redhat-marketplace-pfpxk" Dec 03 15:06:48 crc kubenswrapper[4751]: I1203 15:06:48.920975 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df1f26eb-de03-4959-8b97-89d635b44b0d-catalog-content\") pod \"redhat-marketplace-pfpxk\" (UID: \"df1f26eb-de03-4959-8b97-89d635b44b0d\") " pod="openshift-marketplace/redhat-marketplace-pfpxk" Dec 03 15:06:48 crc kubenswrapper[4751]: I1203 15:06:48.921145 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df1f26eb-de03-4959-8b97-89d635b44b0d-utilities\") pod \"redhat-marketplace-pfpxk\" (UID: \"df1f26eb-de03-4959-8b97-89d635b44b0d\") " pod="openshift-marketplace/redhat-marketplace-pfpxk" Dec 03 15:06:48 crc kubenswrapper[4751]: I1203 
15:06:48.947408 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqld6\" (UniqueName: \"kubernetes.io/projected/df1f26eb-de03-4959-8b97-89d635b44b0d-kube-api-access-tqld6\") pod \"redhat-marketplace-pfpxk\" (UID: \"df1f26eb-de03-4959-8b97-89d635b44b0d\") " pod="openshift-marketplace/redhat-marketplace-pfpxk" Dec 03 15:06:49 crc kubenswrapper[4751]: I1203 15:06:49.065309 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pfpxk" Dec 03 15:06:49 crc kubenswrapper[4751]: I1203 15:06:49.623689 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pfpxk"] Dec 03 15:06:49 crc kubenswrapper[4751]: W1203 15:06:49.626406 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf1f26eb_de03_4959_8b97_89d635b44b0d.slice/crio-77983ba4b8f84935a4cd3074b66e0a1654e49a73f19c21f8f87cda940818f36d WatchSource:0}: Error finding container 77983ba4b8f84935a4cd3074b66e0a1654e49a73f19c21f8f87cda940818f36d: Status 404 returned error can't find the container with id 77983ba4b8f84935a4cd3074b66e0a1654e49a73f19c21f8f87cda940818f36d Dec 03 15:06:50 crc kubenswrapper[4751]: I1203 15:06:50.042722 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pfpxk" event={"ID":"df1f26eb-de03-4959-8b97-89d635b44b0d","Type":"ContainerStarted","Data":"77983ba4b8f84935a4cd3074b66e0a1654e49a73f19c21f8f87cda940818f36d"} Dec 03 15:06:51 crc kubenswrapper[4751]: I1203 15:06:51.053162 4751 generic.go:334] "Generic (PLEG): container finished" podID="df1f26eb-de03-4959-8b97-89d635b44b0d" containerID="0313a50b4fdbc9108a63dcdde2adab6c94395b95697a5474655bb87072af606f" exitCode=0 Dec 03 15:06:51 crc kubenswrapper[4751]: I1203 15:06:51.053249 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-pfpxk" event={"ID":"df1f26eb-de03-4959-8b97-89d635b44b0d","Type":"ContainerDied","Data":"0313a50b4fdbc9108a63dcdde2adab6c94395b95697a5474655bb87072af606f"} Dec 03 15:06:53 crc kubenswrapper[4751]: I1203 15:06:53.075616 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pfpxk" event={"ID":"df1f26eb-de03-4959-8b97-89d635b44b0d","Type":"ContainerStarted","Data":"d6e5912ef5388ee43fe1dc066ca746d7c0c3a3b8df4409d5dafc244426ea0617"} Dec 03 15:06:54 crc kubenswrapper[4751]: I1203 15:06:54.088062 4751 generic.go:334] "Generic (PLEG): container finished" podID="df1f26eb-de03-4959-8b97-89d635b44b0d" containerID="d6e5912ef5388ee43fe1dc066ca746d7c0c3a3b8df4409d5dafc244426ea0617" exitCode=0 Dec 03 15:06:54 crc kubenswrapper[4751]: I1203 15:06:54.088108 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pfpxk" event={"ID":"df1f26eb-de03-4959-8b97-89d635b44b0d","Type":"ContainerDied","Data":"d6e5912ef5388ee43fe1dc066ca746d7c0c3a3b8df4409d5dafc244426ea0617"} Dec 03 15:06:55 crc kubenswrapper[4751]: I1203 15:06:55.314537 4751 scope.go:117] "RemoveContainer" containerID="72aea6609420a53011228084555007c4999bb4a58225ec17805343a049fdc2f3" Dec 03 15:06:55 crc kubenswrapper[4751]: E1203 15:06:55.315288 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:06:57 crc kubenswrapper[4751]: I1203 15:06:57.118985 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pfpxk" 
event={"ID":"df1f26eb-de03-4959-8b97-89d635b44b0d","Type":"ContainerStarted","Data":"4523b4af03310e8964acc9be1eb22aa2bcd71b51b5d03e1771acceae3a81a17b"} Dec 03 15:06:57 crc kubenswrapper[4751]: I1203 15:06:57.140122 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pfpxk" podStartSLOduration=4.113986341 podStartE2EDuration="9.140107905s" podCreationTimestamp="2025-12-03 15:06:48 +0000 UTC" firstStartedPulling="2025-12-03 15:06:51.054705954 +0000 UTC m=+3218.043061171" lastFinishedPulling="2025-12-03 15:06:56.080827518 +0000 UTC m=+3223.069182735" observedRunningTime="2025-12-03 15:06:57.138480652 +0000 UTC m=+3224.126835889" watchObservedRunningTime="2025-12-03 15:06:57.140107905 +0000 UTC m=+3224.128463122" Dec 03 15:06:59 crc kubenswrapper[4751]: I1203 15:06:59.066441 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pfpxk" Dec 03 15:06:59 crc kubenswrapper[4751]: I1203 15:06:59.066838 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pfpxk" Dec 03 15:06:59 crc kubenswrapper[4751]: I1203 15:06:59.116587 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pfpxk" Dec 03 15:07:06 crc kubenswrapper[4751]: I1203 15:07:06.314364 4751 scope.go:117] "RemoveContainer" containerID="72aea6609420a53011228084555007c4999bb4a58225ec17805343a049fdc2f3" Dec 03 15:07:06 crc kubenswrapper[4751]: E1203 15:07:06.315093 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" 
podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:07:09 crc kubenswrapper[4751]: I1203 15:07:09.123302 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pfpxk" Dec 03 15:07:09 crc kubenswrapper[4751]: I1203 15:07:09.182986 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pfpxk"] Dec 03 15:07:09 crc kubenswrapper[4751]: I1203 15:07:09.237525 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pfpxk" podUID="df1f26eb-de03-4959-8b97-89d635b44b0d" containerName="registry-server" containerID="cri-o://4523b4af03310e8964acc9be1eb22aa2bcd71b51b5d03e1771acceae3a81a17b" gracePeriod=2 Dec 03 15:07:10 crc kubenswrapper[4751]: I1203 15:07:10.248798 4751 generic.go:334] "Generic (PLEG): container finished" podID="df1f26eb-de03-4959-8b97-89d635b44b0d" containerID="4523b4af03310e8964acc9be1eb22aa2bcd71b51b5d03e1771acceae3a81a17b" exitCode=0 Dec 03 15:07:10 crc kubenswrapper[4751]: I1203 15:07:10.248942 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pfpxk" event={"ID":"df1f26eb-de03-4959-8b97-89d635b44b0d","Type":"ContainerDied","Data":"4523b4af03310e8964acc9be1eb22aa2bcd71b51b5d03e1771acceae3a81a17b"} Dec 03 15:07:10 crc kubenswrapper[4751]: I1203 15:07:10.389583 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pfpxk" Dec 03 15:07:10 crc kubenswrapper[4751]: I1203 15:07:10.493300 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df1f26eb-de03-4959-8b97-89d635b44b0d-catalog-content\") pod \"df1f26eb-de03-4959-8b97-89d635b44b0d\" (UID: \"df1f26eb-de03-4959-8b97-89d635b44b0d\") " Dec 03 15:07:10 crc kubenswrapper[4751]: I1203 15:07:10.494529 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqld6\" (UniqueName: \"kubernetes.io/projected/df1f26eb-de03-4959-8b97-89d635b44b0d-kube-api-access-tqld6\") pod \"df1f26eb-de03-4959-8b97-89d635b44b0d\" (UID: \"df1f26eb-de03-4959-8b97-89d635b44b0d\") " Dec 03 15:07:10 crc kubenswrapper[4751]: I1203 15:07:10.494675 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df1f26eb-de03-4959-8b97-89d635b44b0d-utilities\") pod \"df1f26eb-de03-4959-8b97-89d635b44b0d\" (UID: \"df1f26eb-de03-4959-8b97-89d635b44b0d\") " Dec 03 15:07:10 crc kubenswrapper[4751]: I1203 15:07:10.495184 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df1f26eb-de03-4959-8b97-89d635b44b0d-utilities" (OuterVolumeSpecName: "utilities") pod "df1f26eb-de03-4959-8b97-89d635b44b0d" (UID: "df1f26eb-de03-4959-8b97-89d635b44b0d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:07:10 crc kubenswrapper[4751]: I1203 15:07:10.495694 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df1f26eb-de03-4959-8b97-89d635b44b0d-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 15:07:10 crc kubenswrapper[4751]: I1203 15:07:10.505631 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df1f26eb-de03-4959-8b97-89d635b44b0d-kube-api-access-tqld6" (OuterVolumeSpecName: "kube-api-access-tqld6") pod "df1f26eb-de03-4959-8b97-89d635b44b0d" (UID: "df1f26eb-de03-4959-8b97-89d635b44b0d"). InnerVolumeSpecName "kube-api-access-tqld6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:07:10 crc kubenswrapper[4751]: I1203 15:07:10.511568 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df1f26eb-de03-4959-8b97-89d635b44b0d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df1f26eb-de03-4959-8b97-89d635b44b0d" (UID: "df1f26eb-de03-4959-8b97-89d635b44b0d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:07:10 crc kubenswrapper[4751]: I1203 15:07:10.597526 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqld6\" (UniqueName: \"kubernetes.io/projected/df1f26eb-de03-4959-8b97-89d635b44b0d-kube-api-access-tqld6\") on node \"crc\" DevicePath \"\"" Dec 03 15:07:10 crc kubenswrapper[4751]: I1203 15:07:10.597583 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df1f26eb-de03-4959-8b97-89d635b44b0d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 15:07:11 crc kubenswrapper[4751]: I1203 15:07:11.265511 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pfpxk" event={"ID":"df1f26eb-de03-4959-8b97-89d635b44b0d","Type":"ContainerDied","Data":"77983ba4b8f84935a4cd3074b66e0a1654e49a73f19c21f8f87cda940818f36d"} Dec 03 15:07:11 crc kubenswrapper[4751]: I1203 15:07:11.265575 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pfpxk" Dec 03 15:07:11 crc kubenswrapper[4751]: I1203 15:07:11.265610 4751 scope.go:117] "RemoveContainer" containerID="4523b4af03310e8964acc9be1eb22aa2bcd71b51b5d03e1771acceae3a81a17b" Dec 03 15:07:11 crc kubenswrapper[4751]: I1203 15:07:11.301048 4751 scope.go:117] "RemoveContainer" containerID="d6e5912ef5388ee43fe1dc066ca746d7c0c3a3b8df4409d5dafc244426ea0617" Dec 03 15:07:11 crc kubenswrapper[4751]: I1203 15:07:11.321273 4751 scope.go:117] "RemoveContainer" containerID="0313a50b4fdbc9108a63dcdde2adab6c94395b95697a5474655bb87072af606f" Dec 03 15:07:11 crc kubenswrapper[4751]: I1203 15:07:11.347551 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pfpxk"] Dec 03 15:07:11 crc kubenswrapper[4751]: I1203 15:07:11.349182 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pfpxk"] Dec 03 15:07:13 crc kubenswrapper[4751]: I1203 15:07:13.326955 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df1f26eb-de03-4959-8b97-89d635b44b0d" path="/var/lib/kubelet/pods/df1f26eb-de03-4959-8b97-89d635b44b0d/volumes" Dec 03 15:07:17 crc kubenswrapper[4751]: I1203 15:07:17.314445 4751 scope.go:117] "RemoveContainer" containerID="72aea6609420a53011228084555007c4999bb4a58225ec17805343a049fdc2f3" Dec 03 15:07:17 crc kubenswrapper[4751]: E1203 15:07:17.315215 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:07:30 crc kubenswrapper[4751]: I1203 15:07:30.314307 4751 scope.go:117] "RemoveContainer" 
containerID="72aea6609420a53011228084555007c4999bb4a58225ec17805343a049fdc2f3" Dec 03 15:07:30 crc kubenswrapper[4751]: E1203 15:07:30.315425 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:07:31 crc kubenswrapper[4751]: I1203 15:07:31.793204 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-66rjf"] Dec 03 15:07:31 crc kubenswrapper[4751]: E1203 15:07:31.793698 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df1f26eb-de03-4959-8b97-89d635b44b0d" containerName="extract-content" Dec 03 15:07:31 crc kubenswrapper[4751]: I1203 15:07:31.793711 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="df1f26eb-de03-4959-8b97-89d635b44b0d" containerName="extract-content" Dec 03 15:07:31 crc kubenswrapper[4751]: E1203 15:07:31.793731 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df1f26eb-de03-4959-8b97-89d635b44b0d" containerName="registry-server" Dec 03 15:07:31 crc kubenswrapper[4751]: I1203 15:07:31.793737 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="df1f26eb-de03-4959-8b97-89d635b44b0d" containerName="registry-server" Dec 03 15:07:31 crc kubenswrapper[4751]: E1203 15:07:31.793750 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df1f26eb-de03-4959-8b97-89d635b44b0d" containerName="extract-utilities" Dec 03 15:07:31 crc kubenswrapper[4751]: I1203 15:07:31.793757 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="df1f26eb-de03-4959-8b97-89d635b44b0d" containerName="extract-utilities" Dec 03 15:07:31 crc kubenswrapper[4751]: I1203 15:07:31.793984 
4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="df1f26eb-de03-4959-8b97-89d635b44b0d" containerName="registry-server" Dec 03 15:07:31 crc kubenswrapper[4751]: I1203 15:07:31.795556 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-66rjf" Dec 03 15:07:31 crc kubenswrapper[4751]: I1203 15:07:31.819345 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-66rjf"] Dec 03 15:07:31 crc kubenswrapper[4751]: I1203 15:07:31.866145 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckbtj\" (UniqueName: \"kubernetes.io/projected/cc8899a3-1760-4813-8295-dbcfdfb8a113-kube-api-access-ckbtj\") pod \"certified-operators-66rjf\" (UID: \"cc8899a3-1760-4813-8295-dbcfdfb8a113\") " pod="openshift-marketplace/certified-operators-66rjf" Dec 03 15:07:31 crc kubenswrapper[4751]: I1203 15:07:31.866537 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc8899a3-1760-4813-8295-dbcfdfb8a113-utilities\") pod \"certified-operators-66rjf\" (UID: \"cc8899a3-1760-4813-8295-dbcfdfb8a113\") " pod="openshift-marketplace/certified-operators-66rjf" Dec 03 15:07:31 crc kubenswrapper[4751]: I1203 15:07:31.866872 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc8899a3-1760-4813-8295-dbcfdfb8a113-catalog-content\") pod \"certified-operators-66rjf\" (UID: \"cc8899a3-1760-4813-8295-dbcfdfb8a113\") " pod="openshift-marketplace/certified-operators-66rjf" Dec 03 15:07:31 crc kubenswrapper[4751]: I1203 15:07:31.969635 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/cc8899a3-1760-4813-8295-dbcfdfb8a113-catalog-content\") pod \"certified-operators-66rjf\" (UID: \"cc8899a3-1760-4813-8295-dbcfdfb8a113\") " pod="openshift-marketplace/certified-operators-66rjf" Dec 03 15:07:31 crc kubenswrapper[4751]: I1203 15:07:31.969771 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckbtj\" (UniqueName: \"kubernetes.io/projected/cc8899a3-1760-4813-8295-dbcfdfb8a113-kube-api-access-ckbtj\") pod \"certified-operators-66rjf\" (UID: \"cc8899a3-1760-4813-8295-dbcfdfb8a113\") " pod="openshift-marketplace/certified-operators-66rjf" Dec 03 15:07:31 crc kubenswrapper[4751]: I1203 15:07:31.969797 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc8899a3-1760-4813-8295-dbcfdfb8a113-utilities\") pod \"certified-operators-66rjf\" (UID: \"cc8899a3-1760-4813-8295-dbcfdfb8a113\") " pod="openshift-marketplace/certified-operators-66rjf" Dec 03 15:07:31 crc kubenswrapper[4751]: I1203 15:07:31.970217 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc8899a3-1760-4813-8295-dbcfdfb8a113-catalog-content\") pod \"certified-operators-66rjf\" (UID: \"cc8899a3-1760-4813-8295-dbcfdfb8a113\") " pod="openshift-marketplace/certified-operators-66rjf" Dec 03 15:07:31 crc kubenswrapper[4751]: I1203 15:07:31.970645 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc8899a3-1760-4813-8295-dbcfdfb8a113-utilities\") pod \"certified-operators-66rjf\" (UID: \"cc8899a3-1760-4813-8295-dbcfdfb8a113\") " pod="openshift-marketplace/certified-operators-66rjf" Dec 03 15:07:31 crc kubenswrapper[4751]: I1203 15:07:31.989225 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckbtj\" (UniqueName: 
\"kubernetes.io/projected/cc8899a3-1760-4813-8295-dbcfdfb8a113-kube-api-access-ckbtj\") pod \"certified-operators-66rjf\" (UID: \"cc8899a3-1760-4813-8295-dbcfdfb8a113\") " pod="openshift-marketplace/certified-operators-66rjf" Dec 03 15:07:32 crc kubenswrapper[4751]: I1203 15:07:32.155647 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-66rjf" Dec 03 15:07:32 crc kubenswrapper[4751]: I1203 15:07:32.826923 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-66rjf"] Dec 03 15:07:33 crc kubenswrapper[4751]: I1203 15:07:33.519203 4751 generic.go:334] "Generic (PLEG): container finished" podID="cc8899a3-1760-4813-8295-dbcfdfb8a113" containerID="78844ab5a022ee84286850775ed94985a3ccf0ae9027bfecdb12aa1ebed5045a" exitCode=0 Dec 03 15:07:33 crc kubenswrapper[4751]: I1203 15:07:33.519245 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66rjf" event={"ID":"cc8899a3-1760-4813-8295-dbcfdfb8a113","Type":"ContainerDied","Data":"78844ab5a022ee84286850775ed94985a3ccf0ae9027bfecdb12aa1ebed5045a"} Dec 03 15:07:33 crc kubenswrapper[4751]: I1203 15:07:33.519270 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66rjf" event={"ID":"cc8899a3-1760-4813-8295-dbcfdfb8a113","Type":"ContainerStarted","Data":"c4eb521c5b77ff6d961f4b23aac6ddd4701f0c3e1dffddf0952eadacd486e0c7"} Dec 03 15:07:35 crc kubenswrapper[4751]: I1203 15:07:35.556674 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66rjf" event={"ID":"cc8899a3-1760-4813-8295-dbcfdfb8a113","Type":"ContainerStarted","Data":"54b3e86250486a9dee0b61703b77a16862ce91c6cdfad33dcca01c4b83738275"} Dec 03 15:07:36 crc kubenswrapper[4751]: I1203 15:07:36.567608 4751 generic.go:334] "Generic (PLEG): container finished" podID="cc8899a3-1760-4813-8295-dbcfdfb8a113" 
containerID="54b3e86250486a9dee0b61703b77a16862ce91c6cdfad33dcca01c4b83738275" exitCode=0 Dec 03 15:07:36 crc kubenswrapper[4751]: I1203 15:07:36.567676 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66rjf" event={"ID":"cc8899a3-1760-4813-8295-dbcfdfb8a113","Type":"ContainerDied","Data":"54b3e86250486a9dee0b61703b77a16862ce91c6cdfad33dcca01c4b83738275"} Dec 03 15:07:37 crc kubenswrapper[4751]: I1203 15:07:37.580478 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66rjf" event={"ID":"cc8899a3-1760-4813-8295-dbcfdfb8a113","Type":"ContainerStarted","Data":"e8f43fac1d8b07b17f9d596df05c61e24d960807d222bae915cfb0d03179c6c4"} Dec 03 15:07:37 crc kubenswrapper[4751]: I1203 15:07:37.605704 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-66rjf" podStartSLOduration=3.113500105 podStartE2EDuration="6.605679737s" podCreationTimestamp="2025-12-03 15:07:31 +0000 UTC" firstStartedPulling="2025-12-03 15:07:33.521267359 +0000 UTC m=+3260.509622576" lastFinishedPulling="2025-12-03 15:07:37.013446981 +0000 UTC m=+3264.001802208" observedRunningTime="2025-12-03 15:07:37.598434105 +0000 UTC m=+3264.586789332" watchObservedRunningTime="2025-12-03 15:07:37.605679737 +0000 UTC m=+3264.594034954" Dec 03 15:07:42 crc kubenswrapper[4751]: I1203 15:07:42.156267 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-66rjf" Dec 03 15:07:42 crc kubenswrapper[4751]: I1203 15:07:42.158145 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-66rjf" Dec 03 15:07:42 crc kubenswrapper[4751]: I1203 15:07:42.213984 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-66rjf" Dec 03 15:07:42 crc kubenswrapper[4751]: I1203 
15:07:42.314960 4751 scope.go:117] "RemoveContainer" containerID="72aea6609420a53011228084555007c4999bb4a58225ec17805343a049fdc2f3" Dec 03 15:07:42 crc kubenswrapper[4751]: E1203 15:07:42.315324 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:07:42 crc kubenswrapper[4751]: I1203 15:07:42.697241 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-66rjf" Dec 03 15:07:42 crc kubenswrapper[4751]: I1203 15:07:42.764167 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-66rjf"] Dec 03 15:07:44 crc kubenswrapper[4751]: I1203 15:07:44.658449 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-66rjf" podUID="cc8899a3-1760-4813-8295-dbcfdfb8a113" containerName="registry-server" containerID="cri-o://e8f43fac1d8b07b17f9d596df05c61e24d960807d222bae915cfb0d03179c6c4" gracePeriod=2 Dec 03 15:07:45 crc kubenswrapper[4751]: I1203 15:07:45.278042 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-66rjf" Dec 03 15:07:45 crc kubenswrapper[4751]: I1203 15:07:45.349186 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc8899a3-1760-4813-8295-dbcfdfb8a113-utilities\") pod \"cc8899a3-1760-4813-8295-dbcfdfb8a113\" (UID: \"cc8899a3-1760-4813-8295-dbcfdfb8a113\") " Dec 03 15:07:45 crc kubenswrapper[4751]: I1203 15:07:45.349472 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckbtj\" (UniqueName: \"kubernetes.io/projected/cc8899a3-1760-4813-8295-dbcfdfb8a113-kube-api-access-ckbtj\") pod \"cc8899a3-1760-4813-8295-dbcfdfb8a113\" (UID: \"cc8899a3-1760-4813-8295-dbcfdfb8a113\") " Dec 03 15:07:45 crc kubenswrapper[4751]: I1203 15:07:45.349581 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc8899a3-1760-4813-8295-dbcfdfb8a113-catalog-content\") pod \"cc8899a3-1760-4813-8295-dbcfdfb8a113\" (UID: \"cc8899a3-1760-4813-8295-dbcfdfb8a113\") " Dec 03 15:07:45 crc kubenswrapper[4751]: I1203 15:07:45.350979 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc8899a3-1760-4813-8295-dbcfdfb8a113-utilities" (OuterVolumeSpecName: "utilities") pod "cc8899a3-1760-4813-8295-dbcfdfb8a113" (UID: "cc8899a3-1760-4813-8295-dbcfdfb8a113"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:07:45 crc kubenswrapper[4751]: I1203 15:07:45.366117 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc8899a3-1760-4813-8295-dbcfdfb8a113-kube-api-access-ckbtj" (OuterVolumeSpecName: "kube-api-access-ckbtj") pod "cc8899a3-1760-4813-8295-dbcfdfb8a113" (UID: "cc8899a3-1760-4813-8295-dbcfdfb8a113"). InnerVolumeSpecName "kube-api-access-ckbtj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:07:45 crc kubenswrapper[4751]: I1203 15:07:45.411093 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc8899a3-1760-4813-8295-dbcfdfb8a113-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc8899a3-1760-4813-8295-dbcfdfb8a113" (UID: "cc8899a3-1760-4813-8295-dbcfdfb8a113"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:07:45 crc kubenswrapper[4751]: I1203 15:07:45.452305 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc8899a3-1760-4813-8295-dbcfdfb8a113-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 15:07:45 crc kubenswrapper[4751]: I1203 15:07:45.452350 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc8899a3-1760-4813-8295-dbcfdfb8a113-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 15:07:45 crc kubenswrapper[4751]: I1203 15:07:45.452360 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckbtj\" (UniqueName: \"kubernetes.io/projected/cc8899a3-1760-4813-8295-dbcfdfb8a113-kube-api-access-ckbtj\") on node \"crc\" DevicePath \"\"" Dec 03 15:07:45 crc kubenswrapper[4751]: I1203 15:07:45.668236 4751 generic.go:334] "Generic (PLEG): container finished" podID="cc8899a3-1760-4813-8295-dbcfdfb8a113" containerID="e8f43fac1d8b07b17f9d596df05c61e24d960807d222bae915cfb0d03179c6c4" exitCode=0 Dec 03 15:07:45 crc kubenswrapper[4751]: I1203 15:07:45.668273 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66rjf" event={"ID":"cc8899a3-1760-4813-8295-dbcfdfb8a113","Type":"ContainerDied","Data":"e8f43fac1d8b07b17f9d596df05c61e24d960807d222bae915cfb0d03179c6c4"} Dec 03 15:07:45 crc kubenswrapper[4751]: I1203 15:07:45.668285 4751 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-66rjf" Dec 03 15:07:45 crc kubenswrapper[4751]: I1203 15:07:45.668298 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66rjf" event={"ID":"cc8899a3-1760-4813-8295-dbcfdfb8a113","Type":"ContainerDied","Data":"c4eb521c5b77ff6d961f4b23aac6ddd4701f0c3e1dffddf0952eadacd486e0c7"} Dec 03 15:07:45 crc kubenswrapper[4751]: I1203 15:07:45.668345 4751 scope.go:117] "RemoveContainer" containerID="e8f43fac1d8b07b17f9d596df05c61e24d960807d222bae915cfb0d03179c6c4" Dec 03 15:07:45 crc kubenswrapper[4751]: I1203 15:07:45.697932 4751 scope.go:117] "RemoveContainer" containerID="54b3e86250486a9dee0b61703b77a16862ce91c6cdfad33dcca01c4b83738275" Dec 03 15:07:45 crc kubenswrapper[4751]: I1203 15:07:45.701032 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-66rjf"] Dec 03 15:07:45 crc kubenswrapper[4751]: I1203 15:07:45.710249 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-66rjf"] Dec 03 15:07:45 crc kubenswrapper[4751]: I1203 15:07:45.722166 4751 scope.go:117] "RemoveContainer" containerID="78844ab5a022ee84286850775ed94985a3ccf0ae9027bfecdb12aa1ebed5045a" Dec 03 15:07:45 crc kubenswrapper[4751]: I1203 15:07:45.775061 4751 scope.go:117] "RemoveContainer" containerID="e8f43fac1d8b07b17f9d596df05c61e24d960807d222bae915cfb0d03179c6c4" Dec 03 15:07:45 crc kubenswrapper[4751]: E1203 15:07:45.775504 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8f43fac1d8b07b17f9d596df05c61e24d960807d222bae915cfb0d03179c6c4\": container with ID starting with e8f43fac1d8b07b17f9d596df05c61e24d960807d222bae915cfb0d03179c6c4 not found: ID does not exist" containerID="e8f43fac1d8b07b17f9d596df05c61e24d960807d222bae915cfb0d03179c6c4" Dec 03 15:07:45 crc kubenswrapper[4751]: I1203 15:07:45.775550 
4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8f43fac1d8b07b17f9d596df05c61e24d960807d222bae915cfb0d03179c6c4"} err="failed to get container status \"e8f43fac1d8b07b17f9d596df05c61e24d960807d222bae915cfb0d03179c6c4\": rpc error: code = NotFound desc = could not find container \"e8f43fac1d8b07b17f9d596df05c61e24d960807d222bae915cfb0d03179c6c4\": container with ID starting with e8f43fac1d8b07b17f9d596df05c61e24d960807d222bae915cfb0d03179c6c4 not found: ID does not exist" Dec 03 15:07:45 crc kubenswrapper[4751]: I1203 15:07:45.775587 4751 scope.go:117] "RemoveContainer" containerID="54b3e86250486a9dee0b61703b77a16862ce91c6cdfad33dcca01c4b83738275" Dec 03 15:07:45 crc kubenswrapper[4751]: E1203 15:07:45.775892 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54b3e86250486a9dee0b61703b77a16862ce91c6cdfad33dcca01c4b83738275\": container with ID starting with 54b3e86250486a9dee0b61703b77a16862ce91c6cdfad33dcca01c4b83738275 not found: ID does not exist" containerID="54b3e86250486a9dee0b61703b77a16862ce91c6cdfad33dcca01c4b83738275" Dec 03 15:07:45 crc kubenswrapper[4751]: I1203 15:07:45.775924 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54b3e86250486a9dee0b61703b77a16862ce91c6cdfad33dcca01c4b83738275"} err="failed to get container status \"54b3e86250486a9dee0b61703b77a16862ce91c6cdfad33dcca01c4b83738275\": rpc error: code = NotFound desc = could not find container \"54b3e86250486a9dee0b61703b77a16862ce91c6cdfad33dcca01c4b83738275\": container with ID starting with 54b3e86250486a9dee0b61703b77a16862ce91c6cdfad33dcca01c4b83738275 not found: ID does not exist" Dec 03 15:07:45 crc kubenswrapper[4751]: I1203 15:07:45.775952 4751 scope.go:117] "RemoveContainer" containerID="78844ab5a022ee84286850775ed94985a3ccf0ae9027bfecdb12aa1ebed5045a" Dec 03 15:07:45 crc kubenswrapper[4751]: E1203 
15:07:45.776198 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78844ab5a022ee84286850775ed94985a3ccf0ae9027bfecdb12aa1ebed5045a\": container with ID starting with 78844ab5a022ee84286850775ed94985a3ccf0ae9027bfecdb12aa1ebed5045a not found: ID does not exist" containerID="78844ab5a022ee84286850775ed94985a3ccf0ae9027bfecdb12aa1ebed5045a" Dec 03 15:07:45 crc kubenswrapper[4751]: I1203 15:07:45.776219 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78844ab5a022ee84286850775ed94985a3ccf0ae9027bfecdb12aa1ebed5045a"} err="failed to get container status \"78844ab5a022ee84286850775ed94985a3ccf0ae9027bfecdb12aa1ebed5045a\": rpc error: code = NotFound desc = could not find container \"78844ab5a022ee84286850775ed94985a3ccf0ae9027bfecdb12aa1ebed5045a\": container with ID starting with 78844ab5a022ee84286850775ed94985a3ccf0ae9027bfecdb12aa1ebed5045a not found: ID does not exist" Dec 03 15:07:47 crc kubenswrapper[4751]: I1203 15:07:47.326012 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc8899a3-1760-4813-8295-dbcfdfb8a113" path="/var/lib/kubelet/pods/cc8899a3-1760-4813-8295-dbcfdfb8a113/volumes" Dec 03 15:07:53 crc kubenswrapper[4751]: I1203 15:07:53.321410 4751 scope.go:117] "RemoveContainer" containerID="72aea6609420a53011228084555007c4999bb4a58225ec17805343a049fdc2f3" Dec 03 15:07:53 crc kubenswrapper[4751]: E1203 15:07:53.324810 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:08:07 crc kubenswrapper[4751]: I1203 15:08:07.313905 
4751 scope.go:117] "RemoveContainer" containerID="72aea6609420a53011228084555007c4999bb4a58225ec17805343a049fdc2f3" Dec 03 15:08:07 crc kubenswrapper[4751]: E1203 15:08:07.314611 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:08:20 crc kubenswrapper[4751]: I1203 15:08:20.314214 4751 scope.go:117] "RemoveContainer" containerID="72aea6609420a53011228084555007c4999bb4a58225ec17805343a049fdc2f3" Dec 03 15:08:20 crc kubenswrapper[4751]: E1203 15:08:20.314917 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:08:33 crc kubenswrapper[4751]: I1203 15:08:33.320197 4751 scope.go:117] "RemoveContainer" containerID="72aea6609420a53011228084555007c4999bb4a58225ec17805343a049fdc2f3" Dec 03 15:08:33 crc kubenswrapper[4751]: E1203 15:08:33.320966 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:08:44 crc kubenswrapper[4751]: I1203 
15:08:44.313905 4751 scope.go:117] "RemoveContainer" containerID="72aea6609420a53011228084555007c4999bb4a58225ec17805343a049fdc2f3" Dec 03 15:08:44 crc kubenswrapper[4751]: E1203 15:08:44.314575 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:08:52 crc kubenswrapper[4751]: I1203 15:08:52.778598 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-grmct"] Dec 03 15:08:52 crc kubenswrapper[4751]: E1203 15:08:52.779615 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc8899a3-1760-4813-8295-dbcfdfb8a113" containerName="extract-content" Dec 03 15:08:52 crc kubenswrapper[4751]: I1203 15:08:52.779633 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc8899a3-1760-4813-8295-dbcfdfb8a113" containerName="extract-content" Dec 03 15:08:52 crc kubenswrapper[4751]: E1203 15:08:52.779647 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc8899a3-1760-4813-8295-dbcfdfb8a113" containerName="registry-server" Dec 03 15:08:52 crc kubenswrapper[4751]: I1203 15:08:52.779653 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc8899a3-1760-4813-8295-dbcfdfb8a113" containerName="registry-server" Dec 03 15:08:52 crc kubenswrapper[4751]: E1203 15:08:52.779673 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc8899a3-1760-4813-8295-dbcfdfb8a113" containerName="extract-utilities" Dec 03 15:08:52 crc kubenswrapper[4751]: I1203 15:08:52.779680 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc8899a3-1760-4813-8295-dbcfdfb8a113" containerName="extract-utilities" Dec 03 
15:08:52 crc kubenswrapper[4751]: I1203 15:08:52.779906 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc8899a3-1760-4813-8295-dbcfdfb8a113" containerName="registry-server" Dec 03 15:08:52 crc kubenswrapper[4751]: I1203 15:08:52.787493 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-grmct" Dec 03 15:08:52 crc kubenswrapper[4751]: I1203 15:08:52.802129 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-grmct"] Dec 03 15:08:52 crc kubenswrapper[4751]: I1203 15:08:52.940684 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d559m\" (UniqueName: \"kubernetes.io/projected/012cc9a0-f727-4f96-ac42-6b1f8c69236a-kube-api-access-d559m\") pod \"community-operators-grmct\" (UID: \"012cc9a0-f727-4f96-ac42-6b1f8c69236a\") " pod="openshift-marketplace/community-operators-grmct" Dec 03 15:08:52 crc kubenswrapper[4751]: I1203 15:08:52.940737 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/012cc9a0-f727-4f96-ac42-6b1f8c69236a-utilities\") pod \"community-operators-grmct\" (UID: \"012cc9a0-f727-4f96-ac42-6b1f8c69236a\") " pod="openshift-marketplace/community-operators-grmct" Dec 03 15:08:52 crc kubenswrapper[4751]: I1203 15:08:52.940897 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/012cc9a0-f727-4f96-ac42-6b1f8c69236a-catalog-content\") pod \"community-operators-grmct\" (UID: \"012cc9a0-f727-4f96-ac42-6b1f8c69236a\") " pod="openshift-marketplace/community-operators-grmct" Dec 03 15:08:53 crc kubenswrapper[4751]: I1203 15:08:53.042899 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/012cc9a0-f727-4f96-ac42-6b1f8c69236a-catalog-content\") pod \"community-operators-grmct\" (UID: \"012cc9a0-f727-4f96-ac42-6b1f8c69236a\") " pod="openshift-marketplace/community-operators-grmct" Dec 03 15:08:53 crc kubenswrapper[4751]: I1203 15:08:53.043149 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d559m\" (UniqueName: \"kubernetes.io/projected/012cc9a0-f727-4f96-ac42-6b1f8c69236a-kube-api-access-d559m\") pod \"community-operators-grmct\" (UID: \"012cc9a0-f727-4f96-ac42-6b1f8c69236a\") " pod="openshift-marketplace/community-operators-grmct" Dec 03 15:08:53 crc kubenswrapper[4751]: I1203 15:08:53.043183 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/012cc9a0-f727-4f96-ac42-6b1f8c69236a-utilities\") pod \"community-operators-grmct\" (UID: \"012cc9a0-f727-4f96-ac42-6b1f8c69236a\") " pod="openshift-marketplace/community-operators-grmct" Dec 03 15:08:53 crc kubenswrapper[4751]: I1203 15:08:53.043409 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/012cc9a0-f727-4f96-ac42-6b1f8c69236a-catalog-content\") pod \"community-operators-grmct\" (UID: \"012cc9a0-f727-4f96-ac42-6b1f8c69236a\") " pod="openshift-marketplace/community-operators-grmct" Dec 03 15:08:53 crc kubenswrapper[4751]: I1203 15:08:53.043632 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/012cc9a0-f727-4f96-ac42-6b1f8c69236a-utilities\") pod \"community-operators-grmct\" (UID: \"012cc9a0-f727-4f96-ac42-6b1f8c69236a\") " pod="openshift-marketplace/community-operators-grmct" Dec 03 15:08:53 crc kubenswrapper[4751]: I1203 15:08:53.063071 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d559m\" (UniqueName: 
\"kubernetes.io/projected/012cc9a0-f727-4f96-ac42-6b1f8c69236a-kube-api-access-d559m\") pod \"community-operators-grmct\" (UID: \"012cc9a0-f727-4f96-ac42-6b1f8c69236a\") " pod="openshift-marketplace/community-operators-grmct" Dec 03 15:08:53 crc kubenswrapper[4751]: I1203 15:08:53.113090 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-grmct" Dec 03 15:08:53 crc kubenswrapper[4751]: I1203 15:08:53.691606 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-grmct"] Dec 03 15:08:54 crc kubenswrapper[4751]: I1203 15:08:54.393057 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-grmct" event={"ID":"012cc9a0-f727-4f96-ac42-6b1f8c69236a","Type":"ContainerStarted","Data":"7ebbc4a13e106eac890e712c870da9f40d23cb41b8de9a11b8177831ab1657e2"} Dec 03 15:08:55 crc kubenswrapper[4751]: I1203 15:08:55.314168 4751 scope.go:117] "RemoveContainer" containerID="72aea6609420a53011228084555007c4999bb4a58225ec17805343a049fdc2f3" Dec 03 15:08:55 crc kubenswrapper[4751]: E1203 15:08:55.314722 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:08:55 crc kubenswrapper[4751]: I1203 15:08:55.403961 4751 generic.go:334] "Generic (PLEG): container finished" podID="012cc9a0-f727-4f96-ac42-6b1f8c69236a" containerID="34eabaabd3071751fde1bc9f8e4fd39b37b06a2ff03e64b9793f54eddf5ce232" exitCode=0 Dec 03 15:08:55 crc kubenswrapper[4751]: I1203 15:08:55.404029 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-grmct" event={"ID":"012cc9a0-f727-4f96-ac42-6b1f8c69236a","Type":"ContainerDied","Data":"34eabaabd3071751fde1bc9f8e4fd39b37b06a2ff03e64b9793f54eddf5ce232"} Dec 03 15:08:57 crc kubenswrapper[4751]: I1203 15:08:57.425236 4751 generic.go:334] "Generic (PLEG): container finished" podID="012cc9a0-f727-4f96-ac42-6b1f8c69236a" containerID="2dd4a275cc76f50775338d8710573ef6cd111ffb70911e56cf591622c2e42c0a" exitCode=0 Dec 03 15:08:57 crc kubenswrapper[4751]: I1203 15:08:57.425310 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-grmct" event={"ID":"012cc9a0-f727-4f96-ac42-6b1f8c69236a","Type":"ContainerDied","Data":"2dd4a275cc76f50775338d8710573ef6cd111ffb70911e56cf591622c2e42c0a"} Dec 03 15:08:58 crc kubenswrapper[4751]: I1203 15:08:58.438884 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-grmct" event={"ID":"012cc9a0-f727-4f96-ac42-6b1f8c69236a","Type":"ContainerStarted","Data":"9f77aa8786e769804f739bd3883dc8c7e9440e71dd36253fa972987911c78f30"} Dec 03 15:08:58 crc kubenswrapper[4751]: I1203 15:08:58.460021 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-grmct" podStartSLOduration=3.972583687 podStartE2EDuration="6.460001589s" podCreationTimestamp="2025-12-03 15:08:52 +0000 UTC" firstStartedPulling="2025-12-03 15:08:55.406014343 +0000 UTC m=+3342.394369550" lastFinishedPulling="2025-12-03 15:08:57.893432235 +0000 UTC m=+3344.881787452" observedRunningTime="2025-12-03 15:08:58.454278837 +0000 UTC m=+3345.442634054" watchObservedRunningTime="2025-12-03 15:08:58.460001589 +0000 UTC m=+3345.448356806" Dec 03 15:09:03 crc kubenswrapper[4751]: I1203 15:09:03.114319 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-grmct" Dec 03 15:09:03 crc kubenswrapper[4751]: I1203 15:09:03.115318 4751 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-grmct" Dec 03 15:09:03 crc kubenswrapper[4751]: I1203 15:09:03.164360 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-grmct" Dec 03 15:09:03 crc kubenswrapper[4751]: I1203 15:09:03.536178 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-grmct" Dec 03 15:09:03 crc kubenswrapper[4751]: I1203 15:09:03.583245 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-grmct"] Dec 03 15:09:05 crc kubenswrapper[4751]: I1203 15:09:05.513034 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-grmct" podUID="012cc9a0-f727-4f96-ac42-6b1f8c69236a" containerName="registry-server" containerID="cri-o://9f77aa8786e769804f739bd3883dc8c7e9440e71dd36253fa972987911c78f30" gracePeriod=2 Dec 03 15:09:09 crc kubenswrapper[4751]: I1203 15:09:09.563312 4751 generic.go:334] "Generic (PLEG): container finished" podID="012cc9a0-f727-4f96-ac42-6b1f8c69236a" containerID="9f77aa8786e769804f739bd3883dc8c7e9440e71dd36253fa972987911c78f30" exitCode=0 Dec 03 15:09:09 crc kubenswrapper[4751]: I1203 15:09:09.563370 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-grmct" event={"ID":"012cc9a0-f727-4f96-ac42-6b1f8c69236a","Type":"ContainerDied","Data":"9f77aa8786e769804f739bd3883dc8c7e9440e71dd36253fa972987911c78f30"} Dec 03 15:09:10 crc kubenswrapper[4751]: I1203 15:09:10.313766 4751 scope.go:117] "RemoveContainer" containerID="72aea6609420a53011228084555007c4999bb4a58225ec17805343a049fdc2f3" Dec 03 15:09:10 crc kubenswrapper[4751]: E1203 15:09:10.314359 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:09:11 crc kubenswrapper[4751]: I1203 15:09:11.304179 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-grmct" Dec 03 15:09:11 crc kubenswrapper[4751]: I1203 15:09:11.345790 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d559m\" (UniqueName: \"kubernetes.io/projected/012cc9a0-f727-4f96-ac42-6b1f8c69236a-kube-api-access-d559m\") pod \"012cc9a0-f727-4f96-ac42-6b1f8c69236a\" (UID: \"012cc9a0-f727-4f96-ac42-6b1f8c69236a\") " Dec 03 15:09:11 crc kubenswrapper[4751]: I1203 15:09:11.345858 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/012cc9a0-f727-4f96-ac42-6b1f8c69236a-catalog-content\") pod \"012cc9a0-f727-4f96-ac42-6b1f8c69236a\" (UID: \"012cc9a0-f727-4f96-ac42-6b1f8c69236a\") " Dec 03 15:09:11 crc kubenswrapper[4751]: I1203 15:09:11.345939 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/012cc9a0-f727-4f96-ac42-6b1f8c69236a-utilities\") pod \"012cc9a0-f727-4f96-ac42-6b1f8c69236a\" (UID: \"012cc9a0-f727-4f96-ac42-6b1f8c69236a\") " Dec 03 15:09:11 crc kubenswrapper[4751]: I1203 15:09:11.346983 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/012cc9a0-f727-4f96-ac42-6b1f8c69236a-utilities" (OuterVolumeSpecName: "utilities") pod "012cc9a0-f727-4f96-ac42-6b1f8c69236a" (UID: "012cc9a0-f727-4f96-ac42-6b1f8c69236a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:09:11 crc kubenswrapper[4751]: I1203 15:09:11.355582 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/012cc9a0-f727-4f96-ac42-6b1f8c69236a-kube-api-access-d559m" (OuterVolumeSpecName: "kube-api-access-d559m") pod "012cc9a0-f727-4f96-ac42-6b1f8c69236a" (UID: "012cc9a0-f727-4f96-ac42-6b1f8c69236a"). InnerVolumeSpecName "kube-api-access-d559m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:09:11 crc kubenswrapper[4751]: I1203 15:09:11.404155 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/012cc9a0-f727-4f96-ac42-6b1f8c69236a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "012cc9a0-f727-4f96-ac42-6b1f8c69236a" (UID: "012cc9a0-f727-4f96-ac42-6b1f8c69236a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:09:11 crc kubenswrapper[4751]: I1203 15:09:11.449061 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d559m\" (UniqueName: \"kubernetes.io/projected/012cc9a0-f727-4f96-ac42-6b1f8c69236a-kube-api-access-d559m\") on node \"crc\" DevicePath \"\"" Dec 03 15:09:11 crc kubenswrapper[4751]: I1203 15:09:11.449128 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/012cc9a0-f727-4f96-ac42-6b1f8c69236a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 15:09:11 crc kubenswrapper[4751]: I1203 15:09:11.449145 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/012cc9a0-f727-4f96-ac42-6b1f8c69236a-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 15:09:11 crc kubenswrapper[4751]: I1203 15:09:11.581452 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-grmct" 
event={"ID":"012cc9a0-f727-4f96-ac42-6b1f8c69236a","Type":"ContainerDied","Data":"7ebbc4a13e106eac890e712c870da9f40d23cb41b8de9a11b8177831ab1657e2"} Dec 03 15:09:11 crc kubenswrapper[4751]: I1203 15:09:11.581512 4751 scope.go:117] "RemoveContainer" containerID="9f77aa8786e769804f739bd3883dc8c7e9440e71dd36253fa972987911c78f30" Dec 03 15:09:11 crc kubenswrapper[4751]: I1203 15:09:11.581555 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-grmct" Dec 03 15:09:11 crc kubenswrapper[4751]: I1203 15:09:11.608907 4751 scope.go:117] "RemoveContainer" containerID="2dd4a275cc76f50775338d8710573ef6cd111ffb70911e56cf591622c2e42c0a" Dec 03 15:09:11 crc kubenswrapper[4751]: I1203 15:09:11.612991 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-grmct"] Dec 03 15:09:11 crc kubenswrapper[4751]: I1203 15:09:11.624081 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-grmct"] Dec 03 15:09:11 crc kubenswrapper[4751]: I1203 15:09:11.641408 4751 scope.go:117] "RemoveContainer" containerID="34eabaabd3071751fde1bc9f8e4fd39b37b06a2ff03e64b9793f54eddf5ce232" Dec 03 15:09:13 crc kubenswrapper[4751]: I1203 15:09:13.335622 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="012cc9a0-f727-4f96-ac42-6b1f8c69236a" path="/var/lib/kubelet/pods/012cc9a0-f727-4f96-ac42-6b1f8c69236a/volumes" Dec 03 15:09:21 crc kubenswrapper[4751]: I1203 15:09:21.314590 4751 scope.go:117] "RemoveContainer" containerID="72aea6609420a53011228084555007c4999bb4a58225ec17805343a049fdc2f3" Dec 03 15:09:21 crc kubenswrapper[4751]: E1203 15:09:21.315656 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:09:25 crc kubenswrapper[4751]: I1203 15:09:25.562014 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-xg9ml" podUID="e2d4448e-9181-494b-bec0-12da338b184d" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.119:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 15:09:25 crc kubenswrapper[4751]: I1203 15:09:25.750982 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="3dc63449-cac9-48bc-abb7-3ff350a408cf" containerName="galera" probeResult="failure" output="command timed out" Dec 03 15:09:35 crc kubenswrapper[4751]: I1203 15:09:35.086318 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ttmtj"] Dec 03 15:09:35 crc kubenswrapper[4751]: E1203 15:09:35.087650 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="012cc9a0-f727-4f96-ac42-6b1f8c69236a" containerName="registry-server" Dec 03 15:09:35 crc kubenswrapper[4751]: I1203 15:09:35.087667 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="012cc9a0-f727-4f96-ac42-6b1f8c69236a" containerName="registry-server" Dec 03 15:09:35 crc kubenswrapper[4751]: E1203 15:09:35.087692 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="012cc9a0-f727-4f96-ac42-6b1f8c69236a" containerName="extract-utilities" Dec 03 15:09:35 crc kubenswrapper[4751]: I1203 15:09:35.087701 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="012cc9a0-f727-4f96-ac42-6b1f8c69236a" containerName="extract-utilities" Dec 03 15:09:35 crc kubenswrapper[4751]: E1203 15:09:35.087729 4751 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="012cc9a0-f727-4f96-ac42-6b1f8c69236a" containerName="extract-content" Dec 03 15:09:35 crc kubenswrapper[4751]: I1203 15:09:35.087737 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="012cc9a0-f727-4f96-ac42-6b1f8c69236a" containerName="extract-content" Dec 03 15:09:35 crc kubenswrapper[4751]: I1203 15:09:35.088075 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="012cc9a0-f727-4f96-ac42-6b1f8c69236a" containerName="registry-server" Dec 03 15:09:35 crc kubenswrapper[4751]: I1203 15:09:35.090158 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ttmtj" Dec 03 15:09:35 crc kubenswrapper[4751]: I1203 15:09:35.111972 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ttmtj"] Dec 03 15:09:35 crc kubenswrapper[4751]: I1203 15:09:35.295974 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp2zt\" (UniqueName: \"kubernetes.io/projected/12ce0f9d-eae7-49ca-8289-4658a1fc2c0b-kube-api-access-mp2zt\") pod \"redhat-operators-ttmtj\" (UID: \"12ce0f9d-eae7-49ca-8289-4658a1fc2c0b\") " pod="openshift-marketplace/redhat-operators-ttmtj" Dec 03 15:09:35 crc kubenswrapper[4751]: I1203 15:09:35.296108 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12ce0f9d-eae7-49ca-8289-4658a1fc2c0b-utilities\") pod \"redhat-operators-ttmtj\" (UID: \"12ce0f9d-eae7-49ca-8289-4658a1fc2c0b\") " pod="openshift-marketplace/redhat-operators-ttmtj" Dec 03 15:09:35 crc kubenswrapper[4751]: I1203 15:09:35.296296 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12ce0f9d-eae7-49ca-8289-4658a1fc2c0b-catalog-content\") pod 
\"redhat-operators-ttmtj\" (UID: \"12ce0f9d-eae7-49ca-8289-4658a1fc2c0b\") " pod="openshift-marketplace/redhat-operators-ttmtj" Dec 03 15:09:35 crc kubenswrapper[4751]: I1203 15:09:35.398700 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12ce0f9d-eae7-49ca-8289-4658a1fc2c0b-catalog-content\") pod \"redhat-operators-ttmtj\" (UID: \"12ce0f9d-eae7-49ca-8289-4658a1fc2c0b\") " pod="openshift-marketplace/redhat-operators-ttmtj" Dec 03 15:09:35 crc kubenswrapper[4751]: I1203 15:09:35.399103 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mp2zt\" (UniqueName: \"kubernetes.io/projected/12ce0f9d-eae7-49ca-8289-4658a1fc2c0b-kube-api-access-mp2zt\") pod \"redhat-operators-ttmtj\" (UID: \"12ce0f9d-eae7-49ca-8289-4658a1fc2c0b\") " pod="openshift-marketplace/redhat-operators-ttmtj" Dec 03 15:09:35 crc kubenswrapper[4751]: I1203 15:09:35.399237 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12ce0f9d-eae7-49ca-8289-4658a1fc2c0b-catalog-content\") pod \"redhat-operators-ttmtj\" (UID: \"12ce0f9d-eae7-49ca-8289-4658a1fc2c0b\") " pod="openshift-marketplace/redhat-operators-ttmtj" Dec 03 15:09:35 crc kubenswrapper[4751]: I1203 15:09:35.399256 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12ce0f9d-eae7-49ca-8289-4658a1fc2c0b-utilities\") pod \"redhat-operators-ttmtj\" (UID: \"12ce0f9d-eae7-49ca-8289-4658a1fc2c0b\") " pod="openshift-marketplace/redhat-operators-ttmtj" Dec 03 15:09:35 crc kubenswrapper[4751]: I1203 15:09:35.399638 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12ce0f9d-eae7-49ca-8289-4658a1fc2c0b-utilities\") pod \"redhat-operators-ttmtj\" (UID: 
\"12ce0f9d-eae7-49ca-8289-4658a1fc2c0b\") " pod="openshift-marketplace/redhat-operators-ttmtj" Dec 03 15:09:35 crc kubenswrapper[4751]: I1203 15:09:35.428174 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp2zt\" (UniqueName: \"kubernetes.io/projected/12ce0f9d-eae7-49ca-8289-4658a1fc2c0b-kube-api-access-mp2zt\") pod \"redhat-operators-ttmtj\" (UID: \"12ce0f9d-eae7-49ca-8289-4658a1fc2c0b\") " pod="openshift-marketplace/redhat-operators-ttmtj" Dec 03 15:09:35 crc kubenswrapper[4751]: I1203 15:09:35.715484 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ttmtj" Dec 03 15:09:36 crc kubenswrapper[4751]: I1203 15:09:36.314448 4751 scope.go:117] "RemoveContainer" containerID="72aea6609420a53011228084555007c4999bb4a58225ec17805343a049fdc2f3" Dec 03 15:09:36 crc kubenswrapper[4751]: E1203 15:09:36.315060 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:09:36 crc kubenswrapper[4751]: I1203 15:09:36.617512 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ttmtj"] Dec 03 15:09:36 crc kubenswrapper[4751]: I1203 15:09:36.875342 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttmtj" event={"ID":"12ce0f9d-eae7-49ca-8289-4658a1fc2c0b","Type":"ContainerStarted","Data":"bb258b31f78bbdb368739fbdadd91b1dbb18590855f9ea424d76ae8b3c8d7c21"} Dec 03 15:09:38 crc kubenswrapper[4751]: I1203 15:09:38.898124 4751 generic.go:334] "Generic (PLEG): container finished" 
podID="12ce0f9d-eae7-49ca-8289-4658a1fc2c0b" containerID="f0f72d1343d5eee84b9e2e6d7c5e4f764808828c6f1db735ce8cb662e91769da" exitCode=0 Dec 03 15:09:38 crc kubenswrapper[4751]: I1203 15:09:38.898207 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttmtj" event={"ID":"12ce0f9d-eae7-49ca-8289-4658a1fc2c0b","Type":"ContainerDied","Data":"f0f72d1343d5eee84b9e2e6d7c5e4f764808828c6f1db735ce8cb662e91769da"} Dec 03 15:09:40 crc kubenswrapper[4751]: I1203 15:09:40.918371 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttmtj" event={"ID":"12ce0f9d-eae7-49ca-8289-4658a1fc2c0b","Type":"ContainerStarted","Data":"79802ef28cfb47e82fee21931e312158e5944413cabcc3728d1a66ede2e49839"} Dec 03 15:09:49 crc kubenswrapper[4751]: I1203 15:09:49.001425 4751 generic.go:334] "Generic (PLEG): container finished" podID="12ce0f9d-eae7-49ca-8289-4658a1fc2c0b" containerID="79802ef28cfb47e82fee21931e312158e5944413cabcc3728d1a66ede2e49839" exitCode=0 Dec 03 15:09:49 crc kubenswrapper[4751]: I1203 15:09:49.001657 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttmtj" event={"ID":"12ce0f9d-eae7-49ca-8289-4658a1fc2c0b","Type":"ContainerDied","Data":"79802ef28cfb47e82fee21931e312158e5944413cabcc3728d1a66ede2e49839"} Dec 03 15:09:49 crc kubenswrapper[4751]: I1203 15:09:49.314410 4751 scope.go:117] "RemoveContainer" containerID="72aea6609420a53011228084555007c4999bb4a58225ec17805343a049fdc2f3" Dec 03 15:09:49 crc kubenswrapper[4751]: E1203 15:09:49.314691 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" 
podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:09:51 crc kubenswrapper[4751]: I1203 15:09:51.025161 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttmtj" event={"ID":"12ce0f9d-eae7-49ca-8289-4658a1fc2c0b","Type":"ContainerStarted","Data":"64b8fbe0e3d556835b636f8739cd0bdc624e6795adc6d1138543d1e3809d052b"} Dec 03 15:09:51 crc kubenswrapper[4751]: I1203 15:09:51.059784 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ttmtj" podStartSLOduration=4.78562395 podStartE2EDuration="16.059761416s" podCreationTimestamp="2025-12-03 15:09:35 +0000 UTC" firstStartedPulling="2025-12-03 15:09:38.904289197 +0000 UTC m=+3385.892644414" lastFinishedPulling="2025-12-03 15:09:50.178426663 +0000 UTC m=+3397.166781880" observedRunningTime="2025-12-03 15:09:51.052473482 +0000 UTC m=+3398.040828719" watchObservedRunningTime="2025-12-03 15:09:51.059761416 +0000 UTC m=+3398.048116633" Dec 03 15:09:55 crc kubenswrapper[4751]: I1203 15:09:55.716008 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ttmtj" Dec 03 15:09:55 crc kubenswrapper[4751]: I1203 15:09:55.716557 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ttmtj" Dec 03 15:09:55 crc kubenswrapper[4751]: I1203 15:09:55.803066 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ttmtj" Dec 03 15:09:56 crc kubenswrapper[4751]: I1203 15:09:56.129114 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ttmtj" Dec 03 15:09:56 crc kubenswrapper[4751]: I1203 15:09:56.870131 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ttmtj"] Dec 03 15:09:58 crc kubenswrapper[4751]: I1203 15:09:58.106214 4751 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ttmtj" podUID="12ce0f9d-eae7-49ca-8289-4658a1fc2c0b" containerName="registry-server" containerID="cri-o://64b8fbe0e3d556835b636f8739cd0bdc624e6795adc6d1138543d1e3809d052b" gracePeriod=2 Dec 03 15:09:59 crc kubenswrapper[4751]: I1203 15:09:59.119342 4751 generic.go:334] "Generic (PLEG): container finished" podID="12ce0f9d-eae7-49ca-8289-4658a1fc2c0b" containerID="64b8fbe0e3d556835b636f8739cd0bdc624e6795adc6d1138543d1e3809d052b" exitCode=0 Dec 03 15:09:59 crc kubenswrapper[4751]: I1203 15:09:59.119434 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttmtj" event={"ID":"12ce0f9d-eae7-49ca-8289-4658a1fc2c0b","Type":"ContainerDied","Data":"64b8fbe0e3d556835b636f8739cd0bdc624e6795adc6d1138543d1e3809d052b"} Dec 03 15:09:59 crc kubenswrapper[4751]: I1203 15:09:59.329069 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ttmtj" Dec 03 15:09:59 crc kubenswrapper[4751]: I1203 15:09:59.487395 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mp2zt\" (UniqueName: \"kubernetes.io/projected/12ce0f9d-eae7-49ca-8289-4658a1fc2c0b-kube-api-access-mp2zt\") pod \"12ce0f9d-eae7-49ca-8289-4658a1fc2c0b\" (UID: \"12ce0f9d-eae7-49ca-8289-4658a1fc2c0b\") " Dec 03 15:09:59 crc kubenswrapper[4751]: I1203 15:09:59.487724 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12ce0f9d-eae7-49ca-8289-4658a1fc2c0b-catalog-content\") pod \"12ce0f9d-eae7-49ca-8289-4658a1fc2c0b\" (UID: \"12ce0f9d-eae7-49ca-8289-4658a1fc2c0b\") " Dec 03 15:09:59 crc kubenswrapper[4751]: I1203 15:09:59.487830 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/12ce0f9d-eae7-49ca-8289-4658a1fc2c0b-utilities\") pod \"12ce0f9d-eae7-49ca-8289-4658a1fc2c0b\" (UID: \"12ce0f9d-eae7-49ca-8289-4658a1fc2c0b\") " Dec 03 15:09:59 crc kubenswrapper[4751]: I1203 15:09:59.488741 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12ce0f9d-eae7-49ca-8289-4658a1fc2c0b-utilities" (OuterVolumeSpecName: "utilities") pod "12ce0f9d-eae7-49ca-8289-4658a1fc2c0b" (UID: "12ce0f9d-eae7-49ca-8289-4658a1fc2c0b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:09:59 crc kubenswrapper[4751]: I1203 15:09:59.489477 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12ce0f9d-eae7-49ca-8289-4658a1fc2c0b-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 15:09:59 crc kubenswrapper[4751]: I1203 15:09:59.496102 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12ce0f9d-eae7-49ca-8289-4658a1fc2c0b-kube-api-access-mp2zt" (OuterVolumeSpecName: "kube-api-access-mp2zt") pod "12ce0f9d-eae7-49ca-8289-4658a1fc2c0b" (UID: "12ce0f9d-eae7-49ca-8289-4658a1fc2c0b"). InnerVolumeSpecName "kube-api-access-mp2zt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:09:59 crc kubenswrapper[4751]: I1203 15:09:59.591686 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mp2zt\" (UniqueName: \"kubernetes.io/projected/12ce0f9d-eae7-49ca-8289-4658a1fc2c0b-kube-api-access-mp2zt\") on node \"crc\" DevicePath \"\"" Dec 03 15:09:59 crc kubenswrapper[4751]: I1203 15:09:59.611433 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12ce0f9d-eae7-49ca-8289-4658a1fc2c0b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "12ce0f9d-eae7-49ca-8289-4658a1fc2c0b" (UID: "12ce0f9d-eae7-49ca-8289-4658a1fc2c0b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:09:59 crc kubenswrapper[4751]: I1203 15:09:59.694172 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12ce0f9d-eae7-49ca-8289-4658a1fc2c0b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 15:10:00 crc kubenswrapper[4751]: I1203 15:10:00.131943 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttmtj" event={"ID":"12ce0f9d-eae7-49ca-8289-4658a1fc2c0b","Type":"ContainerDied","Data":"bb258b31f78bbdb368739fbdadd91b1dbb18590855f9ea424d76ae8b3c8d7c21"} Dec 03 15:10:00 crc kubenswrapper[4751]: I1203 15:10:00.131989 4751 scope.go:117] "RemoveContainer" containerID="64b8fbe0e3d556835b636f8739cd0bdc624e6795adc6d1138543d1e3809d052b" Dec 03 15:10:00 crc kubenswrapper[4751]: I1203 15:10:00.132123 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ttmtj" Dec 03 15:10:00 crc kubenswrapper[4751]: I1203 15:10:00.168130 4751 scope.go:117] "RemoveContainer" containerID="79802ef28cfb47e82fee21931e312158e5944413cabcc3728d1a66ede2e49839" Dec 03 15:10:00 crc kubenswrapper[4751]: I1203 15:10:00.183423 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ttmtj"] Dec 03 15:10:00 crc kubenswrapper[4751]: I1203 15:10:00.192926 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ttmtj"] Dec 03 15:10:00 crc kubenswrapper[4751]: I1203 15:10:00.193491 4751 scope.go:117] "RemoveContainer" containerID="f0f72d1343d5eee84b9e2e6d7c5e4f764808828c6f1db735ce8cb662e91769da" Dec 03 15:10:01 crc kubenswrapper[4751]: I1203 15:10:01.313906 4751 scope.go:117] "RemoveContainer" containerID="72aea6609420a53011228084555007c4999bb4a58225ec17805343a049fdc2f3" Dec 03 15:10:01 crc kubenswrapper[4751]: E1203 15:10:01.314443 4751 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:10:01 crc kubenswrapper[4751]: I1203 15:10:01.327561 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12ce0f9d-eae7-49ca-8289-4658a1fc2c0b" path="/var/lib/kubelet/pods/12ce0f9d-eae7-49ca-8289-4658a1fc2c0b/volumes" Dec 03 15:10:13 crc kubenswrapper[4751]: I1203 15:10:13.321822 4751 scope.go:117] "RemoveContainer" containerID="72aea6609420a53011228084555007c4999bb4a58225ec17805343a049fdc2f3" Dec 03 15:10:13 crc kubenswrapper[4751]: E1203 15:10:13.322837 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:10:25 crc kubenswrapper[4751]: I1203 15:10:25.313609 4751 scope.go:117] "RemoveContainer" containerID="72aea6609420a53011228084555007c4999bb4a58225ec17805343a049fdc2f3" Dec 03 15:10:25 crc kubenswrapper[4751]: E1203 15:10:25.314235 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:10:37 
crc kubenswrapper[4751]: I1203 15:10:37.314351 4751 scope.go:117] "RemoveContainer" containerID="72aea6609420a53011228084555007c4999bb4a58225ec17805343a049fdc2f3" Dec 03 15:10:38 crc kubenswrapper[4751]: I1203 15:10:38.520902 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerStarted","Data":"743940d32519162efebff73b0fdcb4fad05561eaf21e4f59daea96fbcd85c9b2"} Dec 03 15:13:05 crc kubenswrapper[4751]: I1203 15:13:05.820315 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 15:13:05 crc kubenswrapper[4751]: I1203 15:13:05.820802 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 15:13:35 crc kubenswrapper[4751]: I1203 15:13:35.819801 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 15:13:35 crc kubenswrapper[4751]: I1203 15:13:35.820417 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 
03 15:14:05 crc kubenswrapper[4751]: I1203 15:14:05.820675 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 15:14:05 crc kubenswrapper[4751]: I1203 15:14:05.821865 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 15:14:05 crc kubenswrapper[4751]: I1203 15:14:05.821955 4751 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-djf67" Dec 03 15:14:05 crc kubenswrapper[4751]: I1203 15:14:05.822963 4751 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"743940d32519162efebff73b0fdcb4fad05561eaf21e4f59daea96fbcd85c9b2"} pod="openshift-machine-config-operator/machine-config-daemon-djf67" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 15:14:05 crc kubenswrapper[4751]: I1203 15:14:05.823023 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" containerID="cri-o://743940d32519162efebff73b0fdcb4fad05561eaf21e4f59daea96fbcd85c9b2" gracePeriod=600 Dec 03 15:14:06 crc kubenswrapper[4751]: I1203 15:14:06.599980 4751 generic.go:334] "Generic (PLEG): container finished" podID="385620eb-744d-423e-b02b-1274f3075689" 
containerID="743940d32519162efebff73b0fdcb4fad05561eaf21e4f59daea96fbcd85c9b2" exitCode=0 Dec 03 15:14:06 crc kubenswrapper[4751]: I1203 15:14:06.600058 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerDied","Data":"743940d32519162efebff73b0fdcb4fad05561eaf21e4f59daea96fbcd85c9b2"} Dec 03 15:14:06 crc kubenswrapper[4751]: I1203 15:14:06.600270 4751 scope.go:117] "RemoveContainer" containerID="72aea6609420a53011228084555007c4999bb4a58225ec17805343a049fdc2f3" Dec 03 15:14:07 crc kubenswrapper[4751]: I1203 15:14:07.611720 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerStarted","Data":"63ada114a3c0d13c65067734e76c0573932a816604c9a01a3d2887bb9ea85dda"} Dec 03 15:15:00 crc kubenswrapper[4751]: I1203 15:15:00.157197 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412915-zkjpw"] Dec 03 15:15:00 crc kubenswrapper[4751]: E1203 15:15:00.158365 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ce0f9d-eae7-49ca-8289-4658a1fc2c0b" containerName="registry-server" Dec 03 15:15:00 crc kubenswrapper[4751]: I1203 15:15:00.158383 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ce0f9d-eae7-49ca-8289-4658a1fc2c0b" containerName="registry-server" Dec 03 15:15:00 crc kubenswrapper[4751]: E1203 15:15:00.158406 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ce0f9d-eae7-49ca-8289-4658a1fc2c0b" containerName="extract-utilities" Dec 03 15:15:00 crc kubenswrapper[4751]: I1203 15:15:00.158414 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ce0f9d-eae7-49ca-8289-4658a1fc2c0b" containerName="extract-utilities" Dec 03 15:15:00 crc kubenswrapper[4751]: E1203 15:15:00.158433 4751 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ce0f9d-eae7-49ca-8289-4658a1fc2c0b" containerName="extract-content" Dec 03 15:15:00 crc kubenswrapper[4751]: I1203 15:15:00.158441 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ce0f9d-eae7-49ca-8289-4658a1fc2c0b" containerName="extract-content" Dec 03 15:15:00 crc kubenswrapper[4751]: I1203 15:15:00.158753 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="12ce0f9d-eae7-49ca-8289-4658a1fc2c0b" containerName="registry-server" Dec 03 15:15:00 crc kubenswrapper[4751]: I1203 15:15:00.159804 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412915-zkjpw" Dec 03 15:15:00 crc kubenswrapper[4751]: I1203 15:15:00.164720 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 15:15:00 crc kubenswrapper[4751]: I1203 15:15:00.165099 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 15:15:00 crc kubenswrapper[4751]: I1203 15:15:00.194052 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412915-zkjpw"] Dec 03 15:15:00 crc kubenswrapper[4751]: I1203 15:15:00.199011 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2577323f-107d-47bf-9489-c843ec0a0dfa-secret-volume\") pod \"collect-profiles-29412915-zkjpw\" (UID: \"2577323f-107d-47bf-9489-c843ec0a0dfa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412915-zkjpw" Dec 03 15:15:00 crc kubenswrapper[4751]: I1203 15:15:00.199131 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c84fp\" (UniqueName: 
\"kubernetes.io/projected/2577323f-107d-47bf-9489-c843ec0a0dfa-kube-api-access-c84fp\") pod \"collect-profiles-29412915-zkjpw\" (UID: \"2577323f-107d-47bf-9489-c843ec0a0dfa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412915-zkjpw" Dec 03 15:15:00 crc kubenswrapper[4751]: I1203 15:15:00.199162 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2577323f-107d-47bf-9489-c843ec0a0dfa-config-volume\") pod \"collect-profiles-29412915-zkjpw\" (UID: \"2577323f-107d-47bf-9489-c843ec0a0dfa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412915-zkjpw" Dec 03 15:15:00 crc kubenswrapper[4751]: I1203 15:15:00.301647 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2577323f-107d-47bf-9489-c843ec0a0dfa-secret-volume\") pod \"collect-profiles-29412915-zkjpw\" (UID: \"2577323f-107d-47bf-9489-c843ec0a0dfa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412915-zkjpw" Dec 03 15:15:00 crc kubenswrapper[4751]: I1203 15:15:00.301789 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c84fp\" (UniqueName: \"kubernetes.io/projected/2577323f-107d-47bf-9489-c843ec0a0dfa-kube-api-access-c84fp\") pod \"collect-profiles-29412915-zkjpw\" (UID: \"2577323f-107d-47bf-9489-c843ec0a0dfa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412915-zkjpw" Dec 03 15:15:00 crc kubenswrapper[4751]: I1203 15:15:00.301815 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2577323f-107d-47bf-9489-c843ec0a0dfa-config-volume\") pod \"collect-profiles-29412915-zkjpw\" (UID: \"2577323f-107d-47bf-9489-c843ec0a0dfa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412915-zkjpw" Dec 03 15:15:00 crc 
kubenswrapper[4751]: I1203 15:15:00.303099 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2577323f-107d-47bf-9489-c843ec0a0dfa-config-volume\") pod \"collect-profiles-29412915-zkjpw\" (UID: \"2577323f-107d-47bf-9489-c843ec0a0dfa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412915-zkjpw" Dec 03 15:15:00 crc kubenswrapper[4751]: I1203 15:15:00.319356 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2577323f-107d-47bf-9489-c843ec0a0dfa-secret-volume\") pod \"collect-profiles-29412915-zkjpw\" (UID: \"2577323f-107d-47bf-9489-c843ec0a0dfa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412915-zkjpw" Dec 03 15:15:00 crc kubenswrapper[4751]: I1203 15:15:00.319828 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c84fp\" (UniqueName: \"kubernetes.io/projected/2577323f-107d-47bf-9489-c843ec0a0dfa-kube-api-access-c84fp\") pod \"collect-profiles-29412915-zkjpw\" (UID: \"2577323f-107d-47bf-9489-c843ec0a0dfa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412915-zkjpw" Dec 03 15:15:00 crc kubenswrapper[4751]: I1203 15:15:00.485671 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412915-zkjpw" Dec 03 15:15:00 crc kubenswrapper[4751]: I1203 15:15:00.985653 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412915-zkjpw"] Dec 03 15:15:01 crc kubenswrapper[4751]: I1203 15:15:01.195522 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412915-zkjpw" event={"ID":"2577323f-107d-47bf-9489-c843ec0a0dfa","Type":"ContainerStarted","Data":"9fd07721fd6cb9a9e6c0e71a94ce8cf6864adb86b048c85b672bf5487484dfce"} Dec 03 15:15:02 crc kubenswrapper[4751]: I1203 15:15:02.209588 4751 generic.go:334] "Generic (PLEG): container finished" podID="2577323f-107d-47bf-9489-c843ec0a0dfa" containerID="ca3624d2680e0080d870598086e0519d021673aa08fd2b0395816dc383d529c9" exitCode=0 Dec 03 15:15:02 crc kubenswrapper[4751]: I1203 15:15:02.209722 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412915-zkjpw" event={"ID":"2577323f-107d-47bf-9489-c843ec0a0dfa","Type":"ContainerDied","Data":"ca3624d2680e0080d870598086e0519d021673aa08fd2b0395816dc383d529c9"} Dec 03 15:15:03 crc kubenswrapper[4751]: I1203 15:15:03.909251 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412915-zkjpw" Dec 03 15:15:03 crc kubenswrapper[4751]: I1203 15:15:03.980864 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2577323f-107d-47bf-9489-c843ec0a0dfa-config-volume\") pod \"2577323f-107d-47bf-9489-c843ec0a0dfa\" (UID: \"2577323f-107d-47bf-9489-c843ec0a0dfa\") " Dec 03 15:15:03 crc kubenswrapper[4751]: I1203 15:15:03.980956 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c84fp\" (UniqueName: \"kubernetes.io/projected/2577323f-107d-47bf-9489-c843ec0a0dfa-kube-api-access-c84fp\") pod \"2577323f-107d-47bf-9489-c843ec0a0dfa\" (UID: \"2577323f-107d-47bf-9489-c843ec0a0dfa\") " Dec 03 15:15:03 crc kubenswrapper[4751]: I1203 15:15:03.981100 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2577323f-107d-47bf-9489-c843ec0a0dfa-secret-volume\") pod \"2577323f-107d-47bf-9489-c843ec0a0dfa\" (UID: \"2577323f-107d-47bf-9489-c843ec0a0dfa\") " Dec 03 15:15:03 crc kubenswrapper[4751]: I1203 15:15:03.981606 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2577323f-107d-47bf-9489-c843ec0a0dfa-config-volume" (OuterVolumeSpecName: "config-volume") pod "2577323f-107d-47bf-9489-c843ec0a0dfa" (UID: "2577323f-107d-47bf-9489-c843ec0a0dfa"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 15:15:03 crc kubenswrapper[4751]: I1203 15:15:03.981809 4751 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2577323f-107d-47bf-9489-c843ec0a0dfa-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 15:15:03 crc kubenswrapper[4751]: I1203 15:15:03.986749 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2577323f-107d-47bf-9489-c843ec0a0dfa-kube-api-access-c84fp" (OuterVolumeSpecName: "kube-api-access-c84fp") pod "2577323f-107d-47bf-9489-c843ec0a0dfa" (UID: "2577323f-107d-47bf-9489-c843ec0a0dfa"). InnerVolumeSpecName "kube-api-access-c84fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:15:03 crc kubenswrapper[4751]: I1203 15:15:03.986795 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2577323f-107d-47bf-9489-c843ec0a0dfa-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2577323f-107d-47bf-9489-c843ec0a0dfa" (UID: "2577323f-107d-47bf-9489-c843ec0a0dfa"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 15:15:04 crc kubenswrapper[4751]: I1203 15:15:04.083920 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c84fp\" (UniqueName: \"kubernetes.io/projected/2577323f-107d-47bf-9489-c843ec0a0dfa-kube-api-access-c84fp\") on node \"crc\" DevicePath \"\"" Dec 03 15:15:04 crc kubenswrapper[4751]: I1203 15:15:04.083961 4751 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2577323f-107d-47bf-9489-c843ec0a0dfa-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 15:15:04 crc kubenswrapper[4751]: I1203 15:15:04.228586 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412915-zkjpw" event={"ID":"2577323f-107d-47bf-9489-c843ec0a0dfa","Type":"ContainerDied","Data":"9fd07721fd6cb9a9e6c0e71a94ce8cf6864adb86b048c85b672bf5487484dfce"} Dec 03 15:15:04 crc kubenswrapper[4751]: I1203 15:15:04.228627 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fd07721fd6cb9a9e6c0e71a94ce8cf6864adb86b048c85b672bf5487484dfce" Dec 03 15:15:04 crc kubenswrapper[4751]: I1203 15:15:04.228693 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412915-zkjpw" Dec 03 15:15:04 crc kubenswrapper[4751]: I1203 15:15:04.985903 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412870-7n4fm"] Dec 03 15:15:04 crc kubenswrapper[4751]: I1203 15:15:04.996106 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412870-7n4fm"] Dec 03 15:15:05 crc kubenswrapper[4751]: I1203 15:15:05.331576 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77fff4ad-5068-4b2b-a2ad-b700d3dc06e1" path="/var/lib/kubelet/pods/77fff4ad-5068-4b2b-a2ad-b700d3dc06e1/volumes" Dec 03 15:15:29 crc kubenswrapper[4751]: I1203 15:15:29.290303 4751 scope.go:117] "RemoveContainer" containerID="5b2d03adb8143132bb0e6465f53949ad884338332ac035d6d13370cbe72635d9" Dec 03 15:15:55 crc kubenswrapper[4751]: I1203 15:15:55.754043 4751 generic.go:334] "Generic (PLEG): container finished" podID="aa32bbce-059c-46e3-a8d7-f737d93e394e" containerID="ba875bd06540e505d8682a589a909ad6c1dbb922fa7628c68cd5aeb250776596" exitCode=0 Dec 03 15:15:55 crc kubenswrapper[4751]: I1203 15:15:55.754259 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"aa32bbce-059c-46e3-a8d7-f737d93e394e","Type":"ContainerDied","Data":"ba875bd06540e505d8682a589a909ad6c1dbb922fa7628c68cd5aeb250776596"} Dec 03 15:15:57 crc kubenswrapper[4751]: I1203 15:15:57.331809 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 03 15:15:57 crc kubenswrapper[4751]: I1203 15:15:57.396193 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa32bbce-059c-46e3-a8d7-f737d93e394e-ssh-key\") pod \"aa32bbce-059c-46e3-a8d7-f737d93e394e\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " Dec 03 15:15:57 crc kubenswrapper[4751]: I1203 15:15:57.396297 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aa32bbce-059c-46e3-a8d7-f737d93e394e-openstack-config\") pod \"aa32bbce-059c-46e3-a8d7-f737d93e394e\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " Dec 03 15:15:57 crc kubenswrapper[4751]: I1203 15:15:57.396350 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aa32bbce-059c-46e3-a8d7-f737d93e394e-openstack-config-secret\") pod \"aa32bbce-059c-46e3-a8d7-f737d93e394e\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " Dec 03 15:15:57 crc kubenswrapper[4751]: I1203 15:15:57.396480 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"aa32bbce-059c-46e3-a8d7-f737d93e394e\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " Dec 03 15:15:57 crc kubenswrapper[4751]: I1203 15:15:57.396531 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/aa32bbce-059c-46e3-a8d7-f737d93e394e-ca-certs\") pod \"aa32bbce-059c-46e3-a8d7-f737d93e394e\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " Dec 03 15:15:57 crc kubenswrapper[4751]: I1203 15:15:57.396596 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl65m\" (UniqueName: 
\"kubernetes.io/projected/aa32bbce-059c-46e3-a8d7-f737d93e394e-kube-api-access-gl65m\") pod \"aa32bbce-059c-46e3-a8d7-f737d93e394e\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " Dec 03 15:15:57 crc kubenswrapper[4751]: I1203 15:15:57.396648 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/aa32bbce-059c-46e3-a8d7-f737d93e394e-test-operator-ephemeral-workdir\") pod \"aa32bbce-059c-46e3-a8d7-f737d93e394e\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " Dec 03 15:15:57 crc kubenswrapper[4751]: I1203 15:15:57.396699 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/aa32bbce-059c-46e3-a8d7-f737d93e394e-test-operator-ephemeral-temporary\") pod \"aa32bbce-059c-46e3-a8d7-f737d93e394e\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " Dec 03 15:15:57 crc kubenswrapper[4751]: I1203 15:15:57.396805 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa32bbce-059c-46e3-a8d7-f737d93e394e-config-data\") pod \"aa32bbce-059c-46e3-a8d7-f737d93e394e\" (UID: \"aa32bbce-059c-46e3-a8d7-f737d93e394e\") " Dec 03 15:15:57 crc kubenswrapper[4751]: I1203 15:15:57.399393 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa32bbce-059c-46e3-a8d7-f737d93e394e-config-data" (OuterVolumeSpecName: "config-data") pod "aa32bbce-059c-46e3-a8d7-f737d93e394e" (UID: "aa32bbce-059c-46e3-a8d7-f737d93e394e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 15:15:57 crc kubenswrapper[4751]: I1203 15:15:57.400396 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa32bbce-059c-46e3-a8d7-f737d93e394e-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "aa32bbce-059c-46e3-a8d7-f737d93e394e" (UID: "aa32bbce-059c-46e3-a8d7-f737d93e394e"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:15:57 crc kubenswrapper[4751]: I1203 15:15:57.404721 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "test-operator-logs") pod "aa32bbce-059c-46e3-a8d7-f737d93e394e" (UID: "aa32bbce-059c-46e3-a8d7-f737d93e394e"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 15:15:57 crc kubenswrapper[4751]: I1203 15:15:57.405886 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa32bbce-059c-46e3-a8d7-f737d93e394e-kube-api-access-gl65m" (OuterVolumeSpecName: "kube-api-access-gl65m") pod "aa32bbce-059c-46e3-a8d7-f737d93e394e" (UID: "aa32bbce-059c-46e3-a8d7-f737d93e394e"). InnerVolumeSpecName "kube-api-access-gl65m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:15:57 crc kubenswrapper[4751]: I1203 15:15:57.450524 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa32bbce-059c-46e3-a8d7-f737d93e394e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "aa32bbce-059c-46e3-a8d7-f737d93e394e" (UID: "aa32bbce-059c-46e3-a8d7-f737d93e394e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 15:15:57 crc kubenswrapper[4751]: I1203 15:15:57.457986 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa32bbce-059c-46e3-a8d7-f737d93e394e-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "aa32bbce-059c-46e3-a8d7-f737d93e394e" (UID: "aa32bbce-059c-46e3-a8d7-f737d93e394e"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 15:15:57 crc kubenswrapper[4751]: I1203 15:15:57.467546 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa32bbce-059c-46e3-a8d7-f737d93e394e-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "aa32bbce-059c-46e3-a8d7-f737d93e394e" (UID: "aa32bbce-059c-46e3-a8d7-f737d93e394e"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 15:15:57 crc kubenswrapper[4751]: I1203 15:15:57.497148 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa32bbce-059c-46e3-a8d7-f737d93e394e-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "aa32bbce-059c-46e3-a8d7-f737d93e394e" (UID: "aa32bbce-059c-46e3-a8d7-f737d93e394e"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 15:15:57 crc kubenswrapper[4751]: I1203 15:15:57.498930 4751 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/aa32bbce-059c-46e3-a8d7-f737d93e394e-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 03 15:15:57 crc kubenswrapper[4751]: I1203 15:15:57.498955 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa32bbce-059c-46e3-a8d7-f737d93e394e-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 15:15:57 crc kubenswrapper[4751]: I1203 15:15:57.498966 4751 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa32bbce-059c-46e3-a8d7-f737d93e394e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 15:15:57 crc kubenswrapper[4751]: I1203 15:15:57.498974 4751 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aa32bbce-059c-46e3-a8d7-f737d93e394e-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 03 15:15:57 crc kubenswrapper[4751]: I1203 15:15:57.498983 4751 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aa32bbce-059c-46e3-a8d7-f737d93e394e-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 03 15:15:57 crc kubenswrapper[4751]: I1203 15:15:57.499011 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 03 15:15:57 crc kubenswrapper[4751]: I1203 15:15:57.499020 4751 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/aa32bbce-059c-46e3-a8d7-f737d93e394e-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 03 15:15:57 crc kubenswrapper[4751]: I1203 15:15:57.499028 
4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gl65m\" (UniqueName: \"kubernetes.io/projected/aa32bbce-059c-46e3-a8d7-f737d93e394e-kube-api-access-gl65m\") on node \"crc\" DevicePath \"\"" Dec 03 15:15:57 crc kubenswrapper[4751]: I1203 15:15:57.530219 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 03 15:15:57 crc kubenswrapper[4751]: I1203 15:15:57.601135 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 03 15:15:57 crc kubenswrapper[4751]: I1203 15:15:57.785543 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"aa32bbce-059c-46e3-a8d7-f737d93e394e","Type":"ContainerDied","Data":"ad016f93387d4cef09ffab642fa4a34ce3dc40f4d54d3e50d68db4b935093c23"} Dec 03 15:15:57 crc kubenswrapper[4751]: I1203 15:15:57.785601 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad016f93387d4cef09ffab642fa4a34ce3dc40f4d54d3e50d68db4b935093c23" Dec 03 15:15:57 crc kubenswrapper[4751]: I1203 15:15:57.785675 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 03 15:15:57 crc kubenswrapper[4751]: I1203 15:15:57.852553 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa32bbce-059c-46e3-a8d7-f737d93e394e-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "aa32bbce-059c-46e3-a8d7-f737d93e394e" (UID: "aa32bbce-059c-46e3-a8d7-f737d93e394e"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:15:57 crc kubenswrapper[4751]: I1203 15:15:57.907934 4751 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/aa32bbce-059c-46e3-a8d7-f737d93e394e-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 03 15:16:00 crc kubenswrapper[4751]: I1203 15:16:00.106532 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 03 15:16:00 crc kubenswrapper[4751]: E1203 15:16:00.107701 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa32bbce-059c-46e3-a8d7-f737d93e394e" containerName="tempest-tests-tempest-tests-runner" Dec 03 15:16:00 crc kubenswrapper[4751]: I1203 15:16:00.107723 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa32bbce-059c-46e3-a8d7-f737d93e394e" containerName="tempest-tests-tempest-tests-runner" Dec 03 15:16:00 crc kubenswrapper[4751]: E1203 15:16:00.107755 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2577323f-107d-47bf-9489-c843ec0a0dfa" containerName="collect-profiles" Dec 03 15:16:00 crc kubenswrapper[4751]: I1203 15:16:00.107763 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="2577323f-107d-47bf-9489-c843ec0a0dfa" containerName="collect-profiles" Dec 03 15:16:00 crc kubenswrapper[4751]: I1203 15:16:00.108011 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="2577323f-107d-47bf-9489-c843ec0a0dfa" containerName="collect-profiles" Dec 03 15:16:00 crc kubenswrapper[4751]: I1203 15:16:00.108038 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa32bbce-059c-46e3-a8d7-f737d93e394e" containerName="tempest-tests-tempest-tests-runner" Dec 03 15:16:00 crc kubenswrapper[4751]: I1203 15:16:00.109055 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 15:16:00 crc kubenswrapper[4751]: I1203 15:16:00.111543 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-vsb8b" Dec 03 15:16:00 crc kubenswrapper[4751]: I1203 15:16:00.141451 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 03 15:16:00 crc kubenswrapper[4751]: I1203 15:16:00.159763 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"14cd5bf7-18fc-450b-b4b1-f87bf154efeb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 15:16:00 crc kubenswrapper[4751]: I1203 15:16:00.159856 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqjp6\" (UniqueName: \"kubernetes.io/projected/14cd5bf7-18fc-450b-b4b1-f87bf154efeb-kube-api-access-lqjp6\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"14cd5bf7-18fc-450b-b4b1-f87bf154efeb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 15:16:00 crc kubenswrapper[4751]: I1203 15:16:00.261387 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqjp6\" (UniqueName: \"kubernetes.io/projected/14cd5bf7-18fc-450b-b4b1-f87bf154efeb-kube-api-access-lqjp6\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"14cd5bf7-18fc-450b-b4b1-f87bf154efeb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 15:16:00 crc kubenswrapper[4751]: I1203 15:16:00.261634 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"14cd5bf7-18fc-450b-b4b1-f87bf154efeb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 15:16:00 crc kubenswrapper[4751]: I1203 15:16:00.261959 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"14cd5bf7-18fc-450b-b4b1-f87bf154efeb\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 15:16:00 crc kubenswrapper[4751]: I1203 15:16:00.291102 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqjp6\" (UniqueName: \"kubernetes.io/projected/14cd5bf7-18fc-450b-b4b1-f87bf154efeb-kube-api-access-lqjp6\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"14cd5bf7-18fc-450b-b4b1-f87bf154efeb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 15:16:00 crc kubenswrapper[4751]: I1203 15:16:00.293950 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"14cd5bf7-18fc-450b-b4b1-f87bf154efeb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 15:16:00 crc kubenswrapper[4751]: I1203 15:16:00.428936 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 15:16:00 crc kubenswrapper[4751]: I1203 15:16:00.928457 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 03 15:16:00 crc kubenswrapper[4751]: I1203 15:16:00.934381 4751 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 15:16:01 crc kubenswrapper[4751]: I1203 15:16:01.827358 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"14cd5bf7-18fc-450b-b4b1-f87bf154efeb","Type":"ContainerStarted","Data":"9ae0b10ffb3293e75bb0650fc41dbe68f3fafba7eaca57bbef45b3abe7931b36"} Dec 03 15:16:02 crc kubenswrapper[4751]: I1203 15:16:02.838412 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"14cd5bf7-18fc-450b-b4b1-f87bf154efeb","Type":"ContainerStarted","Data":"c0fd3794560c43a4d62db36062a63fcfed8e67f87f1a1cacc618b107576e85f4"} Dec 03 15:16:02 crc kubenswrapper[4751]: I1203 15:16:02.850386 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.767171157 podStartE2EDuration="2.850367661s" podCreationTimestamp="2025-12-03 15:16:00 +0000 UTC" firstStartedPulling="2025-12-03 15:16:00.934083129 +0000 UTC m=+3767.922438336" lastFinishedPulling="2025-12-03 15:16:02.017279623 +0000 UTC m=+3769.005634840" observedRunningTime="2025-12-03 15:16:02.849667932 +0000 UTC m=+3769.838023149" watchObservedRunningTime="2025-12-03 15:16:02.850367661 +0000 UTC m=+3769.838722878" Dec 03 15:16:26 crc kubenswrapper[4751]: I1203 15:16:26.207563 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fxk5b/must-gather-krlcs"] Dec 03 15:16:26 crc kubenswrapper[4751]: I1203 
15:16:26.211503 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fxk5b/must-gather-krlcs" Dec 03 15:16:26 crc kubenswrapper[4751]: I1203 15:16:26.213929 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fxk5b"/"openshift-service-ca.crt" Dec 03 15:16:26 crc kubenswrapper[4751]: I1203 15:16:26.214137 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fxk5b"/"kube-root-ca.crt" Dec 03 15:16:26 crc kubenswrapper[4751]: I1203 15:16:26.214137 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-fxk5b"/"default-dockercfg-bljnm" Dec 03 15:16:26 crc kubenswrapper[4751]: I1203 15:16:26.235873 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fxk5b/must-gather-krlcs"] Dec 03 15:16:26 crc kubenswrapper[4751]: I1203 15:16:26.315082 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/108aeb3c-d5f1-4cab-80d4-9c55592975d6-must-gather-output\") pod \"must-gather-krlcs\" (UID: \"108aeb3c-d5f1-4cab-80d4-9c55592975d6\") " pod="openshift-must-gather-fxk5b/must-gather-krlcs" Dec 03 15:16:26 crc kubenswrapper[4751]: I1203 15:16:26.315464 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-988s8\" (UniqueName: \"kubernetes.io/projected/108aeb3c-d5f1-4cab-80d4-9c55592975d6-kube-api-access-988s8\") pod \"must-gather-krlcs\" (UID: \"108aeb3c-d5f1-4cab-80d4-9c55592975d6\") " pod="openshift-must-gather-fxk5b/must-gather-krlcs" Dec 03 15:16:26 crc kubenswrapper[4751]: I1203 15:16:26.417849 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/108aeb3c-d5f1-4cab-80d4-9c55592975d6-must-gather-output\") pod \"must-gather-krlcs\" (UID: 
\"108aeb3c-d5f1-4cab-80d4-9c55592975d6\") " pod="openshift-must-gather-fxk5b/must-gather-krlcs" Dec 03 15:16:26 crc kubenswrapper[4751]: I1203 15:16:26.417952 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-988s8\" (UniqueName: \"kubernetes.io/projected/108aeb3c-d5f1-4cab-80d4-9c55592975d6-kube-api-access-988s8\") pod \"must-gather-krlcs\" (UID: \"108aeb3c-d5f1-4cab-80d4-9c55592975d6\") " pod="openshift-must-gather-fxk5b/must-gather-krlcs" Dec 03 15:16:26 crc kubenswrapper[4751]: I1203 15:16:26.419408 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/108aeb3c-d5f1-4cab-80d4-9c55592975d6-must-gather-output\") pod \"must-gather-krlcs\" (UID: \"108aeb3c-d5f1-4cab-80d4-9c55592975d6\") " pod="openshift-must-gather-fxk5b/must-gather-krlcs" Dec 03 15:16:26 crc kubenswrapper[4751]: I1203 15:16:26.455844 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-988s8\" (UniqueName: \"kubernetes.io/projected/108aeb3c-d5f1-4cab-80d4-9c55592975d6-kube-api-access-988s8\") pod \"must-gather-krlcs\" (UID: \"108aeb3c-d5f1-4cab-80d4-9c55592975d6\") " pod="openshift-must-gather-fxk5b/must-gather-krlcs" Dec 03 15:16:26 crc kubenswrapper[4751]: I1203 15:16:26.538829 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fxk5b/must-gather-krlcs" Dec 03 15:16:27 crc kubenswrapper[4751]: I1203 15:16:27.087569 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fxk5b/must-gather-krlcs"] Dec 03 15:16:28 crc kubenswrapper[4751]: I1203 15:16:28.096830 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fxk5b/must-gather-krlcs" event={"ID":"108aeb3c-d5f1-4cab-80d4-9c55592975d6","Type":"ContainerStarted","Data":"61a8cd9681a176f8346db920c07988c277cb69f66fa02eb9b0671872e707c00f"} Dec 03 15:16:33 crc kubenswrapper[4751]: I1203 15:16:33.151759 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fxk5b/must-gather-krlcs" event={"ID":"108aeb3c-d5f1-4cab-80d4-9c55592975d6","Type":"ContainerStarted","Data":"515e4af780d2dc2ee62c0c2305d150b0f8f7e3846b72d153c0ca2219a888b465"} Dec 03 15:16:33 crc kubenswrapper[4751]: I1203 15:16:33.152347 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fxk5b/must-gather-krlcs" event={"ID":"108aeb3c-d5f1-4cab-80d4-9c55592975d6","Type":"ContainerStarted","Data":"575faeab859105060ca850e0f055102c5bdbe3579fcd1bde1217cbe6b430e554"} Dec 03 15:16:33 crc kubenswrapper[4751]: I1203 15:16:33.177915 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fxk5b/must-gather-krlcs" podStartSLOduration=2.084049929 podStartE2EDuration="7.177891512s" podCreationTimestamp="2025-12-03 15:16:26 +0000 UTC" firstStartedPulling="2025-12-03 15:16:27.096096134 +0000 UTC m=+3794.084451351" lastFinishedPulling="2025-12-03 15:16:32.189937727 +0000 UTC m=+3799.178292934" observedRunningTime="2025-12-03 15:16:33.167629028 +0000 UTC m=+3800.155984255" watchObservedRunningTime="2025-12-03 15:16:33.177891512 +0000 UTC m=+3800.166246729" Dec 03 15:16:35 crc kubenswrapper[4751]: I1203 15:16:35.820284 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 15:16:35 crc kubenswrapper[4751]: I1203 15:16:35.820931 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 15:16:37 crc kubenswrapper[4751]: I1203 15:16:37.030997 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fxk5b/crc-debug-vcdch"] Dec 03 15:16:37 crc kubenswrapper[4751]: I1203 15:16:37.032913 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fxk5b/crc-debug-vcdch" Dec 03 15:16:37 crc kubenswrapper[4751]: I1203 15:16:37.162672 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4889807b-d17b-4fca-ab50-a0cf35e3a7d2-host\") pod \"crc-debug-vcdch\" (UID: \"4889807b-d17b-4fca-ab50-a0cf35e3a7d2\") " pod="openshift-must-gather-fxk5b/crc-debug-vcdch" Dec 03 15:16:37 crc kubenswrapper[4751]: I1203 15:16:37.163704 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lxk8\" (UniqueName: \"kubernetes.io/projected/4889807b-d17b-4fca-ab50-a0cf35e3a7d2-kube-api-access-5lxk8\") pod \"crc-debug-vcdch\" (UID: \"4889807b-d17b-4fca-ab50-a0cf35e3a7d2\") " pod="openshift-must-gather-fxk5b/crc-debug-vcdch" Dec 03 15:16:37 crc kubenswrapper[4751]: I1203 15:16:37.266491 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lxk8\" (UniqueName: 
\"kubernetes.io/projected/4889807b-d17b-4fca-ab50-a0cf35e3a7d2-kube-api-access-5lxk8\") pod \"crc-debug-vcdch\" (UID: \"4889807b-d17b-4fca-ab50-a0cf35e3a7d2\") " pod="openshift-must-gather-fxk5b/crc-debug-vcdch" Dec 03 15:16:37 crc kubenswrapper[4751]: I1203 15:16:37.267165 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4889807b-d17b-4fca-ab50-a0cf35e3a7d2-host\") pod \"crc-debug-vcdch\" (UID: \"4889807b-d17b-4fca-ab50-a0cf35e3a7d2\") " pod="openshift-must-gather-fxk5b/crc-debug-vcdch" Dec 03 15:16:37 crc kubenswrapper[4751]: I1203 15:16:37.267638 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4889807b-d17b-4fca-ab50-a0cf35e3a7d2-host\") pod \"crc-debug-vcdch\" (UID: \"4889807b-d17b-4fca-ab50-a0cf35e3a7d2\") " pod="openshift-must-gather-fxk5b/crc-debug-vcdch" Dec 03 15:16:37 crc kubenswrapper[4751]: I1203 15:16:37.305834 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lxk8\" (UniqueName: \"kubernetes.io/projected/4889807b-d17b-4fca-ab50-a0cf35e3a7d2-kube-api-access-5lxk8\") pod \"crc-debug-vcdch\" (UID: \"4889807b-d17b-4fca-ab50-a0cf35e3a7d2\") " pod="openshift-must-gather-fxk5b/crc-debug-vcdch" Dec 03 15:16:37 crc kubenswrapper[4751]: I1203 15:16:37.355310 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fxk5b/crc-debug-vcdch" Dec 03 15:16:38 crc kubenswrapper[4751]: I1203 15:16:38.217131 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fxk5b/crc-debug-vcdch" event={"ID":"4889807b-d17b-4fca-ab50-a0cf35e3a7d2","Type":"ContainerStarted","Data":"9c4673ae0fa8d2eb389bbd80f71702d77066564a3202e713f824ad6084cd9f15"} Dec 03 15:16:52 crc kubenswrapper[4751]: I1203 15:16:52.379519 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fxk5b/crc-debug-vcdch" event={"ID":"4889807b-d17b-4fca-ab50-a0cf35e3a7d2","Type":"ContainerStarted","Data":"cc9019bd2ae46b5235e7049d7f60987117fd365b6052e42974bbcc8515d1269d"} Dec 03 15:16:52 crc kubenswrapper[4751]: I1203 15:16:52.417910 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fxk5b/crc-debug-vcdch" podStartSLOduration=1.08292098 podStartE2EDuration="15.417887657s" podCreationTimestamp="2025-12-03 15:16:37 +0000 UTC" firstStartedPulling="2025-12-03 15:16:37.407463284 +0000 UTC m=+3804.395818501" lastFinishedPulling="2025-12-03 15:16:51.742429961 +0000 UTC m=+3818.730785178" observedRunningTime="2025-12-03 15:16:52.402459035 +0000 UTC m=+3819.390814252" watchObservedRunningTime="2025-12-03 15:16:52.417887657 +0000 UTC m=+3819.406242874" Dec 03 15:17:05 crc kubenswrapper[4751]: I1203 15:17:05.822658 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 15:17:05 crc kubenswrapper[4751]: I1203 15:17:05.823291 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 15:17:35 crc kubenswrapper[4751]: I1203 15:17:35.820534 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 15:17:35 crc kubenswrapper[4751]: I1203 15:17:35.821229 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 15:17:35 crc kubenswrapper[4751]: I1203 15:17:35.821291 4751 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-djf67" Dec 03 15:17:35 crc kubenswrapper[4751]: I1203 15:17:35.822240 4751 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"63ada114a3c0d13c65067734e76c0573932a816604c9a01a3d2887bb9ea85dda"} pod="openshift-machine-config-operator/machine-config-daemon-djf67" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 15:17:35 crc kubenswrapper[4751]: I1203 15:17:35.822308 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" containerID="cri-o://63ada114a3c0d13c65067734e76c0573932a816604c9a01a3d2887bb9ea85dda" gracePeriod=600 Dec 03 15:17:35 crc kubenswrapper[4751]: E1203 15:17:35.945793 4751 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:17:36 crc kubenswrapper[4751]: I1203 15:17:36.890744 4751 generic.go:334] "Generic (PLEG): container finished" podID="385620eb-744d-423e-b02b-1274f3075689" containerID="63ada114a3c0d13c65067734e76c0573932a816604c9a01a3d2887bb9ea85dda" exitCode=0 Dec 03 15:17:36 crc kubenswrapper[4751]: I1203 15:17:36.890816 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerDied","Data":"63ada114a3c0d13c65067734e76c0573932a816604c9a01a3d2887bb9ea85dda"} Dec 03 15:17:36 crc kubenswrapper[4751]: I1203 15:17:36.891040 4751 scope.go:117] "RemoveContainer" containerID="743940d32519162efebff73b0fdcb4fad05561eaf21e4f59daea96fbcd85c9b2" Dec 03 15:17:36 crc kubenswrapper[4751]: I1203 15:17:36.891774 4751 scope.go:117] "RemoveContainer" containerID="63ada114a3c0d13c65067734e76c0573932a816604c9a01a3d2887bb9ea85dda" Dec 03 15:17:36 crc kubenswrapper[4751]: E1203 15:17:36.892020 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:17:37 crc kubenswrapper[4751]: I1203 15:17:37.903456 4751 generic.go:334] "Generic (PLEG): container finished" podID="4889807b-d17b-4fca-ab50-a0cf35e3a7d2" 
containerID="cc9019bd2ae46b5235e7049d7f60987117fd365b6052e42974bbcc8515d1269d" exitCode=0 Dec 03 15:17:37 crc kubenswrapper[4751]: I1203 15:17:37.903536 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fxk5b/crc-debug-vcdch" event={"ID":"4889807b-d17b-4fca-ab50-a0cf35e3a7d2","Type":"ContainerDied","Data":"cc9019bd2ae46b5235e7049d7f60987117fd365b6052e42974bbcc8515d1269d"} Dec 03 15:17:39 crc kubenswrapper[4751]: I1203 15:17:39.045902 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fxk5b/crc-debug-vcdch" Dec 03 15:17:39 crc kubenswrapper[4751]: I1203 15:17:39.064818 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4889807b-d17b-4fca-ab50-a0cf35e3a7d2-host\") pod \"4889807b-d17b-4fca-ab50-a0cf35e3a7d2\" (UID: \"4889807b-d17b-4fca-ab50-a0cf35e3a7d2\") " Dec 03 15:17:39 crc kubenswrapper[4751]: I1203 15:17:39.064943 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4889807b-d17b-4fca-ab50-a0cf35e3a7d2-host" (OuterVolumeSpecName: "host") pod "4889807b-d17b-4fca-ab50-a0cf35e3a7d2" (UID: "4889807b-d17b-4fca-ab50-a0cf35e3a7d2"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 15:17:39 crc kubenswrapper[4751]: I1203 15:17:39.064979 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lxk8\" (UniqueName: \"kubernetes.io/projected/4889807b-d17b-4fca-ab50-a0cf35e3a7d2-kube-api-access-5lxk8\") pod \"4889807b-d17b-4fca-ab50-a0cf35e3a7d2\" (UID: \"4889807b-d17b-4fca-ab50-a0cf35e3a7d2\") " Dec 03 15:17:39 crc kubenswrapper[4751]: I1203 15:17:39.065492 4751 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4889807b-d17b-4fca-ab50-a0cf35e3a7d2-host\") on node \"crc\" DevicePath \"\"" Dec 03 15:17:39 crc kubenswrapper[4751]: I1203 15:17:39.081504 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4889807b-d17b-4fca-ab50-a0cf35e3a7d2-kube-api-access-5lxk8" (OuterVolumeSpecName: "kube-api-access-5lxk8") pod "4889807b-d17b-4fca-ab50-a0cf35e3a7d2" (UID: "4889807b-d17b-4fca-ab50-a0cf35e3a7d2"). InnerVolumeSpecName "kube-api-access-5lxk8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:17:39 crc kubenswrapper[4751]: I1203 15:17:39.113375 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fxk5b/crc-debug-vcdch"] Dec 03 15:17:39 crc kubenswrapper[4751]: I1203 15:17:39.128147 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fxk5b/crc-debug-vcdch"] Dec 03 15:17:39 crc kubenswrapper[4751]: I1203 15:17:39.167351 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lxk8\" (UniqueName: \"kubernetes.io/projected/4889807b-d17b-4fca-ab50-a0cf35e3a7d2-kube-api-access-5lxk8\") on node \"crc\" DevicePath \"\"" Dec 03 15:17:39 crc kubenswrapper[4751]: I1203 15:17:39.327167 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4889807b-d17b-4fca-ab50-a0cf35e3a7d2" path="/var/lib/kubelet/pods/4889807b-d17b-4fca-ab50-a0cf35e3a7d2/volumes" Dec 03 15:17:39 crc kubenswrapper[4751]: I1203 15:17:39.925957 4751 scope.go:117] "RemoveContainer" containerID="cc9019bd2ae46b5235e7049d7f60987117fd365b6052e42974bbcc8515d1269d" Dec 03 15:17:39 crc kubenswrapper[4751]: I1203 15:17:39.925989 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fxk5b/crc-debug-vcdch" Dec 03 15:17:40 crc kubenswrapper[4751]: I1203 15:17:40.318420 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fxk5b/crc-debug-hjxd8"] Dec 03 15:17:40 crc kubenswrapper[4751]: E1203 15:17:40.318965 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4889807b-d17b-4fca-ab50-a0cf35e3a7d2" containerName="container-00" Dec 03 15:17:40 crc kubenswrapper[4751]: I1203 15:17:40.318982 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="4889807b-d17b-4fca-ab50-a0cf35e3a7d2" containerName="container-00" Dec 03 15:17:40 crc kubenswrapper[4751]: I1203 15:17:40.319315 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="4889807b-d17b-4fca-ab50-a0cf35e3a7d2" containerName="container-00" Dec 03 15:17:40 crc kubenswrapper[4751]: I1203 15:17:40.320288 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fxk5b/crc-debug-hjxd8" Dec 03 15:17:40 crc kubenswrapper[4751]: I1203 15:17:40.392557 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9e8a83bb-66a8-4461-92f3-2a930e5ec136-host\") pod \"crc-debug-hjxd8\" (UID: \"9e8a83bb-66a8-4461-92f3-2a930e5ec136\") " pod="openshift-must-gather-fxk5b/crc-debug-hjxd8" Dec 03 15:17:40 crc kubenswrapper[4751]: I1203 15:17:40.393080 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w8vn\" (UniqueName: \"kubernetes.io/projected/9e8a83bb-66a8-4461-92f3-2a930e5ec136-kube-api-access-6w8vn\") pod \"crc-debug-hjxd8\" (UID: \"9e8a83bb-66a8-4461-92f3-2a930e5ec136\") " pod="openshift-must-gather-fxk5b/crc-debug-hjxd8" Dec 03 15:17:40 crc kubenswrapper[4751]: I1203 15:17:40.494839 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w8vn\" (UniqueName: 
\"kubernetes.io/projected/9e8a83bb-66a8-4461-92f3-2a930e5ec136-kube-api-access-6w8vn\") pod \"crc-debug-hjxd8\" (UID: \"9e8a83bb-66a8-4461-92f3-2a930e5ec136\") " pod="openshift-must-gather-fxk5b/crc-debug-hjxd8" Dec 03 15:17:40 crc kubenswrapper[4751]: I1203 15:17:40.495662 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9e8a83bb-66a8-4461-92f3-2a930e5ec136-host\") pod \"crc-debug-hjxd8\" (UID: \"9e8a83bb-66a8-4461-92f3-2a930e5ec136\") " pod="openshift-must-gather-fxk5b/crc-debug-hjxd8" Dec 03 15:17:40 crc kubenswrapper[4751]: I1203 15:17:40.495786 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9e8a83bb-66a8-4461-92f3-2a930e5ec136-host\") pod \"crc-debug-hjxd8\" (UID: \"9e8a83bb-66a8-4461-92f3-2a930e5ec136\") " pod="openshift-must-gather-fxk5b/crc-debug-hjxd8" Dec 03 15:17:40 crc kubenswrapper[4751]: I1203 15:17:40.521694 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w8vn\" (UniqueName: \"kubernetes.io/projected/9e8a83bb-66a8-4461-92f3-2a930e5ec136-kube-api-access-6w8vn\") pod \"crc-debug-hjxd8\" (UID: \"9e8a83bb-66a8-4461-92f3-2a930e5ec136\") " pod="openshift-must-gather-fxk5b/crc-debug-hjxd8" Dec 03 15:17:40 crc kubenswrapper[4751]: I1203 15:17:40.638625 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fxk5b/crc-debug-hjxd8" Dec 03 15:17:40 crc kubenswrapper[4751]: I1203 15:17:40.937834 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fxk5b/crc-debug-hjxd8" event={"ID":"9e8a83bb-66a8-4461-92f3-2a930e5ec136","Type":"ContainerStarted","Data":"6b37cef0ac040c3a44f364589db349783b11ca9d60eb9213500b483959cfede0"} Dec 03 15:17:41 crc kubenswrapper[4751]: I1203 15:17:41.951663 4751 generic.go:334] "Generic (PLEG): container finished" podID="9e8a83bb-66a8-4461-92f3-2a930e5ec136" containerID="180fbf6d213a958ced66aeede507c281f64e79ee3bcc657cda55233bd5be472e" exitCode=0 Dec 03 15:17:41 crc kubenswrapper[4751]: I1203 15:17:41.951780 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fxk5b/crc-debug-hjxd8" event={"ID":"9e8a83bb-66a8-4461-92f3-2a930e5ec136","Type":"ContainerDied","Data":"180fbf6d213a958ced66aeede507c281f64e79ee3bcc657cda55233bd5be472e"} Dec 03 15:17:43 crc kubenswrapper[4751]: I1203 15:17:43.126147 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fxk5b/crc-debug-hjxd8" Dec 03 15:17:43 crc kubenswrapper[4751]: I1203 15:17:43.153042 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6w8vn\" (UniqueName: \"kubernetes.io/projected/9e8a83bb-66a8-4461-92f3-2a930e5ec136-kube-api-access-6w8vn\") pod \"9e8a83bb-66a8-4461-92f3-2a930e5ec136\" (UID: \"9e8a83bb-66a8-4461-92f3-2a930e5ec136\") " Dec 03 15:17:43 crc kubenswrapper[4751]: I1203 15:17:43.153235 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9e8a83bb-66a8-4461-92f3-2a930e5ec136-host\") pod \"9e8a83bb-66a8-4461-92f3-2a930e5ec136\" (UID: \"9e8a83bb-66a8-4461-92f3-2a930e5ec136\") " Dec 03 15:17:43 crc kubenswrapper[4751]: I1203 15:17:43.158432 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9e8a83bb-66a8-4461-92f3-2a930e5ec136-host" (OuterVolumeSpecName: "host") pod "9e8a83bb-66a8-4461-92f3-2a930e5ec136" (UID: "9e8a83bb-66a8-4461-92f3-2a930e5ec136"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 15:17:43 crc kubenswrapper[4751]: I1203 15:17:43.161139 4751 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9e8a83bb-66a8-4461-92f3-2a930e5ec136-host\") on node \"crc\" DevicePath \"\"" Dec 03 15:17:43 crc kubenswrapper[4751]: I1203 15:17:43.189940 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e8a83bb-66a8-4461-92f3-2a930e5ec136-kube-api-access-6w8vn" (OuterVolumeSpecName: "kube-api-access-6w8vn") pod "9e8a83bb-66a8-4461-92f3-2a930e5ec136" (UID: "9e8a83bb-66a8-4461-92f3-2a930e5ec136"). InnerVolumeSpecName "kube-api-access-6w8vn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:17:43 crc kubenswrapper[4751]: I1203 15:17:43.220320 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fxk5b/crc-debug-hjxd8"] Dec 03 15:17:43 crc kubenswrapper[4751]: I1203 15:17:43.236950 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fxk5b/crc-debug-hjxd8"] Dec 03 15:17:43 crc kubenswrapper[4751]: I1203 15:17:43.262904 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6w8vn\" (UniqueName: \"kubernetes.io/projected/9e8a83bb-66a8-4461-92f3-2a930e5ec136-kube-api-access-6w8vn\") on node \"crc\" DevicePath \"\"" Dec 03 15:17:43 crc kubenswrapper[4751]: I1203 15:17:43.328134 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e8a83bb-66a8-4461-92f3-2a930e5ec136" path="/var/lib/kubelet/pods/9e8a83bb-66a8-4461-92f3-2a930e5ec136/volumes" Dec 03 15:17:43 crc kubenswrapper[4751]: I1203 15:17:43.975227 4751 scope.go:117] "RemoveContainer" containerID="180fbf6d213a958ced66aeede507c281f64e79ee3bcc657cda55233bd5be472e" Dec 03 15:17:43 crc kubenswrapper[4751]: I1203 15:17:43.975270 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fxk5b/crc-debug-hjxd8" Dec 03 15:17:44 crc kubenswrapper[4751]: I1203 15:17:44.497839 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fxk5b/crc-debug-4nqzl"] Dec 03 15:17:44 crc kubenswrapper[4751]: E1203 15:17:44.498349 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e8a83bb-66a8-4461-92f3-2a930e5ec136" containerName="container-00" Dec 03 15:17:44 crc kubenswrapper[4751]: I1203 15:17:44.498368 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e8a83bb-66a8-4461-92f3-2a930e5ec136" containerName="container-00" Dec 03 15:17:44 crc kubenswrapper[4751]: I1203 15:17:44.498630 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e8a83bb-66a8-4461-92f3-2a930e5ec136" containerName="container-00" Dec 03 15:17:44 crc kubenswrapper[4751]: I1203 15:17:44.499441 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fxk5b/crc-debug-4nqzl" Dec 03 15:17:44 crc kubenswrapper[4751]: I1203 15:17:44.591037 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c7438b02-58fd-4c26-bd46-d0bc9b6440fc-host\") pod \"crc-debug-4nqzl\" (UID: \"c7438b02-58fd-4c26-bd46-d0bc9b6440fc\") " pod="openshift-must-gather-fxk5b/crc-debug-4nqzl" Dec 03 15:17:44 crc kubenswrapper[4751]: I1203 15:17:44.591808 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95c4s\" (UniqueName: \"kubernetes.io/projected/c7438b02-58fd-4c26-bd46-d0bc9b6440fc-kube-api-access-95c4s\") pod \"crc-debug-4nqzl\" (UID: \"c7438b02-58fd-4c26-bd46-d0bc9b6440fc\") " pod="openshift-must-gather-fxk5b/crc-debug-4nqzl" Dec 03 15:17:44 crc kubenswrapper[4751]: I1203 15:17:44.693931 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/c7438b02-58fd-4c26-bd46-d0bc9b6440fc-host\") pod \"crc-debug-4nqzl\" (UID: \"c7438b02-58fd-4c26-bd46-d0bc9b6440fc\") " pod="openshift-must-gather-fxk5b/crc-debug-4nqzl" Dec 03 15:17:44 crc kubenswrapper[4751]: I1203 15:17:44.694223 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95c4s\" (UniqueName: \"kubernetes.io/projected/c7438b02-58fd-4c26-bd46-d0bc9b6440fc-kube-api-access-95c4s\") pod \"crc-debug-4nqzl\" (UID: \"c7438b02-58fd-4c26-bd46-d0bc9b6440fc\") " pod="openshift-must-gather-fxk5b/crc-debug-4nqzl" Dec 03 15:17:44 crc kubenswrapper[4751]: I1203 15:17:44.694743 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c7438b02-58fd-4c26-bd46-d0bc9b6440fc-host\") pod \"crc-debug-4nqzl\" (UID: \"c7438b02-58fd-4c26-bd46-d0bc9b6440fc\") " pod="openshift-must-gather-fxk5b/crc-debug-4nqzl" Dec 03 15:17:44 crc kubenswrapper[4751]: I1203 15:17:44.714940 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95c4s\" (UniqueName: \"kubernetes.io/projected/c7438b02-58fd-4c26-bd46-d0bc9b6440fc-kube-api-access-95c4s\") pod \"crc-debug-4nqzl\" (UID: \"c7438b02-58fd-4c26-bd46-d0bc9b6440fc\") " pod="openshift-must-gather-fxk5b/crc-debug-4nqzl" Dec 03 15:17:44 crc kubenswrapper[4751]: I1203 15:17:44.822935 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fxk5b/crc-debug-4nqzl" Dec 03 15:17:44 crc kubenswrapper[4751]: W1203 15:17:44.860383 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7438b02_58fd_4c26_bd46_d0bc9b6440fc.slice/crio-dfbca27a958c9b6e44b56213dfb28c01b05c008ef7c0a50dc00676a41b7e5620 WatchSource:0}: Error finding container dfbca27a958c9b6e44b56213dfb28c01b05c008ef7c0a50dc00676a41b7e5620: Status 404 returned error can't find the container with id dfbca27a958c9b6e44b56213dfb28c01b05c008ef7c0a50dc00676a41b7e5620 Dec 03 15:17:44 crc kubenswrapper[4751]: I1203 15:17:44.987406 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fxk5b/crc-debug-4nqzl" event={"ID":"c7438b02-58fd-4c26-bd46-d0bc9b6440fc","Type":"ContainerStarted","Data":"dfbca27a958c9b6e44b56213dfb28c01b05c008ef7c0a50dc00676a41b7e5620"} Dec 03 15:17:45 crc kubenswrapper[4751]: E1203 15:17:45.387235 4751 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7438b02_58fd_4c26_bd46_d0bc9b6440fc.slice/crio-53bc0bff493cbfdde50779a39221dd1cb4a1cf06f3bfc20a7addb140a197214c.scope\": RecentStats: unable to find data in memory cache]" Dec 03 15:17:46 crc kubenswrapper[4751]: I1203 15:17:46.001417 4751 generic.go:334] "Generic (PLEG): container finished" podID="c7438b02-58fd-4c26-bd46-d0bc9b6440fc" containerID="53bc0bff493cbfdde50779a39221dd1cb4a1cf06f3bfc20a7addb140a197214c" exitCode=0 Dec 03 15:17:46 crc kubenswrapper[4751]: I1203 15:17:46.001522 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fxk5b/crc-debug-4nqzl" event={"ID":"c7438b02-58fd-4c26-bd46-d0bc9b6440fc","Type":"ContainerDied","Data":"53bc0bff493cbfdde50779a39221dd1cb4a1cf06f3bfc20a7addb140a197214c"} Dec 03 15:17:46 crc kubenswrapper[4751]: I1203 15:17:46.052412 4751 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fxk5b/crc-debug-4nqzl"] Dec 03 15:17:46 crc kubenswrapper[4751]: I1203 15:17:46.069370 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fxk5b/crc-debug-4nqzl"] Dec 03 15:17:47 crc kubenswrapper[4751]: I1203 15:17:47.163435 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fxk5b/crc-debug-4nqzl" Dec 03 15:17:47 crc kubenswrapper[4751]: I1203 15:17:47.354528 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95c4s\" (UniqueName: \"kubernetes.io/projected/c7438b02-58fd-4c26-bd46-d0bc9b6440fc-kube-api-access-95c4s\") pod \"c7438b02-58fd-4c26-bd46-d0bc9b6440fc\" (UID: \"c7438b02-58fd-4c26-bd46-d0bc9b6440fc\") " Dec 03 15:17:47 crc kubenswrapper[4751]: I1203 15:17:47.355266 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c7438b02-58fd-4c26-bd46-d0bc9b6440fc-host\") pod \"c7438b02-58fd-4c26-bd46-d0bc9b6440fc\" (UID: \"c7438b02-58fd-4c26-bd46-d0bc9b6440fc\") " Dec 03 15:17:47 crc kubenswrapper[4751]: I1203 15:17:47.355352 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7438b02-58fd-4c26-bd46-d0bc9b6440fc-host" (OuterVolumeSpecName: "host") pod "c7438b02-58fd-4c26-bd46-d0bc9b6440fc" (UID: "c7438b02-58fd-4c26-bd46-d0bc9b6440fc"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 15:17:47 crc kubenswrapper[4751]: I1203 15:17:47.356429 4751 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c7438b02-58fd-4c26-bd46-d0bc9b6440fc-host\") on node \"crc\" DevicePath \"\"" Dec 03 15:17:47 crc kubenswrapper[4751]: I1203 15:17:47.362142 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7438b02-58fd-4c26-bd46-d0bc9b6440fc-kube-api-access-95c4s" (OuterVolumeSpecName: "kube-api-access-95c4s") pod "c7438b02-58fd-4c26-bd46-d0bc9b6440fc" (UID: "c7438b02-58fd-4c26-bd46-d0bc9b6440fc"). InnerVolumeSpecName "kube-api-access-95c4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:17:47 crc kubenswrapper[4751]: I1203 15:17:47.457756 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95c4s\" (UniqueName: \"kubernetes.io/projected/c7438b02-58fd-4c26-bd46-d0bc9b6440fc-kube-api-access-95c4s\") on node \"crc\" DevicePath \"\"" Dec 03 15:17:48 crc kubenswrapper[4751]: I1203 15:17:48.052972 4751 scope.go:117] "RemoveContainer" containerID="53bc0bff493cbfdde50779a39221dd1cb4a1cf06f3bfc20a7addb140a197214c" Dec 03 15:17:48 crc kubenswrapper[4751]: I1203 15:17:48.053612 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fxk5b/crc-debug-4nqzl" Dec 03 15:17:48 crc kubenswrapper[4751]: I1203 15:17:48.314399 4751 scope.go:117] "RemoveContainer" containerID="63ada114a3c0d13c65067734e76c0573932a816604c9a01a3d2887bb9ea85dda" Dec 03 15:17:48 crc kubenswrapper[4751]: E1203 15:17:48.314824 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:17:49 crc kubenswrapper[4751]: I1203 15:17:49.326272 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7438b02-58fd-4c26-bd46-d0bc9b6440fc" path="/var/lib/kubelet/pods/c7438b02-58fd-4c26-bd46-d0bc9b6440fc/volumes" Dec 03 15:17:51 crc kubenswrapper[4751]: I1203 15:17:51.997357 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-66p2v"] Dec 03 15:17:51 crc kubenswrapper[4751]: E1203 15:17:51.998052 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7438b02-58fd-4c26-bd46-d0bc9b6440fc" containerName="container-00" Dec 03 15:17:51 crc kubenswrapper[4751]: I1203 15:17:51.998063 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7438b02-58fd-4c26-bd46-d0bc9b6440fc" containerName="container-00" Dec 03 15:17:51 crc kubenswrapper[4751]: I1203 15:17:51.998282 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7438b02-58fd-4c26-bd46-d0bc9b6440fc" containerName="container-00" Dec 03 15:17:52 crc kubenswrapper[4751]: I1203 15:17:52.001877 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-66p2v" Dec 03 15:17:52 crc kubenswrapper[4751]: I1203 15:17:52.024642 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-66p2v"] Dec 03 15:17:52 crc kubenswrapper[4751]: I1203 15:17:52.055542 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/438d1cd9-a242-44f3-bd68-209623426af8-catalog-content\") pod \"certified-operators-66p2v\" (UID: \"438d1cd9-a242-44f3-bd68-209623426af8\") " pod="openshift-marketplace/certified-operators-66p2v" Dec 03 15:17:52 crc kubenswrapper[4751]: I1203 15:17:52.055607 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxrkx\" (UniqueName: \"kubernetes.io/projected/438d1cd9-a242-44f3-bd68-209623426af8-kube-api-access-mxrkx\") pod \"certified-operators-66p2v\" (UID: \"438d1cd9-a242-44f3-bd68-209623426af8\") " pod="openshift-marketplace/certified-operators-66p2v" Dec 03 15:17:52 crc kubenswrapper[4751]: I1203 15:17:52.055706 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/438d1cd9-a242-44f3-bd68-209623426af8-utilities\") pod \"certified-operators-66p2v\" (UID: \"438d1cd9-a242-44f3-bd68-209623426af8\") " pod="openshift-marketplace/certified-operators-66p2v" Dec 03 15:17:52 crc kubenswrapper[4751]: I1203 15:17:52.157053 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/438d1cd9-a242-44f3-bd68-209623426af8-utilities\") pod \"certified-operators-66p2v\" (UID: \"438d1cd9-a242-44f3-bd68-209623426af8\") " pod="openshift-marketplace/certified-operators-66p2v" Dec 03 15:17:52 crc kubenswrapper[4751]: I1203 15:17:52.157182 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/438d1cd9-a242-44f3-bd68-209623426af8-catalog-content\") pod \"certified-operators-66p2v\" (UID: \"438d1cd9-a242-44f3-bd68-209623426af8\") " pod="openshift-marketplace/certified-operators-66p2v" Dec 03 15:17:52 crc kubenswrapper[4751]: I1203 15:17:52.157225 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxrkx\" (UniqueName: \"kubernetes.io/projected/438d1cd9-a242-44f3-bd68-209623426af8-kube-api-access-mxrkx\") pod \"certified-operators-66p2v\" (UID: \"438d1cd9-a242-44f3-bd68-209623426af8\") " pod="openshift-marketplace/certified-operators-66p2v" Dec 03 15:17:52 crc kubenswrapper[4751]: I1203 15:17:52.157627 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/438d1cd9-a242-44f3-bd68-209623426af8-utilities\") pod \"certified-operators-66p2v\" (UID: \"438d1cd9-a242-44f3-bd68-209623426af8\") " pod="openshift-marketplace/certified-operators-66p2v" Dec 03 15:17:52 crc kubenswrapper[4751]: I1203 15:17:52.158023 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/438d1cd9-a242-44f3-bd68-209623426af8-catalog-content\") pod \"certified-operators-66p2v\" (UID: \"438d1cd9-a242-44f3-bd68-209623426af8\") " pod="openshift-marketplace/certified-operators-66p2v" Dec 03 15:17:52 crc kubenswrapper[4751]: I1203 15:17:52.179836 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxrkx\" (UniqueName: \"kubernetes.io/projected/438d1cd9-a242-44f3-bd68-209623426af8-kube-api-access-mxrkx\") pod \"certified-operators-66p2v\" (UID: \"438d1cd9-a242-44f3-bd68-209623426af8\") " pod="openshift-marketplace/certified-operators-66p2v" Dec 03 15:17:52 crc kubenswrapper[4751]: I1203 15:17:52.325806 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-66p2v" Dec 03 15:17:52 crc kubenswrapper[4751]: W1203 15:17:52.894299 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod438d1cd9_a242_44f3_bd68_209623426af8.slice/crio-b06814bc6d12b0f18d30e0141c32712454349f35db1aee59a7c9a0ccbe24df48 WatchSource:0}: Error finding container b06814bc6d12b0f18d30e0141c32712454349f35db1aee59a7c9a0ccbe24df48: Status 404 returned error can't find the container with id b06814bc6d12b0f18d30e0141c32712454349f35db1aee59a7c9a0ccbe24df48 Dec 03 15:17:52 crc kubenswrapper[4751]: I1203 15:17:52.903304 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-66p2v"] Dec 03 15:17:53 crc kubenswrapper[4751]: I1203 15:17:53.105535 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66p2v" event={"ID":"438d1cd9-a242-44f3-bd68-209623426af8","Type":"ContainerStarted","Data":"b06814bc6d12b0f18d30e0141c32712454349f35db1aee59a7c9a0ccbe24df48"} Dec 03 15:17:55 crc kubenswrapper[4751]: I1203 15:17:55.127127 4751 generic.go:334] "Generic (PLEG): container finished" podID="438d1cd9-a242-44f3-bd68-209623426af8" containerID="2a4db3c65c25444bc7a22ee63da702bb443c9b4ba2935805214bcf5f771ff4f6" exitCode=0 Dec 03 15:17:55 crc kubenswrapper[4751]: I1203 15:17:55.127467 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66p2v" event={"ID":"438d1cd9-a242-44f3-bd68-209623426af8","Type":"ContainerDied","Data":"2a4db3c65c25444bc7a22ee63da702bb443c9b4ba2935805214bcf5f771ff4f6"} Dec 03 15:17:57 crc kubenswrapper[4751]: I1203 15:17:57.155315 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66p2v" 
event={"ID":"438d1cd9-a242-44f3-bd68-209623426af8","Type":"ContainerStarted","Data":"eb061ac8fa2ca596fb3d79547e85f80735cfca9a40b113bd9d1665c2e4f068e8"} Dec 03 15:17:58 crc kubenswrapper[4751]: I1203 15:17:58.183574 4751 generic.go:334] "Generic (PLEG): container finished" podID="438d1cd9-a242-44f3-bd68-209623426af8" containerID="eb061ac8fa2ca596fb3d79547e85f80735cfca9a40b113bd9d1665c2e4f068e8" exitCode=0 Dec 03 15:17:58 crc kubenswrapper[4751]: I1203 15:17:58.183959 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66p2v" event={"ID":"438d1cd9-a242-44f3-bd68-209623426af8","Type":"ContainerDied","Data":"eb061ac8fa2ca596fb3d79547e85f80735cfca9a40b113bd9d1665c2e4f068e8"} Dec 03 15:17:59 crc kubenswrapper[4751]: I1203 15:17:59.196375 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66p2v" event={"ID":"438d1cd9-a242-44f3-bd68-209623426af8","Type":"ContainerStarted","Data":"ee26a1f22545fdeeb76c8bc70b9e91402c7db98a79979c44f616929467388848"} Dec 03 15:17:59 crc kubenswrapper[4751]: I1203 15:17:59.314602 4751 scope.go:117] "RemoveContainer" containerID="63ada114a3c0d13c65067734e76c0573932a816604c9a01a3d2887bb9ea85dda" Dec 03 15:17:59 crc kubenswrapper[4751]: E1203 15:17:59.314903 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:18:02 crc kubenswrapper[4751]: I1203 15:18:02.326361 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-66p2v" Dec 03 15:18:02 crc kubenswrapper[4751]: I1203 15:18:02.326694 4751 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-66p2v" Dec 03 15:18:02 crc kubenswrapper[4751]: I1203 15:18:02.379604 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-66p2v" Dec 03 15:18:02 crc kubenswrapper[4751]: I1203 15:18:02.403679 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-66p2v" podStartSLOduration=7.969226743 podStartE2EDuration="11.403656416s" podCreationTimestamp="2025-12-03 15:17:51 +0000 UTC" firstStartedPulling="2025-12-03 15:17:55.131481322 +0000 UTC m=+3882.119836539" lastFinishedPulling="2025-12-03 15:17:58.565910995 +0000 UTC m=+3885.554266212" observedRunningTime="2025-12-03 15:17:59.220851547 +0000 UTC m=+3886.209206794" watchObservedRunningTime="2025-12-03 15:18:02.403656416 +0000 UTC m=+3889.392011633" Dec 03 15:18:03 crc kubenswrapper[4751]: I1203 15:18:03.303841 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-66p2v" Dec 03 15:18:03 crc kubenswrapper[4751]: I1203 15:18:03.361448 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-66p2v"] Dec 03 15:18:05 crc kubenswrapper[4751]: I1203 15:18:05.256544 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-66p2v" podUID="438d1cd9-a242-44f3-bd68-209623426af8" containerName="registry-server" containerID="cri-o://ee26a1f22545fdeeb76c8bc70b9e91402c7db98a79979c44f616929467388848" gracePeriod=2 Dec 03 15:18:06 crc kubenswrapper[4751]: I1203 15:18:06.023670 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-66p2v" Dec 03 15:18:06 crc kubenswrapper[4751]: I1203 15:18:06.122518 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/438d1cd9-a242-44f3-bd68-209623426af8-utilities\") pod \"438d1cd9-a242-44f3-bd68-209623426af8\" (UID: \"438d1cd9-a242-44f3-bd68-209623426af8\") " Dec 03 15:18:06 crc kubenswrapper[4751]: I1203 15:18:06.122670 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxrkx\" (UniqueName: \"kubernetes.io/projected/438d1cd9-a242-44f3-bd68-209623426af8-kube-api-access-mxrkx\") pod \"438d1cd9-a242-44f3-bd68-209623426af8\" (UID: \"438d1cd9-a242-44f3-bd68-209623426af8\") " Dec 03 15:18:06 crc kubenswrapper[4751]: I1203 15:18:06.122710 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/438d1cd9-a242-44f3-bd68-209623426af8-catalog-content\") pod \"438d1cd9-a242-44f3-bd68-209623426af8\" (UID: \"438d1cd9-a242-44f3-bd68-209623426af8\") " Dec 03 15:18:06 crc kubenswrapper[4751]: I1203 15:18:06.123508 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/438d1cd9-a242-44f3-bd68-209623426af8-utilities" (OuterVolumeSpecName: "utilities") pod "438d1cd9-a242-44f3-bd68-209623426af8" (UID: "438d1cd9-a242-44f3-bd68-209623426af8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:18:06 crc kubenswrapper[4751]: I1203 15:18:06.129675 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/438d1cd9-a242-44f3-bd68-209623426af8-kube-api-access-mxrkx" (OuterVolumeSpecName: "kube-api-access-mxrkx") pod "438d1cd9-a242-44f3-bd68-209623426af8" (UID: "438d1cd9-a242-44f3-bd68-209623426af8"). InnerVolumeSpecName "kube-api-access-mxrkx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:18:06 crc kubenswrapper[4751]: I1203 15:18:06.176095 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/438d1cd9-a242-44f3-bd68-209623426af8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "438d1cd9-a242-44f3-bd68-209623426af8" (UID: "438d1cd9-a242-44f3-bd68-209623426af8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:18:06 crc kubenswrapper[4751]: I1203 15:18:06.225237 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxrkx\" (UniqueName: \"kubernetes.io/projected/438d1cd9-a242-44f3-bd68-209623426af8-kube-api-access-mxrkx\") on node \"crc\" DevicePath \"\"" Dec 03 15:18:06 crc kubenswrapper[4751]: I1203 15:18:06.225285 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/438d1cd9-a242-44f3-bd68-209623426af8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 15:18:06 crc kubenswrapper[4751]: I1203 15:18:06.225299 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/438d1cd9-a242-44f3-bd68-209623426af8-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 15:18:06 crc kubenswrapper[4751]: I1203 15:18:06.267746 4751 generic.go:334] "Generic (PLEG): container finished" podID="438d1cd9-a242-44f3-bd68-209623426af8" containerID="ee26a1f22545fdeeb76c8bc70b9e91402c7db98a79979c44f616929467388848" exitCode=0 Dec 03 15:18:06 crc kubenswrapper[4751]: I1203 15:18:06.267796 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66p2v" event={"ID":"438d1cd9-a242-44f3-bd68-209623426af8","Type":"ContainerDied","Data":"ee26a1f22545fdeeb76c8bc70b9e91402c7db98a79979c44f616929467388848"} Dec 03 15:18:06 crc kubenswrapper[4751]: I1203 15:18:06.267827 4751 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-66p2v" event={"ID":"438d1cd9-a242-44f3-bd68-209623426af8","Type":"ContainerDied","Data":"b06814bc6d12b0f18d30e0141c32712454349f35db1aee59a7c9a0ccbe24df48"} Dec 03 15:18:06 crc kubenswrapper[4751]: I1203 15:18:06.267855 4751 scope.go:117] "RemoveContainer" containerID="ee26a1f22545fdeeb76c8bc70b9e91402c7db98a79979c44f616929467388848" Dec 03 15:18:06 crc kubenswrapper[4751]: I1203 15:18:06.268031 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-66p2v" Dec 03 15:18:06 crc kubenswrapper[4751]: I1203 15:18:06.302382 4751 scope.go:117] "RemoveContainer" containerID="eb061ac8fa2ca596fb3d79547e85f80735cfca9a40b113bd9d1665c2e4f068e8" Dec 03 15:18:06 crc kubenswrapper[4751]: I1203 15:18:06.314111 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-66p2v"] Dec 03 15:18:06 crc kubenswrapper[4751]: I1203 15:18:06.328445 4751 scope.go:117] "RemoveContainer" containerID="2a4db3c65c25444bc7a22ee63da702bb443c9b4ba2935805214bcf5f771ff4f6" Dec 03 15:18:06 crc kubenswrapper[4751]: I1203 15:18:06.334043 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-66p2v"] Dec 03 15:18:06 crc kubenswrapper[4751]: I1203 15:18:06.391866 4751 scope.go:117] "RemoveContainer" containerID="ee26a1f22545fdeeb76c8bc70b9e91402c7db98a79979c44f616929467388848" Dec 03 15:18:06 crc kubenswrapper[4751]: E1203 15:18:06.392823 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee26a1f22545fdeeb76c8bc70b9e91402c7db98a79979c44f616929467388848\": container with ID starting with ee26a1f22545fdeeb76c8bc70b9e91402c7db98a79979c44f616929467388848 not found: ID does not exist" containerID="ee26a1f22545fdeeb76c8bc70b9e91402c7db98a79979c44f616929467388848" Dec 03 15:18:06 crc kubenswrapper[4751]: I1203 
15:18:06.392867 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee26a1f22545fdeeb76c8bc70b9e91402c7db98a79979c44f616929467388848"} err="failed to get container status \"ee26a1f22545fdeeb76c8bc70b9e91402c7db98a79979c44f616929467388848\": rpc error: code = NotFound desc = could not find container \"ee26a1f22545fdeeb76c8bc70b9e91402c7db98a79979c44f616929467388848\": container with ID starting with ee26a1f22545fdeeb76c8bc70b9e91402c7db98a79979c44f616929467388848 not found: ID does not exist" Dec 03 15:18:06 crc kubenswrapper[4751]: I1203 15:18:06.392896 4751 scope.go:117] "RemoveContainer" containerID="eb061ac8fa2ca596fb3d79547e85f80735cfca9a40b113bd9d1665c2e4f068e8" Dec 03 15:18:06 crc kubenswrapper[4751]: E1203 15:18:06.393565 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb061ac8fa2ca596fb3d79547e85f80735cfca9a40b113bd9d1665c2e4f068e8\": container with ID starting with eb061ac8fa2ca596fb3d79547e85f80735cfca9a40b113bd9d1665c2e4f068e8 not found: ID does not exist" containerID="eb061ac8fa2ca596fb3d79547e85f80735cfca9a40b113bd9d1665c2e4f068e8" Dec 03 15:18:06 crc kubenswrapper[4751]: I1203 15:18:06.393592 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb061ac8fa2ca596fb3d79547e85f80735cfca9a40b113bd9d1665c2e4f068e8"} err="failed to get container status \"eb061ac8fa2ca596fb3d79547e85f80735cfca9a40b113bd9d1665c2e4f068e8\": rpc error: code = NotFound desc = could not find container \"eb061ac8fa2ca596fb3d79547e85f80735cfca9a40b113bd9d1665c2e4f068e8\": container with ID starting with eb061ac8fa2ca596fb3d79547e85f80735cfca9a40b113bd9d1665c2e4f068e8 not found: ID does not exist" Dec 03 15:18:06 crc kubenswrapper[4751]: I1203 15:18:06.393617 4751 scope.go:117] "RemoveContainer" containerID="2a4db3c65c25444bc7a22ee63da702bb443c9b4ba2935805214bcf5f771ff4f6" Dec 03 15:18:06 crc 
kubenswrapper[4751]: E1203 15:18:06.393989 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a4db3c65c25444bc7a22ee63da702bb443c9b4ba2935805214bcf5f771ff4f6\": container with ID starting with 2a4db3c65c25444bc7a22ee63da702bb443c9b4ba2935805214bcf5f771ff4f6 not found: ID does not exist" containerID="2a4db3c65c25444bc7a22ee63da702bb443c9b4ba2935805214bcf5f771ff4f6" Dec 03 15:18:06 crc kubenswrapper[4751]: I1203 15:18:06.394023 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a4db3c65c25444bc7a22ee63da702bb443c9b4ba2935805214bcf5f771ff4f6"} err="failed to get container status \"2a4db3c65c25444bc7a22ee63da702bb443c9b4ba2935805214bcf5f771ff4f6\": rpc error: code = NotFound desc = could not find container \"2a4db3c65c25444bc7a22ee63da702bb443c9b4ba2935805214bcf5f771ff4f6\": container with ID starting with 2a4db3c65c25444bc7a22ee63da702bb443c9b4ba2935805214bcf5f771ff4f6 not found: ID does not exist" Dec 03 15:18:07 crc kubenswrapper[4751]: I1203 15:18:07.326254 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="438d1cd9-a242-44f3-bd68-209623426af8" path="/var/lib/kubelet/pods/438d1cd9-a242-44f3-bd68-209623426af8/volumes" Dec 03 15:18:11 crc kubenswrapper[4751]: I1203 15:18:11.313967 4751 scope.go:117] "RemoveContainer" containerID="63ada114a3c0d13c65067734e76c0573932a816604c9a01a3d2887bb9ea85dda" Dec 03 15:18:11 crc kubenswrapper[4751]: E1203 15:18:11.314850 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:18:22 crc 
kubenswrapper[4751]: I1203 15:18:22.768798 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_a8a7a20c-88be-4cca-a10d-8ac9a898f090/init-config-reloader/0.log" Dec 03 15:18:22 crc kubenswrapper[4751]: I1203 15:18:22.770819 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_a8a7a20c-88be-4cca-a10d-8ac9a898f090/init-config-reloader/0.log" Dec 03 15:18:23 crc kubenswrapper[4751]: I1203 15:18:23.162519 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_a8a7a20c-88be-4cca-a10d-8ac9a898f090/alertmanager/0.log" Dec 03 15:18:23 crc kubenswrapper[4751]: I1203 15:18:23.170236 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_a8a7a20c-88be-4cca-a10d-8ac9a898f090/config-reloader/0.log" Dec 03 15:18:23 crc kubenswrapper[4751]: I1203 15:18:23.193655 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-b567465d6-ch8tf_1440ca21-e220-4178-b44c-06672479bc7c/barbican-api/0.log" Dec 03 15:18:23 crc kubenswrapper[4751]: I1203 15:18:23.332292 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-b567465d6-ch8tf_1440ca21-e220-4178-b44c-06672479bc7c/barbican-api-log/0.log" Dec 03 15:18:23 crc kubenswrapper[4751]: I1203 15:18:23.472465 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-c469d65bd-rdqn6_58307992-3054-4b05-b7c6-f768c2a1e849/barbican-keystone-listener/0.log" Dec 03 15:18:23 crc kubenswrapper[4751]: I1203 15:18:23.656979 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-c469d65bd-rdqn6_58307992-3054-4b05-b7c6-f768c2a1e849/barbican-keystone-listener-log/0.log" Dec 03 15:18:23 crc kubenswrapper[4751]: I1203 15:18:23.683996 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-5dcd655495-dj2gs_4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f/barbican-worker/0.log" Dec 03 15:18:23 crc kubenswrapper[4751]: I1203 15:18:23.694596 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5dcd655495-dj2gs_4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f/barbican-worker-log/0.log" Dec 03 15:18:23 crc kubenswrapper[4751]: I1203 15:18:23.852485 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k_cd7a02af-abd1-4669-88f3-7e1d1117d8e9/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:18:23 crc kubenswrapper[4751]: I1203 15:18:23.954786 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991/ceilometer-central-agent/1.log" Dec 03 15:18:24 crc kubenswrapper[4751]: I1203 15:18:24.049822 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991/ceilometer-notification-agent/1.log" Dec 03 15:18:24 crc kubenswrapper[4751]: I1203 15:18:24.128004 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991/ceilometer-central-agent/0.log" Dec 03 15:18:24 crc kubenswrapper[4751]: I1203 15:18:24.212413 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991/ceilometer-notification-agent/0.log" Dec 03 15:18:24 crc kubenswrapper[4751]: I1203 15:18:24.227895 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991/proxy-httpd/0.log" Dec 03 15:18:24 crc kubenswrapper[4751]: I1203 15:18:24.414341 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991/sg-core/0.log" Dec 03 15:18:24 crc kubenswrapper[4751]: I1203 15:18:24.525670 4751 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_afa6e4a5-811b-43db-868b-66a71bff4830/cinder-api-log/0.log" Dec 03 15:18:24 crc kubenswrapper[4751]: I1203 15:18:24.572847 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_afa6e4a5-811b-43db-868b-66a71bff4830/cinder-api/0.log" Dec 03 15:18:24 crc kubenswrapper[4751]: I1203 15:18:24.731040 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_28a23243-c107-4c51-96f0-82db8946b245/cinder-scheduler/0.log" Dec 03 15:18:24 crc kubenswrapper[4751]: I1203 15:18:24.809641 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_28a23243-c107-4c51-96f0-82db8946b245/probe/0.log" Dec 03 15:18:24 crc kubenswrapper[4751]: I1203 15:18:24.990062 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_b5b84593-a4d7-4b1c-843a-feb9273afbf4/cloudkitty-api/0.log" Dec 03 15:18:25 crc kubenswrapper[4751]: I1203 15:18:25.046541 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_b5b84593-a4d7-4b1c-843a-feb9273afbf4/cloudkitty-api-log/0.log" Dec 03 15:18:25 crc kubenswrapper[4751]: I1203 15:18:25.103200 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-compactor-0_34351ff3-ea5e-403c-9d04-ca6777287cff/loki-compactor/0.log" Dec 03 15:18:25 crc kubenswrapper[4751]: I1203 15:18:25.307986 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-distributor-56cd74f89f-xg9ml_e2d4448e-9181-494b-bec0-12da338b184d/loki-distributor/0.log" Dec 03 15:18:25 crc kubenswrapper[4751]: I1203 15:18:25.316001 4751 scope.go:117] "RemoveContainer" containerID="63ada114a3c0d13c65067734e76c0573932a816604c9a01a3d2887bb9ea85dda" Dec 03 15:18:25 crc kubenswrapper[4751]: E1203 15:18:25.316470 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:18:25 crc kubenswrapper[4751]: I1203 15:18:25.368036 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-76cc998948-8d88b_5a964492-a736-427e-b81a-d6d863d0eaaf/gateway/0.log" Dec 03 15:18:25 crc kubenswrapper[4751]: I1203 15:18:25.693843 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-76cc998948-bl8ws_c1c24fdf-0c9e-458f-9803-87e9d6c3161f/gateway/0.log" Dec 03 15:18:25 crc kubenswrapper[4751]: I1203 15:18:25.751017 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-index-gateway-0_85623735-2d6a-4d53-ac14-e4cd714ecc7b/loki-index-gateway/0.log" Dec 03 15:18:26 crc kubenswrapper[4751]: I1203 15:18:26.192574 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-query-frontend-779849886d-r2j44_ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa/loki-query-frontend/0.log" Dec 03 15:18:26 crc kubenswrapper[4751]: I1203 15:18:26.379437 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-ingester-0_04053d51-dddf-43e3-a230-9ac729dec435/loki-ingester/0.log" Dec 03 15:18:26 crc kubenswrapper[4751]: I1203 15:18:26.606195 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-6wwg6_d5de9d69-621e-4336-bd1d-e29c27d29430/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:18:26 crc kubenswrapper[4751]: I1203 15:18:26.867635 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-j2qfs_233a8db3-fc65-4c75-81d4-552f44ee95c2/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:18:27 crc kubenswrapper[4751]: I1203 15:18:27.021348 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-c4b758ff5-hjq5l_941e6cf3-002b-476c-8347-dfc11a32b067/init/0.log" Dec 03 15:18:27 crc kubenswrapper[4751]: I1203 15:18:27.172507 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-c4b758ff5-hjq5l_941e6cf3-002b-476c-8347-dfc11a32b067/init/0.log" Dec 03 15:18:27 crc kubenswrapper[4751]: I1203 15:18:27.300686 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-querier-548665d79b-8226l_4797e85e-ad67-454b-b210-25f5481780c5/loki-querier/0.log" Dec 03 15:18:27 crc kubenswrapper[4751]: I1203 15:18:27.358295 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-c4b758ff5-hjq5l_941e6cf3-002b-476c-8347-dfc11a32b067/dnsmasq-dns/0.log" Dec 03 15:18:27 crc kubenswrapper[4751]: I1203 15:18:27.419065 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-rt2j2_4f6373bc-f6a3-478f-92f5-8e311a5fd86c/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:18:27 crc kubenswrapper[4751]: I1203 15:18:27.590314 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ac247305-666d-4241-b756-88499fd359ad/glance-httpd/0.log" Dec 03 15:18:27 crc kubenswrapper[4751]: I1203 15:18:27.730872 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ac247305-666d-4241-b756-88499fd359ad/glance-log/0.log" Dec 03 15:18:27 crc kubenswrapper[4751]: I1203 15:18:27.980168 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_15b1a633-766f-41f8-b9e8-22acc97bf4c8/glance-httpd/0.log" Dec 03 15:18:27 crc kubenswrapper[4751]: I1203 15:18:27.995730 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_15b1a633-766f-41f8-b9e8-22acc97bf4c8/glance-log/0.log" Dec 03 15:18:28 crc kubenswrapper[4751]: I1203 15:18:28.233022 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg_c1564210-8ace-4588-a706-3c7583ea0568/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:18:28 crc kubenswrapper[4751]: I1203 15:18:28.247794 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-blkvd_8eeba461-aadb-44d9-ac60-9413a2c70e6d/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:18:28 crc kubenswrapper[4751]: I1203 15:18:28.537039 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29412901-h5p7l_bab120ba-b67e-46bf-9d23-359d3119b904/keystone-cron/0.log" Dec 03 15:18:28 crc kubenswrapper[4751]: I1203 15:18:28.790480 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_b1ba2fcd-b1a4-42c8-a3a1-f84f3e198de2/kube-state-metrics/0.log" Dec 03 15:18:28 crc kubenswrapper[4751]: I1203 15:18:28.945016 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7f9cb9cd-blmhg_491c0713-5024-484d-921d-387200cb08b2/keystone-api/0.log" Dec 03 15:18:29 crc kubenswrapper[4751]: I1203 15:18:29.069544 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn_607ac64e-604b-407d-9939-b8f2ba0832c5/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:18:29 crc kubenswrapper[4751]: I1203 15:18:29.510741 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-7bc6669df7-xxpxz_97f090c9-1ba2-45b8-9f01-c8372381b095/neutron-httpd/0.log" Dec 03 15:18:29 crc kubenswrapper[4751]: I1203 15:18:29.556457 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7bc6669df7-xxpxz_97f090c9-1ba2-45b8-9f01-c8372381b095/neutron-api/0.log" Dec 03 15:18:30 crc kubenswrapper[4751]: I1203 15:18:30.325410 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll_521c9f69-c59e-4b93-a1a2-ab687b7ee6eb/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:18:30 crc kubenswrapper[4751]: I1203 15:18:30.967227 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d/nova-api-api/0.log" Dec 03 15:18:30 crc kubenswrapper[4751]: I1203 15:18:30.983743 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_bac48c68-2c8f-47ff-8f11-7974913dbac1/nova-cell0-conductor-conductor/0.log" Dec 03 15:18:30 crc kubenswrapper[4751]: I1203 15:18:30.993350 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d/nova-api-log/0.log" Dec 03 15:18:31 crc kubenswrapper[4751]: I1203 15:18:31.289864 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-proc-0_9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a/cloudkitty-proc/0.log" Dec 03 15:18:31 crc kubenswrapper[4751]: I1203 15:18:31.475845 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_730eedc8-ac64-4f53-80d0-ec824459f08c/nova-cell1-conductor-conductor/0.log" Dec 03 15:18:32 crc kubenswrapper[4751]: I1203 15:18:32.215144 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-vgvng_a2deeaad-edf9-4d9c-b116-9a31587b1b2a/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 
15:18:32 crc kubenswrapper[4751]: I1203 15:18:32.355709 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_fa35ea33-1dc0-4569-9052-36e722f491c1/nova-cell1-novncproxy-novncproxy/0.log" Dec 03 15:18:32 crc kubenswrapper[4751]: I1203 15:18:32.550271 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_47692780-6643-491b-8d92-c181c82d4ce6/nova-metadata-log/0.log" Dec 03 15:18:32 crc kubenswrapper[4751]: I1203 15:18:32.798737 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_dbe3db84-a6ac-4b03-999b-1d2663641afa/nova-scheduler-scheduler/0.log" Dec 03 15:18:32 crc kubenswrapper[4751]: I1203 15:18:32.888777 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_3dc63449-cac9-48bc-abb7-3ff350a408cf/mysql-bootstrap/0.log" Dec 03 15:18:33 crc kubenswrapper[4751]: I1203 15:18:33.161476 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_3dc63449-cac9-48bc-abb7-3ff350a408cf/mysql-bootstrap/0.log" Dec 03 15:18:33 crc kubenswrapper[4751]: I1203 15:18:33.164463 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_3dc63449-cac9-48bc-abb7-3ff350a408cf/galera/0.log" Dec 03 15:18:33 crc kubenswrapper[4751]: I1203 15:18:33.492521 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a45965be-01f0-4c6d-9db8-08b5e5564c5a/mysql-bootstrap/0.log" Dec 03 15:18:33 crc kubenswrapper[4751]: I1203 15:18:33.629711 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a45965be-01f0-4c6d-9db8-08b5e5564c5a/mysql-bootstrap/0.log" Dec 03 15:18:33 crc kubenswrapper[4751]: I1203 15:18:33.635316 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a45965be-01f0-4c6d-9db8-08b5e5564c5a/galera/0.log" Dec 03 15:18:33 crc 
kubenswrapper[4751]: I1203 15:18:33.882862 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_47692780-6643-491b-8d92-c181c82d4ce6/nova-metadata-metadata/0.log" Dec 03 15:18:33 crc kubenswrapper[4751]: I1203 15:18:33.937957 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_033a5c7c-11ef-4610-ac41-aa8471a9f0b4/openstackclient/0.log" Dec 03 15:18:34 crc kubenswrapper[4751]: I1203 15:18:34.106906 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-lqzrd_7ab1fa90-b8eb-405d-803d-b9fd84939289/ovn-controller/0.log" Dec 03 15:18:34 crc kubenswrapper[4751]: I1203 15:18:34.203769 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-98hst_aa72a067-0544-4a0c-8750-c3d76221d4f2/openstack-network-exporter/0.log" Dec 03 15:18:34 crc kubenswrapper[4751]: I1203 15:18:34.321236 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dzz9c_3faee7be-8b53-42b6-90fd-ba62998f9ced/ovsdb-server-init/0.log" Dec 03 15:18:34 crc kubenswrapper[4751]: I1203 15:18:34.596785 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dzz9c_3faee7be-8b53-42b6-90fd-ba62998f9ced/ovsdb-server-init/0.log" Dec 03 15:18:34 crc kubenswrapper[4751]: I1203 15:18:34.706087 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dzz9c_3faee7be-8b53-42b6-90fd-ba62998f9ced/ovs-vswitchd/0.log" Dec 03 15:18:34 crc kubenswrapper[4751]: I1203 15:18:34.731243 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dzz9c_3faee7be-8b53-42b6-90fd-ba62998f9ced/ovsdb-server/0.log" Dec 03 15:18:34 crc kubenswrapper[4751]: I1203 15:18:34.899900 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-h82lf_5d9c6feb-6018-476a-b029-e4df05b4566d/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:18:35 crc kubenswrapper[4751]: I1203 15:18:35.071789 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e7ffebbc-a033-4a04-a133-d90456a57881/ovn-northd/0.log" Dec 03 15:18:35 crc kubenswrapper[4751]: I1203 15:18:35.072080 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e7ffebbc-a033-4a04-a133-d90456a57881/openstack-network-exporter/0.log" Dec 03 15:18:35 crc kubenswrapper[4751]: I1203 15:18:35.289794 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b/ovsdbserver-nb/0.log" Dec 03 15:18:35 crc kubenswrapper[4751]: I1203 15:18:35.400783 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b/openstack-network-exporter/0.log" Dec 03 15:18:35 crc kubenswrapper[4751]: I1203 15:18:35.566694 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_0046f111-cf94-402b-8981-659978aace04/openstack-network-exporter/0.log" Dec 03 15:18:35 crc kubenswrapper[4751]: I1203 15:18:35.574170 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_0046f111-cf94-402b-8981-659978aace04/ovsdbserver-sb/0.log" Dec 03 15:18:35 crc kubenswrapper[4751]: I1203 15:18:35.855435 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-cfc86c59b-x4m2l_bfdc9703-a6a9-4a1d-81f4-852aa9167a17/placement-api/0.log" Dec 03 15:18:36 crc kubenswrapper[4751]: I1203 15:18:36.011501 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b40b7285-42c6-4278-8d86-69847e549907/init-config-reloader/0.log" Dec 03 15:18:36 crc kubenswrapper[4751]: I1203 15:18:36.061174 4751 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_placement-cfc86c59b-x4m2l_bfdc9703-a6a9-4a1d-81f4-852aa9167a17/placement-log/0.log" Dec 03 15:18:36 crc kubenswrapper[4751]: I1203 15:18:36.330908 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b40b7285-42c6-4278-8d86-69847e549907/init-config-reloader/0.log" Dec 03 15:18:36 crc kubenswrapper[4751]: I1203 15:18:36.342088 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b40b7285-42c6-4278-8d86-69847e549907/config-reloader/0.log" Dec 03 15:18:36 crc kubenswrapper[4751]: I1203 15:18:36.409669 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b40b7285-42c6-4278-8d86-69847e549907/prometheus/1.log" Dec 03 15:18:36 crc kubenswrapper[4751]: I1203 15:18:36.412109 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b40b7285-42c6-4278-8d86-69847e549907/prometheus/0.log" Dec 03 15:18:36 crc kubenswrapper[4751]: I1203 15:18:36.580081 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b40b7285-42c6-4278-8d86-69847e549907/thanos-sidecar/0.log" Dec 03 15:18:36 crc kubenswrapper[4751]: I1203 15:18:36.711748 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_d5fd5425-70e4-4a79-8ea7-3326cae3908d/setup-container/0.log" Dec 03 15:18:36 crc kubenswrapper[4751]: I1203 15:18:36.903167 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_d5fd5425-70e4-4a79-8ea7-3326cae3908d/rabbitmq/0.log" Dec 03 15:18:36 crc kubenswrapper[4751]: I1203 15:18:36.984599 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_d5fd5425-70e4-4a79-8ea7-3326cae3908d/setup-container/0.log" Dec 03 15:18:37 crc kubenswrapper[4751]: I1203 15:18:37.034754 4751 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4760c776-9212-42af-8bf2-928c79417922/setup-container/0.log" Dec 03 15:18:37 crc kubenswrapper[4751]: I1203 15:18:37.291759 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4760c776-9212-42af-8bf2-928c79417922/setup-container/0.log" Dec 03 15:18:37 crc kubenswrapper[4751]: I1203 15:18:37.297991 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4760c776-9212-42af-8bf2-928c79417922/rabbitmq/0.log" Dec 03 15:18:37 crc kubenswrapper[4751]: I1203 15:18:37.480826 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-8cqwk_a08bb04e-0d05-4153-ab50-9fde15bb421b/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:18:37 crc kubenswrapper[4751]: I1203 15:18:37.590402 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-w96tn_228ac9f7-0635-4a38-8d51-038e9a588a7d/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:18:37 crc kubenswrapper[4751]: I1203 15:18:37.735264 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt_f38ae118-11a0-4c72-9d0b-750762779ee7/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:18:37 crc kubenswrapper[4751]: I1203 15:18:37.900415 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-2gj64_478701d4-170a-4043-97d6-6b54b753a72a/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:18:37 crc kubenswrapper[4751]: I1203 15:18:37.981993 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-6xdd9_b8584e24-f4eb-400e-a73c-610ba6fe3a41/ssh-known-hosts-edpm-deployment/0.log" Dec 03 15:18:38 crc kubenswrapper[4751]: I1203 15:18:38.307246 4751 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-84ff798d87-5c96l_9c7e0fc7-03ed-4002-b460-df87d151f563/proxy-httpd/0.log" Dec 03 15:18:38 crc kubenswrapper[4751]: I1203 15:18:38.336365 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-84ff798d87-5c96l_9c7e0fc7-03ed-4002-b460-df87d151f563/proxy-server/0.log" Dec 03 15:18:38 crc kubenswrapper[4751]: I1203 15:18:38.433670 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-68rzl_0fedaa81-0c36-44fa-ab7b-b712759fc8d4/swift-ring-rebalance/0.log" Dec 03 15:18:38 crc kubenswrapper[4751]: I1203 15:18:38.624484 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2/account-auditor/0.log" Dec 03 15:18:38 crc kubenswrapper[4751]: I1203 15:18:38.728442 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2/account-reaper/0.log" Dec 03 15:18:38 crc kubenswrapper[4751]: I1203 15:18:38.814623 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2/account-replicator/0.log" Dec 03 15:18:38 crc kubenswrapper[4751]: I1203 15:18:38.904312 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2/account-server/0.log" Dec 03 15:18:38 crc kubenswrapper[4751]: I1203 15:18:38.957106 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2/container-auditor/0.log" Dec 03 15:18:39 crc kubenswrapper[4751]: I1203 15:18:39.016400 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2/container-replicator/0.log" Dec 03 15:18:39 crc kubenswrapper[4751]: I1203 15:18:39.073072 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2/container-server/0.log" Dec 03 15:18:39 crc kubenswrapper[4751]: I1203 15:18:39.168809 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2/container-updater/0.log" Dec 03 15:18:39 crc kubenswrapper[4751]: I1203 15:18:39.220316 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2/object-auditor/0.log" Dec 03 15:18:39 crc kubenswrapper[4751]: I1203 15:18:39.300910 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2/object-replicator/0.log" Dec 03 15:18:39 crc kubenswrapper[4751]: I1203 15:18:39.371870 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2/object-expirer/0.log" Dec 03 15:18:39 crc kubenswrapper[4751]: I1203 15:18:39.418846 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2/object-server/0.log" Dec 03 15:18:39 crc kubenswrapper[4751]: I1203 15:18:39.452650 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2/object-updater/0.log" Dec 03 15:18:40 crc kubenswrapper[4751]: I1203 15:18:40.198796 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2/rsync/0.log" Dec 03 15:18:40 crc kubenswrapper[4751]: I1203 15:18:40.201767 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2/swift-recon-cron/0.log" Dec 03 15:18:40 crc kubenswrapper[4751]: I1203 15:18:40.313677 4751 scope.go:117] "RemoveContainer" containerID="63ada114a3c0d13c65067734e76c0573932a816604c9a01a3d2887bb9ea85dda" Dec 03 15:18:40 crc 
kubenswrapper[4751]: E1203 15:18:40.314188 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:18:40 crc kubenswrapper[4751]: I1203 15:18:40.403442 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l_052552e4-436a-4f3e-a7cd-cacb72ff4f16/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:18:40 crc kubenswrapper[4751]: I1203 15:18:40.457169 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_aa32bbce-059c-46e3-a8d7-f737d93e394e/tempest-tests-tempest-tests-runner/0.log" Dec 03 15:18:40 crc kubenswrapper[4751]: I1203 15:18:40.652113 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_14cd5bf7-18fc-450b-b4b1-f87bf154efeb/test-operator-logs-container/0.log" Dec 03 15:18:40 crc kubenswrapper[4751]: I1203 15:18:40.925250 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-g2m8z_ae9b241a-73d5-4f1c-b14d-f7b44cc008f1/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:18:47 crc kubenswrapper[4751]: I1203 15:18:47.016543 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_05d18e1b-04cd-4b4a-a728-bdbc9c2ab713/memcached/0.log" Dec 03 15:18:53 crc kubenswrapper[4751]: I1203 15:18:53.323189 4751 scope.go:117] "RemoveContainer" containerID="63ada114a3c0d13c65067734e76c0573932a816604c9a01a3d2887bb9ea85dda" Dec 03 15:18:53 crc kubenswrapper[4751]: E1203 15:18:53.324150 4751 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:19:07 crc kubenswrapper[4751]: I1203 15:19:07.314884 4751 scope.go:117] "RemoveContainer" containerID="63ada114a3c0d13c65067734e76c0573932a816604c9a01a3d2887bb9ea85dda" Dec 03 15:19:07 crc kubenswrapper[4751]: E1203 15:19:07.315798 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:19:16 crc kubenswrapper[4751]: I1203 15:19:16.540031 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg_86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0/util/0.log" Dec 03 15:19:16 crc kubenswrapper[4751]: I1203 15:19:16.883431 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg_86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0/pull/0.log" Dec 03 15:19:16 crc kubenswrapper[4751]: I1203 15:19:16.979204 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg_86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0/pull/0.log" Dec 03 15:19:17 crc kubenswrapper[4751]: I1203 15:19:17.005863 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg_86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0/util/0.log" Dec 03 15:19:17 crc kubenswrapper[4751]: I1203 15:19:17.286487 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg_86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0/pull/0.log" Dec 03 15:19:17 crc kubenswrapper[4751]: I1203 15:19:17.293442 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg_86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0/util/0.log" Dec 03 15:19:17 crc kubenswrapper[4751]: I1203 15:19:17.321724 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg_86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0/extract/0.log" Dec 03 15:19:17 crc kubenswrapper[4751]: I1203 15:19:17.498492 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-qswcx_26689286-a791-485e-b442-9e399ae7a79b/kube-rbac-proxy/0.log" Dec 03 15:19:17 crc kubenswrapper[4751]: I1203 15:19:17.521074 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-qswcx_26689286-a791-485e-b442-9e399ae7a79b/manager/0.log" Dec 03 15:19:17 crc kubenswrapper[4751]: I1203 15:19:17.682973 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-7422h_b112bf8e-175b-4bc3-9840-6d134b4a1bce/kube-rbac-proxy/0.log" Dec 03 15:19:17 crc kubenswrapper[4751]: I1203 15:19:17.876896 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-7422h_b112bf8e-175b-4bc3-9840-6d134b4a1bce/manager/0.log" Dec 03 15:19:18 crc kubenswrapper[4751]: 
I1203 15:19:18.063926 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-n8td8_618af04c-a37d-4d21-bdba-345c9a63be07/kube-rbac-proxy/0.log" Dec 03 15:19:18 crc kubenswrapper[4751]: I1203 15:19:18.073757 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-n8td8_618af04c-a37d-4d21-bdba-345c9a63be07/manager/0.log" Dec 03 15:19:18 crc kubenswrapper[4751]: I1203 15:19:18.481666 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-6shnx_a975003d-b7d2-4a95-8571-571bc082021d/kube-rbac-proxy/0.log" Dec 03 15:19:18 crc kubenswrapper[4751]: I1203 15:19:18.665883 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-6shnx_a975003d-b7d2-4a95-8571-571bc082021d/manager/0.log" Dec 03 15:19:18 crc kubenswrapper[4751]: I1203 15:19:18.736816 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-m4q9z_ff012f7f-3431-472a-8b44-1fa7a47e74e1/kube-rbac-proxy/0.log" Dec 03 15:19:18 crc kubenswrapper[4751]: I1203 15:19:18.791748 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-m4q9z_ff012f7f-3431-472a-8b44-1fa7a47e74e1/manager/0.log" Dec 03 15:19:19 crc kubenswrapper[4751]: I1203 15:19:19.008032 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-8hhdk_b5d6b394-fe97-4e70-9916-9c6791379931/kube-rbac-proxy/0.log" Dec 03 15:19:19 crc kubenswrapper[4751]: I1203 15:19:19.044003 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-8hhdk_b5d6b394-fe97-4e70-9916-9c6791379931/manager/1.log" Dec 03 15:19:19 crc kubenswrapper[4751]: I1203 15:19:19.140085 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-8hhdk_b5d6b394-fe97-4e70-9916-9c6791379931/manager/0.log" Dec 03 15:19:19 crc kubenswrapper[4751]: I1203 15:19:19.314063 4751 scope.go:117] "RemoveContainer" containerID="63ada114a3c0d13c65067734e76c0573932a816604c9a01a3d2887bb9ea85dda" Dec 03 15:19:19 crc kubenswrapper[4751]: E1203 15:19:19.314400 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:19:19 crc kubenswrapper[4751]: I1203 15:19:19.378815 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-ppb75_a54985ea-4d23-4a65-bd1a-1c9d059ea206/kube-rbac-proxy/0.log" Dec 03 15:19:19 crc kubenswrapper[4751]: I1203 15:19:19.519635 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-ppb75_a54985ea-4d23-4a65-bd1a-1c9d059ea206/manager/0.log" Dec 03 15:19:19 crc kubenswrapper[4751]: I1203 15:19:19.535976 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-pxjk5_2f31e262-8f03-4689-bc29-5d9d8b33a2cc/kube-rbac-proxy/0.log" Dec 03 15:19:19 crc kubenswrapper[4751]: I1203 15:19:19.888286 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-pxjk5_2f31e262-8f03-4689-bc29-5d9d8b33a2cc/manager/1.log" Dec 03 15:19:20 crc kubenswrapper[4751]: I1203 15:19:20.042141 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-pxjk5_2f31e262-8f03-4689-bc29-5d9d8b33a2cc/manager/0.log" Dec 03 15:19:20 crc kubenswrapper[4751]: I1203 15:19:20.049303 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-wgjr8_7f29786e-1f3c-4c92-81ac-4b6110cf03a3/kube-rbac-proxy/0.log" Dec 03 15:19:20 crc kubenswrapper[4751]: I1203 15:19:20.307113 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-wgjr8_7f29786e-1f3c-4c92-81ac-4b6110cf03a3/manager/0.log" Dec 03 15:19:20 crc kubenswrapper[4751]: I1203 15:19:20.449513 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-zjnwr_8adbadf1-f21d-4a09-acf7-d44a87bee356/kube-rbac-proxy/0.log" Dec 03 15:19:20 crc kubenswrapper[4751]: I1203 15:19:20.535519 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-zjnwr_8adbadf1-f21d-4a09-acf7-d44a87bee356/manager/0.log" Dec 03 15:19:21 crc kubenswrapper[4751]: I1203 15:19:21.485774 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-4nvpk_b4cb50e3-a93e-49b0-ac9c-6551046dc0be/manager/0.log" Dec 03 15:19:21 crc kubenswrapper[4751]: I1203 15:19:21.565351 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-4nvpk_b4cb50e3-a93e-49b0-ac9c-6551046dc0be/kube-rbac-proxy/0.log" Dec 03 15:19:21 crc kubenswrapper[4751]: I1203 15:19:21.610790 4751 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-sxl98_4cd34243-8404-4cf7-9185-c012700b5814/kube-rbac-proxy/0.log" Dec 03 15:19:21 crc kubenswrapper[4751]: I1203 15:19:21.749898 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-sxl98_4cd34243-8404-4cf7-9185-c012700b5814/manager/0.log" Dec 03 15:19:21 crc kubenswrapper[4751]: I1203 15:19:21.824044 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-7tsfx_cbafd52a-d603-4b8c-a056-9a2a749bee21/kube-rbac-proxy/0.log" Dec 03 15:19:21 crc kubenswrapper[4751]: I1203 15:19:21.888138 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-7tsfx_cbafd52a-d603-4b8c-a056-9a2a749bee21/manager/0.log" Dec 03 15:19:22 crc kubenswrapper[4751]: I1203 15:19:22.111566 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-8h62v_5c3add92-6cee-4980-903f-692cfd4cf87c/kube-rbac-proxy/0.log" Dec 03 15:19:22 crc kubenswrapper[4751]: I1203 15:19:22.316378 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b_17b09c23-21ca-4060-840d-acbf71e22d55/kube-rbac-proxy/0.log" Dec 03 15:19:22 crc kubenswrapper[4751]: I1203 15:19:22.500412 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b_17b09c23-21ca-4060-840d-acbf71e22d55/manager/1.log" Dec 03 15:19:22 crc kubenswrapper[4751]: I1203 15:19:22.853108 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-8h62v_5c3add92-6cee-4980-903f-692cfd4cf87c/manager/1.log" Dec 
03 15:19:22 crc kubenswrapper[4751]: I1203 15:19:22.853507 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b_17b09c23-21ca-4060-840d-acbf71e22d55/manager/0.log" Dec 03 15:19:23 crc kubenswrapper[4751]: I1203 15:19:23.004806 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-8h62v_5c3add92-6cee-4980-903f-692cfd4cf87c/manager/0.log" Dec 03 15:19:23 crc kubenswrapper[4751]: I1203 15:19:23.723024 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-698cb7586c-qft9p_6aeb43b5-b817-4d39-81de-bc6f27afb55b/operator/0.log" Dec 03 15:19:24 crc kubenswrapper[4751]: I1203 15:19:24.517091 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-jqt9t_8a1c208b-28d2-4d51-a98e-ffece8c3d11e/registry-server/0.log" Dec 03 15:19:24 crc kubenswrapper[4751]: I1203 15:19:24.598868 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-k5j8b_75a825ba-e08d-440f-866d-d32d2ae812f1/kube-rbac-proxy/0.log" Dec 03 15:19:24 crc kubenswrapper[4751]: I1203 15:19:24.732062 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-k5j8b_75a825ba-e08d-440f-866d-d32d2ae812f1/manager/0.log" Dec 03 15:19:24 crc kubenswrapper[4751]: I1203 15:19:24.873138 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7f9b9ccb84-v5t4x_7a4eb3e2-25fa-43e4-9e49-135b5c087014/manager/0.log" Dec 03 15:19:24 crc kubenswrapper[4751]: I1203 15:19:24.891749 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-gsp46_5434f233-b204-4db9-a93d-93d4342e4514/kube-rbac-proxy/0.log" Dec 03 15:19:25 crc kubenswrapper[4751]: I1203 15:19:25.018909 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-gsp46_5434f233-b204-4db9-a93d-93d4342e4514/manager/0.log" Dec 03 15:19:25 crc kubenswrapper[4751]: I1203 15:19:25.144601 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-4thwz_ac5fb8ca-3372-4c92-a4d2-9ff4b543f94d/operator/0.log" Dec 03 15:19:25 crc kubenswrapper[4751]: I1203 15:19:25.357735 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-w9fcj_81e287a7-6973-4561-a67a-a8783b0cedf5/kube-rbac-proxy/0.log" Dec 03 15:19:25 crc kubenswrapper[4751]: I1203 15:19:25.422967 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-w9fcj_81e287a7-6973-4561-a67a-a8783b0cedf5/manager/0.log" Dec 03 15:19:25 crc kubenswrapper[4751]: I1203 15:19:25.664439 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-9598fff97-l6gxb_2011fe70-e44a-4b63-8064-e3234a639fb8/kube-rbac-proxy/0.log" Dec 03 15:19:25 crc kubenswrapper[4751]: I1203 15:19:25.937391 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-lgjvh_adff5e75-192d-4a27-a477-aa74dab8dd95/kube-rbac-proxy/0.log" Dec 03 15:19:26 crc kubenswrapper[4751]: I1203 15:19:26.060993 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-lgjvh_adff5e75-192d-4a27-a477-aa74dab8dd95/manager/0.log" Dec 03 15:19:26 crc kubenswrapper[4751]: I1203 15:19:26.082273 
4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-lgjvh_adff5e75-192d-4a27-a477-aa74dab8dd95/manager/1.log" Dec 03 15:19:26 crc kubenswrapper[4751]: I1203 15:19:26.199099 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-9598fff97-l6gxb_2011fe70-e44a-4b63-8064-e3234a639fb8/manager/0.log" Dec 03 15:19:26 crc kubenswrapper[4751]: I1203 15:19:26.266877 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-qgvqn_5950fcf6-2983-4341-ba48-12c27801a57e/kube-rbac-proxy/0.log" Dec 03 15:19:26 crc kubenswrapper[4751]: I1203 15:19:26.316706 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-qgvqn_5950fcf6-2983-4341-ba48-12c27801a57e/manager/0.log" Dec 03 15:19:28 crc kubenswrapper[4751]: I1203 15:19:28.362607 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f9w22"] Dec 03 15:19:28 crc kubenswrapper[4751]: E1203 15:19:28.364006 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="438d1cd9-a242-44f3-bd68-209623426af8" containerName="extract-content" Dec 03 15:19:28 crc kubenswrapper[4751]: I1203 15:19:28.364026 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="438d1cd9-a242-44f3-bd68-209623426af8" containerName="extract-content" Dec 03 15:19:28 crc kubenswrapper[4751]: E1203 15:19:28.364052 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="438d1cd9-a242-44f3-bd68-209623426af8" containerName="registry-server" Dec 03 15:19:28 crc kubenswrapper[4751]: I1203 15:19:28.364060 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="438d1cd9-a242-44f3-bd68-209623426af8" containerName="registry-server" Dec 03 15:19:28 crc kubenswrapper[4751]: E1203 15:19:28.364115 4751 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="438d1cd9-a242-44f3-bd68-209623426af8" containerName="extract-utilities" Dec 03 15:19:28 crc kubenswrapper[4751]: I1203 15:19:28.364127 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="438d1cd9-a242-44f3-bd68-209623426af8" containerName="extract-utilities" Dec 03 15:19:28 crc kubenswrapper[4751]: I1203 15:19:28.364427 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="438d1cd9-a242-44f3-bd68-209623426af8" containerName="registry-server" Dec 03 15:19:28 crc kubenswrapper[4751]: I1203 15:19:28.366783 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f9w22" Dec 03 15:19:28 crc kubenswrapper[4751]: I1203 15:19:28.375810 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f9w22"] Dec 03 15:19:28 crc kubenswrapper[4751]: I1203 15:19:28.399819 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69079e90-e430-4f1c-837d-e178404465f5-utilities\") pod \"community-operators-f9w22\" (UID: \"69079e90-e430-4f1c-837d-e178404465f5\") " pod="openshift-marketplace/community-operators-f9w22" Dec 03 15:19:28 crc kubenswrapper[4751]: I1203 15:19:28.399982 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69079e90-e430-4f1c-837d-e178404465f5-catalog-content\") pod \"community-operators-f9w22\" (UID: \"69079e90-e430-4f1c-837d-e178404465f5\") " pod="openshift-marketplace/community-operators-f9w22" Dec 03 15:19:28 crc kubenswrapper[4751]: I1203 15:19:28.400054 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2s4k\" (UniqueName: 
\"kubernetes.io/projected/69079e90-e430-4f1c-837d-e178404465f5-kube-api-access-s2s4k\") pod \"community-operators-f9w22\" (UID: \"69079e90-e430-4f1c-837d-e178404465f5\") " pod="openshift-marketplace/community-operators-f9w22" Dec 03 15:19:28 crc kubenswrapper[4751]: I1203 15:19:28.501753 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69079e90-e430-4f1c-837d-e178404465f5-utilities\") pod \"community-operators-f9w22\" (UID: \"69079e90-e430-4f1c-837d-e178404465f5\") " pod="openshift-marketplace/community-operators-f9w22" Dec 03 15:19:28 crc kubenswrapper[4751]: I1203 15:19:28.501859 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69079e90-e430-4f1c-837d-e178404465f5-catalog-content\") pod \"community-operators-f9w22\" (UID: \"69079e90-e430-4f1c-837d-e178404465f5\") " pod="openshift-marketplace/community-operators-f9w22" Dec 03 15:19:28 crc kubenswrapper[4751]: I1203 15:19:28.501910 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2s4k\" (UniqueName: \"kubernetes.io/projected/69079e90-e430-4f1c-837d-e178404465f5-kube-api-access-s2s4k\") pod \"community-operators-f9w22\" (UID: \"69079e90-e430-4f1c-837d-e178404465f5\") " pod="openshift-marketplace/community-operators-f9w22" Dec 03 15:19:28 crc kubenswrapper[4751]: I1203 15:19:28.502251 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69079e90-e430-4f1c-837d-e178404465f5-utilities\") pod \"community-operators-f9w22\" (UID: \"69079e90-e430-4f1c-837d-e178404465f5\") " pod="openshift-marketplace/community-operators-f9w22" Dec 03 15:19:28 crc kubenswrapper[4751]: I1203 15:19:28.502458 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/69079e90-e430-4f1c-837d-e178404465f5-catalog-content\") pod \"community-operators-f9w22\" (UID: \"69079e90-e430-4f1c-837d-e178404465f5\") " pod="openshift-marketplace/community-operators-f9w22" Dec 03 15:19:28 crc kubenswrapper[4751]: I1203 15:19:28.523183 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2s4k\" (UniqueName: \"kubernetes.io/projected/69079e90-e430-4f1c-837d-e178404465f5-kube-api-access-s2s4k\") pod \"community-operators-f9w22\" (UID: \"69079e90-e430-4f1c-837d-e178404465f5\") " pod="openshift-marketplace/community-operators-f9w22" Dec 03 15:19:28 crc kubenswrapper[4751]: I1203 15:19:28.689490 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f9w22" Dec 03 15:19:29 crc kubenswrapper[4751]: W1203 15:19:29.255388 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69079e90_e430_4f1c_837d_e178404465f5.slice/crio-7e3b7f9cd5b27c3e745519f43ea451e86cd9a14655638d56139971f81ce80a42 WatchSource:0}: Error finding container 7e3b7f9cd5b27c3e745519f43ea451e86cd9a14655638d56139971f81ce80a42: Status 404 returned error can't find the container with id 7e3b7f9cd5b27c3e745519f43ea451e86cd9a14655638d56139971f81ce80a42 Dec 03 15:19:29 crc kubenswrapper[4751]: I1203 15:19:29.256887 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f9w22"] Dec 03 15:19:30 crc kubenswrapper[4751]: I1203 15:19:30.167701 4751 generic.go:334] "Generic (PLEG): container finished" podID="69079e90-e430-4f1c-837d-e178404465f5" containerID="1d824b989243bd3cce9e81f96728a830988bb2164a50a077e2dd852ed1c25743" exitCode=0 Dec 03 15:19:30 crc kubenswrapper[4751]: I1203 15:19:30.167781 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9w22" 
event={"ID":"69079e90-e430-4f1c-837d-e178404465f5","Type":"ContainerDied","Data":"1d824b989243bd3cce9e81f96728a830988bb2164a50a077e2dd852ed1c25743"} Dec 03 15:19:30 crc kubenswrapper[4751]: I1203 15:19:30.168003 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9w22" event={"ID":"69079e90-e430-4f1c-837d-e178404465f5","Type":"ContainerStarted","Data":"7e3b7f9cd5b27c3e745519f43ea451e86cd9a14655638d56139971f81ce80a42"} Dec 03 15:19:32 crc kubenswrapper[4751]: I1203 15:19:32.188615 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9w22" event={"ID":"69079e90-e430-4f1c-837d-e178404465f5","Type":"ContainerStarted","Data":"877f026115ebed6296919a54657f2f9eed4305767c50e7960da886b9625d8699"} Dec 03 15:19:33 crc kubenswrapper[4751]: I1203 15:19:33.360694 4751 scope.go:117] "RemoveContainer" containerID="63ada114a3c0d13c65067734e76c0573932a816604c9a01a3d2887bb9ea85dda" Dec 03 15:19:33 crc kubenswrapper[4751]: E1203 15:19:33.361240 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:19:34 crc kubenswrapper[4751]: I1203 15:19:34.211128 4751 generic.go:334] "Generic (PLEG): container finished" podID="69079e90-e430-4f1c-837d-e178404465f5" containerID="877f026115ebed6296919a54657f2f9eed4305767c50e7960da886b9625d8699" exitCode=0 Dec 03 15:19:34 crc kubenswrapper[4751]: I1203 15:19:34.211174 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9w22" 
event={"ID":"69079e90-e430-4f1c-837d-e178404465f5","Type":"ContainerDied","Data":"877f026115ebed6296919a54657f2f9eed4305767c50e7960da886b9625d8699"} Dec 03 15:19:35 crc kubenswrapper[4751]: I1203 15:19:35.224471 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9w22" event={"ID":"69079e90-e430-4f1c-837d-e178404465f5","Type":"ContainerStarted","Data":"9e7e1637434c960dce88b34cf3f6f5564db88c8635acf877b7a4b67b3a5ce86f"} Dec 03 15:19:38 crc kubenswrapper[4751]: I1203 15:19:38.690295 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f9w22" Dec 03 15:19:38 crc kubenswrapper[4751]: I1203 15:19:38.690760 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f9w22" Dec 03 15:19:38 crc kubenswrapper[4751]: I1203 15:19:38.741034 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f9w22" Dec 03 15:19:38 crc kubenswrapper[4751]: I1203 15:19:38.762423 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f9w22" podStartSLOduration=6.269467992 podStartE2EDuration="10.762404235s" podCreationTimestamp="2025-12-03 15:19:28 +0000 UTC" firstStartedPulling="2025-12-03 15:19:30.171150269 +0000 UTC m=+3977.159505486" lastFinishedPulling="2025-12-03 15:19:34.664086512 +0000 UTC m=+3981.652441729" observedRunningTime="2025-12-03 15:19:35.248091015 +0000 UTC m=+3982.236446242" watchObservedRunningTime="2025-12-03 15:19:38.762404235 +0000 UTC m=+3985.750759452" Dec 03 15:19:39 crc kubenswrapper[4751]: I1203 15:19:39.381895 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f9w22" Dec 03 15:19:39 crc kubenswrapper[4751]: I1203 15:19:39.478022 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-f9w22"] Dec 03 15:19:41 crc kubenswrapper[4751]: I1203 15:19:41.286273 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f9w22" podUID="69079e90-e430-4f1c-837d-e178404465f5" containerName="registry-server" containerID="cri-o://9e7e1637434c960dce88b34cf3f6f5564db88c8635acf877b7a4b67b3a5ce86f" gracePeriod=2 Dec 03 15:19:42 crc kubenswrapper[4751]: I1203 15:19:42.302977 4751 generic.go:334] "Generic (PLEG): container finished" podID="69079e90-e430-4f1c-837d-e178404465f5" containerID="9e7e1637434c960dce88b34cf3f6f5564db88c8635acf877b7a4b67b3a5ce86f" exitCode=0 Dec 03 15:19:42 crc kubenswrapper[4751]: I1203 15:19:42.303489 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9w22" event={"ID":"69079e90-e430-4f1c-837d-e178404465f5","Type":"ContainerDied","Data":"9e7e1637434c960dce88b34cf3f6f5564db88c8635acf877b7a4b67b3a5ce86f"} Dec 03 15:19:42 crc kubenswrapper[4751]: I1203 15:19:42.416613 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sj26z"] Dec 03 15:19:42 crc kubenswrapper[4751]: I1203 15:19:42.446492 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sj26z" Dec 03 15:19:42 crc kubenswrapper[4751]: I1203 15:19:42.467815 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sj26z"] Dec 03 15:19:42 crc kubenswrapper[4751]: I1203 15:19:42.511622 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0eb6cb0-2d2b-40d8-84e4-b65008a00cea-utilities\") pod \"redhat-operators-sj26z\" (UID: \"d0eb6cb0-2d2b-40d8-84e4-b65008a00cea\") " pod="openshift-marketplace/redhat-operators-sj26z" Dec 03 15:19:42 crc kubenswrapper[4751]: I1203 15:19:42.511725 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gvcd\" (UniqueName: \"kubernetes.io/projected/d0eb6cb0-2d2b-40d8-84e4-b65008a00cea-kube-api-access-7gvcd\") pod \"redhat-operators-sj26z\" (UID: \"d0eb6cb0-2d2b-40d8-84e4-b65008a00cea\") " pod="openshift-marketplace/redhat-operators-sj26z" Dec 03 15:19:42 crc kubenswrapper[4751]: I1203 15:19:42.511774 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0eb6cb0-2d2b-40d8-84e4-b65008a00cea-catalog-content\") pod \"redhat-operators-sj26z\" (UID: \"d0eb6cb0-2d2b-40d8-84e4-b65008a00cea\") " pod="openshift-marketplace/redhat-operators-sj26z" Dec 03 15:19:42 crc kubenswrapper[4751]: I1203 15:19:42.613589 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gvcd\" (UniqueName: \"kubernetes.io/projected/d0eb6cb0-2d2b-40d8-84e4-b65008a00cea-kube-api-access-7gvcd\") pod \"redhat-operators-sj26z\" (UID: \"d0eb6cb0-2d2b-40d8-84e4-b65008a00cea\") " pod="openshift-marketplace/redhat-operators-sj26z" Dec 03 15:19:42 crc kubenswrapper[4751]: I1203 15:19:42.613708 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0eb6cb0-2d2b-40d8-84e4-b65008a00cea-catalog-content\") pod \"redhat-operators-sj26z\" (UID: \"d0eb6cb0-2d2b-40d8-84e4-b65008a00cea\") " pod="openshift-marketplace/redhat-operators-sj26z" Dec 03 15:19:42 crc kubenswrapper[4751]: I1203 15:19:42.613845 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0eb6cb0-2d2b-40d8-84e4-b65008a00cea-utilities\") pod \"redhat-operators-sj26z\" (UID: \"d0eb6cb0-2d2b-40d8-84e4-b65008a00cea\") " pod="openshift-marketplace/redhat-operators-sj26z" Dec 03 15:19:42 crc kubenswrapper[4751]: I1203 15:19:42.614253 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0eb6cb0-2d2b-40d8-84e4-b65008a00cea-catalog-content\") pod \"redhat-operators-sj26z\" (UID: \"d0eb6cb0-2d2b-40d8-84e4-b65008a00cea\") " pod="openshift-marketplace/redhat-operators-sj26z" Dec 03 15:19:42 crc kubenswrapper[4751]: I1203 15:19:42.614460 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0eb6cb0-2d2b-40d8-84e4-b65008a00cea-utilities\") pod \"redhat-operators-sj26z\" (UID: \"d0eb6cb0-2d2b-40d8-84e4-b65008a00cea\") " pod="openshift-marketplace/redhat-operators-sj26z" Dec 03 15:19:42 crc kubenswrapper[4751]: I1203 15:19:42.639321 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gvcd\" (UniqueName: \"kubernetes.io/projected/d0eb6cb0-2d2b-40d8-84e4-b65008a00cea-kube-api-access-7gvcd\") pod \"redhat-operators-sj26z\" (UID: \"d0eb6cb0-2d2b-40d8-84e4-b65008a00cea\") " pod="openshift-marketplace/redhat-operators-sj26z" Dec 03 15:19:42 crc kubenswrapper[4751]: I1203 15:19:42.804163 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sj26z" Dec 03 15:19:42 crc kubenswrapper[4751]: I1203 15:19:42.839539 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f9w22" Dec 03 15:19:42 crc kubenswrapper[4751]: I1203 15:19:42.918973 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69079e90-e430-4f1c-837d-e178404465f5-catalog-content\") pod \"69079e90-e430-4f1c-837d-e178404465f5\" (UID: \"69079e90-e430-4f1c-837d-e178404465f5\") " Dec 03 15:19:42 crc kubenswrapper[4751]: I1203 15:19:42.944960 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2s4k\" (UniqueName: \"kubernetes.io/projected/69079e90-e430-4f1c-837d-e178404465f5-kube-api-access-s2s4k\") pod \"69079e90-e430-4f1c-837d-e178404465f5\" (UID: \"69079e90-e430-4f1c-837d-e178404465f5\") " Dec 03 15:19:42 crc kubenswrapper[4751]: I1203 15:19:42.945072 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69079e90-e430-4f1c-837d-e178404465f5-utilities\") pod \"69079e90-e430-4f1c-837d-e178404465f5\" (UID: \"69079e90-e430-4f1c-837d-e178404465f5\") " Dec 03 15:19:42 crc kubenswrapper[4751]: I1203 15:19:42.961912 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69079e90-e430-4f1c-837d-e178404465f5-utilities" (OuterVolumeSpecName: "utilities") pod "69079e90-e430-4f1c-837d-e178404465f5" (UID: "69079e90-e430-4f1c-837d-e178404465f5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:19:42 crc kubenswrapper[4751]: I1203 15:19:42.971544 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69079e90-e430-4f1c-837d-e178404465f5-kube-api-access-s2s4k" (OuterVolumeSpecName: "kube-api-access-s2s4k") pod "69079e90-e430-4f1c-837d-e178404465f5" (UID: "69079e90-e430-4f1c-837d-e178404465f5"). InnerVolumeSpecName "kube-api-access-s2s4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:19:43 crc kubenswrapper[4751]: I1203 15:19:43.050805 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2s4k\" (UniqueName: \"kubernetes.io/projected/69079e90-e430-4f1c-837d-e178404465f5-kube-api-access-s2s4k\") on node \"crc\" DevicePath \"\"" Dec 03 15:19:43 crc kubenswrapper[4751]: I1203 15:19:43.051128 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69079e90-e430-4f1c-837d-e178404465f5-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 15:19:43 crc kubenswrapper[4751]: I1203 15:19:43.094737 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69079e90-e430-4f1c-837d-e178404465f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "69079e90-e430-4f1c-837d-e178404465f5" (UID: "69079e90-e430-4f1c-837d-e178404465f5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:19:43 crc kubenswrapper[4751]: I1203 15:19:43.152790 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69079e90-e430-4f1c-837d-e178404465f5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 15:19:43 crc kubenswrapper[4751]: I1203 15:19:43.331772 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f9w22" Dec 03 15:19:43 crc kubenswrapper[4751]: I1203 15:19:43.338977 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9w22" event={"ID":"69079e90-e430-4f1c-837d-e178404465f5","Type":"ContainerDied","Data":"7e3b7f9cd5b27c3e745519f43ea451e86cd9a14655638d56139971f81ce80a42"} Dec 03 15:19:43 crc kubenswrapper[4751]: I1203 15:19:43.339066 4751 scope.go:117] "RemoveContainer" containerID="9e7e1637434c960dce88b34cf3f6f5564db88c8635acf877b7a4b67b3a5ce86f" Dec 03 15:19:43 crc kubenswrapper[4751]: I1203 15:19:43.399915 4751 scope.go:117] "RemoveContainer" containerID="877f026115ebed6296919a54657f2f9eed4305767c50e7960da886b9625d8699" Dec 03 15:19:43 crc kubenswrapper[4751]: I1203 15:19:43.410911 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f9w22"] Dec 03 15:19:43 crc kubenswrapper[4751]: I1203 15:19:43.470711 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f9w22"] Dec 03 15:19:43 crc kubenswrapper[4751]: I1203 15:19:43.497872 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sj26z"] Dec 03 15:19:43 crc kubenswrapper[4751]: I1203 15:19:43.967684 4751 scope.go:117] "RemoveContainer" containerID="1d824b989243bd3cce9e81f96728a830988bb2164a50a077e2dd852ed1c25743" Dec 03 15:19:44 crc kubenswrapper[4751]: I1203 15:19:44.345185 4751 generic.go:334] "Generic (PLEG): container finished" podID="d0eb6cb0-2d2b-40d8-84e4-b65008a00cea" containerID="1f97165d4b04cefd3382fe8703b0e29efeb035b6ac1930a19441d12ec7bf83c6" exitCode=0 Dec 03 15:19:44 crc kubenswrapper[4751]: I1203 15:19:44.345228 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sj26z" 
event={"ID":"d0eb6cb0-2d2b-40d8-84e4-b65008a00cea","Type":"ContainerDied","Data":"1f97165d4b04cefd3382fe8703b0e29efeb035b6ac1930a19441d12ec7bf83c6"} Dec 03 15:19:44 crc kubenswrapper[4751]: I1203 15:19:44.345462 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sj26z" event={"ID":"d0eb6cb0-2d2b-40d8-84e4-b65008a00cea","Type":"ContainerStarted","Data":"e1fa35c2b7aaeda4b8a14a68c9c737d7acec720be7316ad9b6f49ce0560ca118"} Dec 03 15:19:45 crc kubenswrapper[4751]: I1203 15:19:45.330863 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69079e90-e430-4f1c-837d-e178404465f5" path="/var/lib/kubelet/pods/69079e90-e430-4f1c-837d-e178404465f5/volumes" Dec 03 15:19:45 crc kubenswrapper[4751]: I1203 15:19:45.359928 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sj26z" event={"ID":"d0eb6cb0-2d2b-40d8-84e4-b65008a00cea","Type":"ContainerStarted","Data":"df13781b6eb15835ce14704ea9c76408cf1cfaa689f2b43a866b27cb5bfdbea7"} Dec 03 15:19:46 crc kubenswrapper[4751]: I1203 15:19:46.314190 4751 scope.go:117] "RemoveContainer" containerID="63ada114a3c0d13c65067734e76c0573932a816604c9a01a3d2887bb9ea85dda" Dec 03 15:19:46 crc kubenswrapper[4751]: E1203 15:19:46.314793 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:19:49 crc kubenswrapper[4751]: E1203 15:19:49.359579 4751 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0eb6cb0_2d2b_40d8_84e4_b65008a00cea.slice/crio-conmon-df13781b6eb15835ce14704ea9c76408cf1cfaa689f2b43a866b27cb5bfdbea7.scope\": RecentStats: unable to find data in memory cache]" Dec 03 15:19:49 crc kubenswrapper[4751]: I1203 15:19:49.434304 4751 generic.go:334] "Generic (PLEG): container finished" podID="d0eb6cb0-2d2b-40d8-84e4-b65008a00cea" containerID="df13781b6eb15835ce14704ea9c76408cf1cfaa689f2b43a866b27cb5bfdbea7" exitCode=0 Dec 03 15:19:49 crc kubenswrapper[4751]: I1203 15:19:49.434368 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sj26z" event={"ID":"d0eb6cb0-2d2b-40d8-84e4-b65008a00cea","Type":"ContainerDied","Data":"df13781b6eb15835ce14704ea9c76408cf1cfaa689f2b43a866b27cb5bfdbea7"} Dec 03 15:19:50 crc kubenswrapper[4751]: I1203 15:19:50.447287 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sj26z" event={"ID":"d0eb6cb0-2d2b-40d8-84e4-b65008a00cea","Type":"ContainerStarted","Data":"896ebcded0677e2089b8e6bac2f186d32e9893e91149ad8ce64515b27f72a665"} Dec 03 15:19:50 crc kubenswrapper[4751]: I1203 15:19:50.477960 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sj26z" podStartSLOduration=2.930708147 podStartE2EDuration="8.477935679s" podCreationTimestamp="2025-12-03 15:19:42 +0000 UTC" firstStartedPulling="2025-12-03 15:19:44.346879908 +0000 UTC m=+3991.335235125" lastFinishedPulling="2025-12-03 15:19:49.89410744 +0000 UTC m=+3996.882462657" observedRunningTime="2025-12-03 15:19:50.467488741 +0000 UTC m=+3997.455843958" watchObservedRunningTime="2025-12-03 15:19:50.477935679 +0000 UTC m=+3997.466290896" Dec 03 15:19:52 crc kubenswrapper[4751]: I1203 15:19:52.804702 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sj26z" Dec 03 15:19:52 crc 
kubenswrapper[4751]: I1203 15:19:52.805460 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sj26z" Dec 03 15:19:53 crc kubenswrapper[4751]: I1203 15:19:53.860842 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sj26z" podUID="d0eb6cb0-2d2b-40d8-84e4-b65008a00cea" containerName="registry-server" probeResult="failure" output=< Dec 03 15:19:53 crc kubenswrapper[4751]: timeout: failed to connect service ":50051" within 1s Dec 03 15:19:53 crc kubenswrapper[4751]: > Dec 03 15:19:56 crc kubenswrapper[4751]: I1203 15:19:56.521616 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-pppn7_89c33472-8c62-4a71-9b17-697f9a0bbc65/control-plane-machine-set-operator/0.log" Dec 03 15:19:56 crc kubenswrapper[4751]: I1203 15:19:56.731922 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-2n4v9_ab66f440-24ed-4244-a972-63eee27b67b1/kube-rbac-proxy/0.log" Dec 03 15:19:56 crc kubenswrapper[4751]: I1203 15:19:56.813721 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-2n4v9_ab66f440-24ed-4244-a972-63eee27b67b1/machine-api-operator/0.log" Dec 03 15:20:01 crc kubenswrapper[4751]: I1203 15:20:01.317755 4751 scope.go:117] "RemoveContainer" containerID="63ada114a3c0d13c65067734e76c0573932a816604c9a01a3d2887bb9ea85dda" Dec 03 15:20:01 crc kubenswrapper[4751]: E1203 15:20:01.318440 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" 
podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:20:04 crc kubenswrapper[4751]: I1203 15:20:04.306581 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sj26z" podUID="d0eb6cb0-2d2b-40d8-84e4-b65008a00cea" containerName="registry-server" probeResult="failure" output=< Dec 03 15:20:04 crc kubenswrapper[4751]: timeout: failed to connect service ":50051" within 1s Dec 03 15:20:04 crc kubenswrapper[4751]: > Dec 03 15:20:12 crc kubenswrapper[4751]: I1203 15:20:12.866346 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sj26z" Dec 03 15:20:12 crc kubenswrapper[4751]: I1203 15:20:12.907401 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-wb55f_6bc5bb21-6b5f-4b06-a96a-2e5883752c9a/cert-manager-controller/0.log" Dec 03 15:20:12 crc kubenswrapper[4751]: I1203 15:20:12.937890 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sj26z" Dec 03 15:20:13 crc kubenswrapper[4751]: I1203 15:20:13.038065 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-rwllx_c09475fb-946d-45c4-8482-4db508ae7459/cert-manager-cainjector/0.log" Dec 03 15:20:13 crc kubenswrapper[4751]: I1203 15:20:13.145840 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-7jwb7_d42519d9-b7e8-4c0b-bcac-b4269faf605a/cert-manager-webhook/0.log" Dec 03 15:20:13 crc kubenswrapper[4751]: I1203 15:20:13.355391 4751 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","pod69079e90-e430-4f1c-837d-e178404465f5"] err="unable to destroy cgroup paths for cgroup [kubepods burstable pod69079e90-e430-4f1c-837d-e178404465f5] : Timed out while waiting for systemd to remove 
kubepods-burstable-pod69079e90_e430_4f1c_837d_e178404465f5.slice" Dec 03 15:20:13 crc kubenswrapper[4751]: I1203 15:20:13.539451 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sj26z"] Dec 03 15:20:14 crc kubenswrapper[4751]: I1203 15:20:14.314253 4751 scope.go:117] "RemoveContainer" containerID="63ada114a3c0d13c65067734e76c0573932a816604c9a01a3d2887bb9ea85dda" Dec 03 15:20:14 crc kubenswrapper[4751]: E1203 15:20:14.314556 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:20:14 crc kubenswrapper[4751]: I1203 15:20:14.684466 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sj26z" podUID="d0eb6cb0-2d2b-40d8-84e4-b65008a00cea" containerName="registry-server" containerID="cri-o://896ebcded0677e2089b8e6bac2f186d32e9893e91149ad8ce64515b27f72a665" gracePeriod=2 Dec 03 15:20:15 crc kubenswrapper[4751]: I1203 15:20:15.489086 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sj26z" Dec 03 15:20:15 crc kubenswrapper[4751]: I1203 15:20:15.681267 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0eb6cb0-2d2b-40d8-84e4-b65008a00cea-catalog-content\") pod \"d0eb6cb0-2d2b-40d8-84e4-b65008a00cea\" (UID: \"d0eb6cb0-2d2b-40d8-84e4-b65008a00cea\") " Dec 03 15:20:15 crc kubenswrapper[4751]: I1203 15:20:15.681925 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0eb6cb0-2d2b-40d8-84e4-b65008a00cea-utilities\") pod \"d0eb6cb0-2d2b-40d8-84e4-b65008a00cea\" (UID: \"d0eb6cb0-2d2b-40d8-84e4-b65008a00cea\") " Dec 03 15:20:15 crc kubenswrapper[4751]: I1203 15:20:15.682027 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gvcd\" (UniqueName: \"kubernetes.io/projected/d0eb6cb0-2d2b-40d8-84e4-b65008a00cea-kube-api-access-7gvcd\") pod \"d0eb6cb0-2d2b-40d8-84e4-b65008a00cea\" (UID: \"d0eb6cb0-2d2b-40d8-84e4-b65008a00cea\") " Dec 03 15:20:15 crc kubenswrapper[4751]: I1203 15:20:15.682641 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0eb6cb0-2d2b-40d8-84e4-b65008a00cea-utilities" (OuterVolumeSpecName: "utilities") pod "d0eb6cb0-2d2b-40d8-84e4-b65008a00cea" (UID: "d0eb6cb0-2d2b-40d8-84e4-b65008a00cea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:20:15 crc kubenswrapper[4751]: I1203 15:20:15.689241 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0eb6cb0-2d2b-40d8-84e4-b65008a00cea-kube-api-access-7gvcd" (OuterVolumeSpecName: "kube-api-access-7gvcd") pod "d0eb6cb0-2d2b-40d8-84e4-b65008a00cea" (UID: "d0eb6cb0-2d2b-40d8-84e4-b65008a00cea"). InnerVolumeSpecName "kube-api-access-7gvcd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:20:15 crc kubenswrapper[4751]: I1203 15:20:15.709957 4751 generic.go:334] "Generic (PLEG): container finished" podID="d0eb6cb0-2d2b-40d8-84e4-b65008a00cea" containerID="896ebcded0677e2089b8e6bac2f186d32e9893e91149ad8ce64515b27f72a665" exitCode=0 Dec 03 15:20:15 crc kubenswrapper[4751]: I1203 15:20:15.709995 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sj26z" event={"ID":"d0eb6cb0-2d2b-40d8-84e4-b65008a00cea","Type":"ContainerDied","Data":"896ebcded0677e2089b8e6bac2f186d32e9893e91149ad8ce64515b27f72a665"} Dec 03 15:20:15 crc kubenswrapper[4751]: I1203 15:20:15.710020 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sj26z" event={"ID":"d0eb6cb0-2d2b-40d8-84e4-b65008a00cea","Type":"ContainerDied","Data":"e1fa35c2b7aaeda4b8a14a68c9c737d7acec720be7316ad9b6f49ce0560ca118"} Dec 03 15:20:15 crc kubenswrapper[4751]: I1203 15:20:15.710037 4751 scope.go:117] "RemoveContainer" containerID="896ebcded0677e2089b8e6bac2f186d32e9893e91149ad8ce64515b27f72a665" Dec 03 15:20:15 crc kubenswrapper[4751]: I1203 15:20:15.710055 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sj26z" Dec 03 15:20:15 crc kubenswrapper[4751]: I1203 15:20:15.778252 4751 scope.go:117] "RemoveContainer" containerID="df13781b6eb15835ce14704ea9c76408cf1cfaa689f2b43a866b27cb5bfdbea7" Dec 03 15:20:15 crc kubenswrapper[4751]: I1203 15:20:15.784852 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0eb6cb0-2d2b-40d8-84e4-b65008a00cea-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 15:20:15 crc kubenswrapper[4751]: I1203 15:20:15.784884 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gvcd\" (UniqueName: \"kubernetes.io/projected/d0eb6cb0-2d2b-40d8-84e4-b65008a00cea-kube-api-access-7gvcd\") on node \"crc\" DevicePath \"\"" Dec 03 15:20:15 crc kubenswrapper[4751]: I1203 15:20:15.815208 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0eb6cb0-2d2b-40d8-84e4-b65008a00cea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0eb6cb0-2d2b-40d8-84e4-b65008a00cea" (UID: "d0eb6cb0-2d2b-40d8-84e4-b65008a00cea"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:20:15 crc kubenswrapper[4751]: I1203 15:20:15.893082 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0eb6cb0-2d2b-40d8-84e4-b65008a00cea-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 15:20:16 crc kubenswrapper[4751]: I1203 15:20:16.046414 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sj26z"] Dec 03 15:20:16 crc kubenswrapper[4751]: I1203 15:20:16.059253 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sj26z"] Dec 03 15:20:16 crc kubenswrapper[4751]: I1203 15:20:16.472937 4751 scope.go:117] "RemoveContainer" containerID="1f97165d4b04cefd3382fe8703b0e29efeb035b6ac1930a19441d12ec7bf83c6" Dec 03 15:20:16 crc kubenswrapper[4751]: I1203 15:20:16.521713 4751 scope.go:117] "RemoveContainer" containerID="896ebcded0677e2089b8e6bac2f186d32e9893e91149ad8ce64515b27f72a665" Dec 03 15:20:16 crc kubenswrapper[4751]: E1203 15:20:16.522165 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"896ebcded0677e2089b8e6bac2f186d32e9893e91149ad8ce64515b27f72a665\": container with ID starting with 896ebcded0677e2089b8e6bac2f186d32e9893e91149ad8ce64515b27f72a665 not found: ID does not exist" containerID="896ebcded0677e2089b8e6bac2f186d32e9893e91149ad8ce64515b27f72a665" Dec 03 15:20:16 crc kubenswrapper[4751]: I1203 15:20:16.522201 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"896ebcded0677e2089b8e6bac2f186d32e9893e91149ad8ce64515b27f72a665"} err="failed to get container status \"896ebcded0677e2089b8e6bac2f186d32e9893e91149ad8ce64515b27f72a665\": rpc error: code = NotFound desc = could not find container \"896ebcded0677e2089b8e6bac2f186d32e9893e91149ad8ce64515b27f72a665\": container with ID starting with 
896ebcded0677e2089b8e6bac2f186d32e9893e91149ad8ce64515b27f72a665 not found: ID does not exist" Dec 03 15:20:16 crc kubenswrapper[4751]: I1203 15:20:16.522224 4751 scope.go:117] "RemoveContainer" containerID="df13781b6eb15835ce14704ea9c76408cf1cfaa689f2b43a866b27cb5bfdbea7" Dec 03 15:20:16 crc kubenswrapper[4751]: E1203 15:20:16.522796 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df13781b6eb15835ce14704ea9c76408cf1cfaa689f2b43a866b27cb5bfdbea7\": container with ID starting with df13781b6eb15835ce14704ea9c76408cf1cfaa689f2b43a866b27cb5bfdbea7 not found: ID does not exist" containerID="df13781b6eb15835ce14704ea9c76408cf1cfaa689f2b43a866b27cb5bfdbea7" Dec 03 15:20:16 crc kubenswrapper[4751]: I1203 15:20:16.522827 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df13781b6eb15835ce14704ea9c76408cf1cfaa689f2b43a866b27cb5bfdbea7"} err="failed to get container status \"df13781b6eb15835ce14704ea9c76408cf1cfaa689f2b43a866b27cb5bfdbea7\": rpc error: code = NotFound desc = could not find container \"df13781b6eb15835ce14704ea9c76408cf1cfaa689f2b43a866b27cb5bfdbea7\": container with ID starting with df13781b6eb15835ce14704ea9c76408cf1cfaa689f2b43a866b27cb5bfdbea7 not found: ID does not exist" Dec 03 15:20:16 crc kubenswrapper[4751]: I1203 15:20:16.522844 4751 scope.go:117] "RemoveContainer" containerID="1f97165d4b04cefd3382fe8703b0e29efeb035b6ac1930a19441d12ec7bf83c6" Dec 03 15:20:16 crc kubenswrapper[4751]: E1203 15:20:16.523233 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f97165d4b04cefd3382fe8703b0e29efeb035b6ac1930a19441d12ec7bf83c6\": container with ID starting with 1f97165d4b04cefd3382fe8703b0e29efeb035b6ac1930a19441d12ec7bf83c6 not found: ID does not exist" containerID="1f97165d4b04cefd3382fe8703b0e29efeb035b6ac1930a19441d12ec7bf83c6" Dec 03 15:20:16 crc 
kubenswrapper[4751]: I1203 15:20:16.523266 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f97165d4b04cefd3382fe8703b0e29efeb035b6ac1930a19441d12ec7bf83c6"} err="failed to get container status \"1f97165d4b04cefd3382fe8703b0e29efeb035b6ac1930a19441d12ec7bf83c6\": rpc error: code = NotFound desc = could not find container \"1f97165d4b04cefd3382fe8703b0e29efeb035b6ac1930a19441d12ec7bf83c6\": container with ID starting with 1f97165d4b04cefd3382fe8703b0e29efeb035b6ac1930a19441d12ec7bf83c6 not found: ID does not exist" Dec 03 15:20:17 crc kubenswrapper[4751]: I1203 15:20:17.325552 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0eb6cb0-2d2b-40d8-84e4-b65008a00cea" path="/var/lib/kubelet/pods/d0eb6cb0-2d2b-40d8-84e4-b65008a00cea/volumes" Dec 03 15:20:27 crc kubenswrapper[4751]: I1203 15:20:27.314239 4751 scope.go:117] "RemoveContainer" containerID="63ada114a3c0d13c65067734e76c0573932a816604c9a01a3d2887bb9ea85dda" Dec 03 15:20:27 crc kubenswrapper[4751]: E1203 15:20:27.315066 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:20:29 crc kubenswrapper[4751]: I1203 15:20:29.703875 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-bjml6_40a56b56-06b1-4640-b817-4b22a08cdfea/nmstate-console-plugin/0.log" Dec 03 15:20:30 crc kubenswrapper[4751]: I1203 15:20:30.138099 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-znb2g_f41e25fd-38de-45a7-95dd-d0172caa1353/nmstate-handler/0.log" Dec 03 15:20:30 crc 
kubenswrapper[4751]: I1203 15:20:30.197666 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-zmfgm_2dfea938-9795-4c3d-a42c-f4c7cbe57dae/kube-rbac-proxy/0.log" Dec 03 15:20:30 crc kubenswrapper[4751]: I1203 15:20:30.235733 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-zmfgm_2dfea938-9795-4c3d-a42c-f4c7cbe57dae/nmstate-metrics/0.log" Dec 03 15:20:30 crc kubenswrapper[4751]: I1203 15:20:30.460643 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-4xzfk_65d126c6-b570-490c-bae2-71a3a7fa0832/nmstate-operator/0.log" Dec 03 15:20:30 crc kubenswrapper[4751]: I1203 15:20:30.555061 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-7hdqr_d0b754bd-d0ac-42e5-87a1-6f4132d926a9/nmstate-webhook/0.log" Dec 03 15:20:42 crc kubenswrapper[4751]: I1203 15:20:42.314436 4751 scope.go:117] "RemoveContainer" containerID="63ada114a3c0d13c65067734e76c0573932a816604c9a01a3d2887bb9ea85dda" Dec 03 15:20:42 crc kubenswrapper[4751]: E1203 15:20:42.315156 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:20:47 crc kubenswrapper[4751]: I1203 15:20:47.866937 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-766794d8b8-zzghr_b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238/manager/1.log" Dec 03 15:20:47 crc kubenswrapper[4751]: I1203 15:20:47.888599 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-766794d8b8-zzghr_b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238/kube-rbac-proxy/0.log" Dec 03 15:20:48 crc kubenswrapper[4751]: I1203 15:20:48.137435 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-766794d8b8-zzghr_b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238/manager/0.log" Dec 03 15:20:55 crc kubenswrapper[4751]: I1203 15:20:55.314699 4751 scope.go:117] "RemoveContainer" containerID="63ada114a3c0d13c65067734e76c0573932a816604c9a01a3d2887bb9ea85dda" Dec 03 15:20:55 crc kubenswrapper[4751]: E1203 15:20:55.315472 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:21:05 crc kubenswrapper[4751]: I1203 15:21:05.630973 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-92ghf_4cd8b40c-f374-4b29-96a6-94137d11fe90/kube-rbac-proxy/0.log" Dec 03 15:21:05 crc kubenswrapper[4751]: I1203 15:21:05.850615 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-92ghf_4cd8b40c-f374-4b29-96a6-94137d11fe90/controller/0.log" Dec 03 15:21:05 crc kubenswrapper[4751]: I1203 15:21:05.950855 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/cp-frr-files/0.log" Dec 03 15:21:06 crc kubenswrapper[4751]: I1203 15:21:06.204344 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/cp-frr-files/0.log" Dec 03 15:21:06 crc 
kubenswrapper[4751]: I1203 15:21:06.269517 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/cp-reloader/0.log" Dec 03 15:21:06 crc kubenswrapper[4751]: I1203 15:21:06.314165 4751 scope.go:117] "RemoveContainer" containerID="63ada114a3c0d13c65067734e76c0573932a816604c9a01a3d2887bb9ea85dda" Dec 03 15:21:06 crc kubenswrapper[4751]: E1203 15:21:06.314564 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:21:06 crc kubenswrapper[4751]: I1203 15:21:06.328730 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/cp-metrics/0.log" Dec 03 15:21:06 crc kubenswrapper[4751]: I1203 15:21:06.337751 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/cp-reloader/0.log" Dec 03 15:21:06 crc kubenswrapper[4751]: I1203 15:21:06.535158 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/cp-frr-files/0.log" Dec 03 15:21:06 crc kubenswrapper[4751]: I1203 15:21:06.554782 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/cp-metrics/0.log" Dec 03 15:21:06 crc kubenswrapper[4751]: I1203 15:21:06.576730 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/cp-metrics/0.log" Dec 03 15:21:06 crc kubenswrapper[4751]: I1203 15:21:06.632184 4751 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/cp-reloader/0.log" Dec 03 15:21:06 crc kubenswrapper[4751]: I1203 15:21:06.832797 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/cp-reloader/0.log" Dec 03 15:21:06 crc kubenswrapper[4751]: I1203 15:21:06.875240 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/cp-metrics/0.log" Dec 03 15:21:06 crc kubenswrapper[4751]: I1203 15:21:06.884713 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/cp-frr-files/0.log" Dec 03 15:21:06 crc kubenswrapper[4751]: I1203 15:21:06.928111 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/controller/0.log" Dec 03 15:21:07 crc kubenswrapper[4751]: I1203 15:21:07.341840 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/kube-rbac-proxy/0.log" Dec 03 15:21:07 crc kubenswrapper[4751]: I1203 15:21:07.411040 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/frr-metrics/0.log" Dec 03 15:21:07 crc kubenswrapper[4751]: I1203 15:21:07.442697 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/kube-rbac-proxy-frr/0.log" Dec 03 15:21:07 crc kubenswrapper[4751]: I1203 15:21:07.769690 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/reloader/0.log" Dec 03 15:21:07 crc kubenswrapper[4751]: I1203 15:21:07.876818 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-9zqk5_0d711f1f-ab39-4d20-951b-398bd5c7226c/frr-k8s-webhook-server/0.log" Dec 03 15:21:08 crc kubenswrapper[4751]: I1203 15:21:08.445804 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/frr/0.log" Dec 03 15:21:08 crc kubenswrapper[4751]: I1203 15:21:08.669508 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-75cdb5998d-hbntt_14960a87-3612-433e-bd1e-b548b0118a2c/manager/1.log" Dec 03 15:21:08 crc kubenswrapper[4751]: I1203 15:21:08.682025 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-75cdb5998d-hbntt_14960a87-3612-433e-bd1e-b548b0118a2c/manager/0.log" Dec 03 15:21:09 crc kubenswrapper[4751]: I1203 15:21:09.004592 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7cf457fc-2z2zk_ab9be016-ca60-474a-85d3-7c3ca149e87d/webhook-server/0.log" Dec 03 15:21:09 crc kubenswrapper[4751]: I1203 15:21:09.137927 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-rgj2q_3afabeea-33c4-4bed-a2ca-440c78ff75ad/kube-rbac-proxy/0.log" Dec 03 15:21:09 crc kubenswrapper[4751]: I1203 15:21:09.524382 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-rgj2q_3afabeea-33c4-4bed-a2ca-440c78ff75ad/speaker/0.log" Dec 03 15:21:18 crc kubenswrapper[4751]: I1203 15:21:18.314992 4751 scope.go:117] "RemoveContainer" containerID="63ada114a3c0d13c65067734e76c0573932a816604c9a01a3d2887bb9ea85dda" Dec 03 15:21:18 crc kubenswrapper[4751]: E1203 15:21:18.315928 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:21:25 crc kubenswrapper[4751]: I1203 15:21:25.990634 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct_9a93d622-3d27-473f-92bb-ffe7b9ec4239/util/0.log" Dec 03 15:21:26 crc kubenswrapper[4751]: I1203 15:21:26.398143 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct_9a93d622-3d27-473f-92bb-ffe7b9ec4239/pull/0.log" Dec 03 15:21:26 crc kubenswrapper[4751]: I1203 15:21:26.418626 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct_9a93d622-3d27-473f-92bb-ffe7b9ec4239/pull/0.log" Dec 03 15:21:26 crc kubenswrapper[4751]: I1203 15:21:26.421372 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct_9a93d622-3d27-473f-92bb-ffe7b9ec4239/util/0.log" Dec 03 15:21:26 crc kubenswrapper[4751]: I1203 15:21:26.587729 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct_9a93d622-3d27-473f-92bb-ffe7b9ec4239/util/0.log" Dec 03 15:21:26 crc kubenswrapper[4751]: I1203 15:21:26.610023 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct_9a93d622-3d27-473f-92bb-ffe7b9ec4239/extract/0.log" Dec 03 15:21:26 crc kubenswrapper[4751]: I1203 15:21:26.610083 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct_9a93d622-3d27-473f-92bb-ffe7b9ec4239/pull/0.log" Dec 03 15:21:26 crc kubenswrapper[4751]: I1203 15:21:26.792147 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6_5be1a950-9285-46c8-af53-976abeddd5fb/util/0.log" Dec 03 15:21:26 crc kubenswrapper[4751]: I1203 15:21:26.962486 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6_5be1a950-9285-46c8-af53-976abeddd5fb/util/0.log" Dec 03 15:21:26 crc kubenswrapper[4751]: I1203 15:21:26.996630 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6_5be1a950-9285-46c8-af53-976abeddd5fb/pull/0.log" Dec 03 15:21:27 crc kubenswrapper[4751]: I1203 15:21:27.014570 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6_5be1a950-9285-46c8-af53-976abeddd5fb/pull/0.log" Dec 03 15:21:27 crc kubenswrapper[4751]: I1203 15:21:27.209234 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6_5be1a950-9285-46c8-af53-976abeddd5fb/extract/0.log" Dec 03 15:21:27 crc kubenswrapper[4751]: I1203 15:21:27.271034 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6_5be1a950-9285-46c8-af53-976abeddd5fb/pull/0.log" Dec 03 15:21:27 crc kubenswrapper[4751]: I1203 15:21:27.444457 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6_5be1a950-9285-46c8-af53-976abeddd5fb/util/0.log" Dec 03 
15:21:27 crc kubenswrapper[4751]: I1203 15:21:27.561747 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9_b30158de-69d4-4a93-9952-9b61fd08e5cd/util/0.log" Dec 03 15:21:27 crc kubenswrapper[4751]: I1203 15:21:27.788242 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9_b30158de-69d4-4a93-9952-9b61fd08e5cd/pull/0.log" Dec 03 15:21:27 crc kubenswrapper[4751]: I1203 15:21:27.801263 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9_b30158de-69d4-4a93-9952-9b61fd08e5cd/util/0.log" Dec 03 15:21:27 crc kubenswrapper[4751]: I1203 15:21:27.807531 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9_b30158de-69d4-4a93-9952-9b61fd08e5cd/pull/0.log" Dec 03 15:21:27 crc kubenswrapper[4751]: I1203 15:21:27.989946 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9_b30158de-69d4-4a93-9952-9b61fd08e5cd/util/0.log" Dec 03 15:21:27 crc kubenswrapper[4751]: I1203 15:21:27.997571 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9_b30158de-69d4-4a93-9952-9b61fd08e5cd/extract/0.log" Dec 03 15:21:28 crc kubenswrapper[4751]: I1203 15:21:28.007520 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9_b30158de-69d4-4a93-9952-9b61fd08e5cd/pull/0.log" Dec 03 15:21:28 crc kubenswrapper[4751]: I1203 15:21:28.202113 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj_f6516bb2-c6eb-464d-a533-03917cbf52e4/util/0.log" Dec 03 15:21:28 crc kubenswrapper[4751]: I1203 15:21:28.562938 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj_f6516bb2-c6eb-464d-a533-03917cbf52e4/util/0.log" Dec 03 15:21:28 crc kubenswrapper[4751]: I1203 15:21:28.634410 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj_f6516bb2-c6eb-464d-a533-03917cbf52e4/pull/0.log" Dec 03 15:21:28 crc kubenswrapper[4751]: I1203 15:21:28.670666 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj_f6516bb2-c6eb-464d-a533-03917cbf52e4/pull/0.log" Dec 03 15:21:28 crc kubenswrapper[4751]: I1203 15:21:28.828257 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj_f6516bb2-c6eb-464d-a533-03917cbf52e4/pull/0.log" Dec 03 15:21:28 crc kubenswrapper[4751]: I1203 15:21:28.857184 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj_f6516bb2-c6eb-464d-a533-03917cbf52e4/util/0.log" Dec 03 15:21:28 crc kubenswrapper[4751]: I1203 15:21:28.891689 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj_f6516bb2-c6eb-464d-a533-03917cbf52e4/extract/0.log" Dec 03 15:21:29 crc kubenswrapper[4751]: I1203 15:21:29.053120 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9_abecd931-d6b1-4ee6-83dc-eb78d75c076c/util/0.log" Dec 03 
15:21:29 crc kubenswrapper[4751]: I1203 15:21:29.211988 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9_abecd931-d6b1-4ee6-83dc-eb78d75c076c/pull/0.log" Dec 03 15:21:29 crc kubenswrapper[4751]: I1203 15:21:29.246147 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9_abecd931-d6b1-4ee6-83dc-eb78d75c076c/util/0.log" Dec 03 15:21:29 crc kubenswrapper[4751]: I1203 15:21:29.250647 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9_abecd931-d6b1-4ee6-83dc-eb78d75c076c/pull/0.log" Dec 03 15:21:29 crc kubenswrapper[4751]: I1203 15:21:29.446989 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9_abecd931-d6b1-4ee6-83dc-eb78d75c076c/pull/0.log" Dec 03 15:21:29 crc kubenswrapper[4751]: I1203 15:21:29.462322 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9_abecd931-d6b1-4ee6-83dc-eb78d75c076c/util/0.log" Dec 03 15:21:29 crc kubenswrapper[4751]: I1203 15:21:29.517314 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9_abecd931-d6b1-4ee6-83dc-eb78d75c076c/extract/0.log" Dec 03 15:21:29 crc kubenswrapper[4751]: I1203 15:21:29.764074 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p6v9f_78e1107f-c2a3-4dd7-b6f9-af9729fea0a3/extract-utilities/0.log" Dec 03 15:21:29 crc kubenswrapper[4751]: I1203 15:21:29.924131 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-p6v9f_78e1107f-c2a3-4dd7-b6f9-af9729fea0a3/extract-utilities/0.log" Dec 03 15:21:29 crc kubenswrapper[4751]: I1203 15:21:29.991316 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p6v9f_78e1107f-c2a3-4dd7-b6f9-af9729fea0a3/extract-content/0.log" Dec 03 15:21:30 crc kubenswrapper[4751]: I1203 15:21:30.002535 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p6v9f_78e1107f-c2a3-4dd7-b6f9-af9729fea0a3/extract-content/0.log" Dec 03 15:21:30 crc kubenswrapper[4751]: I1203 15:21:30.120864 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p6v9f_78e1107f-c2a3-4dd7-b6f9-af9729fea0a3/extract-utilities/0.log" Dec 03 15:21:30 crc kubenswrapper[4751]: I1203 15:21:30.135266 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p6v9f_78e1107f-c2a3-4dd7-b6f9-af9729fea0a3/extract-content/0.log" Dec 03 15:21:30 crc kubenswrapper[4751]: I1203 15:21:30.337490 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dn4vn_4c16a038-8221-4d8e-b455-e02c4be1c751/extract-utilities/0.log" Dec 03 15:21:30 crc kubenswrapper[4751]: I1203 15:21:30.841870 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p6v9f_78e1107f-c2a3-4dd7-b6f9-af9729fea0a3/registry-server/0.log" Dec 03 15:21:31 crc kubenswrapper[4751]: I1203 15:21:31.199566 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dn4vn_4c16a038-8221-4d8e-b455-e02c4be1c751/extract-utilities/0.log" Dec 03 15:21:31 crc kubenswrapper[4751]: I1203 15:21:31.292455 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-dn4vn_4c16a038-8221-4d8e-b455-e02c4be1c751/extract-content/0.log" Dec 03 15:21:31 crc kubenswrapper[4751]: I1203 15:21:31.314703 4751 scope.go:117] "RemoveContainer" containerID="63ada114a3c0d13c65067734e76c0573932a816604c9a01a3d2887bb9ea85dda" Dec 03 15:21:31 crc kubenswrapper[4751]: E1203 15:21:31.315036 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:21:31 crc kubenswrapper[4751]: I1203 15:21:31.343309 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dn4vn_4c16a038-8221-4d8e-b455-e02c4be1c751/extract-content/0.log" Dec 03 15:21:31 crc kubenswrapper[4751]: I1203 15:21:31.569286 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dn4vn_4c16a038-8221-4d8e-b455-e02c4be1c751/extract-content/0.log" Dec 03 15:21:31 crc kubenswrapper[4751]: I1203 15:21:31.577247 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dn4vn_4c16a038-8221-4d8e-b455-e02c4be1c751/extract-utilities/0.log" Dec 03 15:21:31 crc kubenswrapper[4751]: I1203 15:21:31.680492 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-sq9k5_224b9e4a-5a71-4559-84b6-9599c2dfd321/marketplace-operator/0.log" Dec 03 15:21:31 crc kubenswrapper[4751]: I1203 15:21:31.889423 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-z8ldt_d9f96a5f-adfc-467c-91e7-631517b599a2/extract-utilities/0.log" Dec 
03 15:21:32 crc kubenswrapper[4751]: I1203 15:21:32.185076 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-z8ldt_d9f96a5f-adfc-467c-91e7-631517b599a2/extract-content/0.log" Dec 03 15:21:32 crc kubenswrapper[4751]: I1203 15:21:32.189073 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-z8ldt_d9f96a5f-adfc-467c-91e7-631517b599a2/extract-utilities/0.log" Dec 03 15:21:32 crc kubenswrapper[4751]: I1203 15:21:32.189660 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-z8ldt_d9f96a5f-adfc-467c-91e7-631517b599a2/extract-content/0.log" Dec 03 15:21:32 crc kubenswrapper[4751]: I1203 15:21:32.403700 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dn4vn_4c16a038-8221-4d8e-b455-e02c4be1c751/registry-server/0.log" Dec 03 15:21:32 crc kubenswrapper[4751]: I1203 15:21:32.528198 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-z8ldt_d9f96a5f-adfc-467c-91e7-631517b599a2/extract-utilities/0.log" Dec 03 15:21:32 crc kubenswrapper[4751]: I1203 15:21:32.573847 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-z8ldt_d9f96a5f-adfc-467c-91e7-631517b599a2/extract-content/0.log" Dec 03 15:21:32 crc kubenswrapper[4751]: I1203 15:21:32.629666 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-z8ldt_d9f96a5f-adfc-467c-91e7-631517b599a2/registry-server/0.log" Dec 03 15:21:32 crc kubenswrapper[4751]: I1203 15:21:32.665923 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bjcgn_fd5bafe9-858d-4112-ae58-8ad005161e3d/extract-utilities/0.log" Dec 03 15:21:33 crc kubenswrapper[4751]: I1203 15:21:33.451762 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-bjcgn_fd5bafe9-858d-4112-ae58-8ad005161e3d/extract-utilities/0.log" Dec 03 15:21:33 crc kubenswrapper[4751]: I1203 15:21:33.463563 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bjcgn_fd5bafe9-858d-4112-ae58-8ad005161e3d/extract-content/0.log" Dec 03 15:21:33 crc kubenswrapper[4751]: I1203 15:21:33.503461 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bjcgn_fd5bafe9-858d-4112-ae58-8ad005161e3d/extract-content/0.log" Dec 03 15:21:33 crc kubenswrapper[4751]: I1203 15:21:33.630167 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bjcgn_fd5bafe9-858d-4112-ae58-8ad005161e3d/extract-content/0.log" Dec 03 15:21:33 crc kubenswrapper[4751]: I1203 15:21:33.657407 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bjcgn_fd5bafe9-858d-4112-ae58-8ad005161e3d/extract-utilities/0.log" Dec 03 15:21:34 crc kubenswrapper[4751]: I1203 15:21:34.239122 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bjcgn_fd5bafe9-858d-4112-ae58-8ad005161e3d/registry-server/0.log" Dec 03 15:21:43 crc kubenswrapper[4751]: I1203 15:21:43.323521 4751 scope.go:117] "RemoveContainer" containerID="63ada114a3c0d13c65067734e76c0573932a816604c9a01a3d2887bb9ea85dda" Dec 03 15:21:43 crc kubenswrapper[4751]: E1203 15:21:43.324950 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:21:48 crc 
kubenswrapper[4751]: I1203 15:21:48.181208 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-96kch_2cc6a6aa-cb86-448e-8fae-8e1dc45c89ca/prometheus-operator/0.log" Dec 03 15:21:48 crc kubenswrapper[4751]: I1203 15:21:48.350439 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm_7e248019-bf73-4c6d-a551-6c62dcf6ec11/prometheus-operator-admission-webhook/0.log" Dec 03 15:21:48 crc kubenswrapper[4751]: I1203 15:21:48.417764 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm_01521e70-1366-4e52-9f9a-885522387a0e/prometheus-operator-admission-webhook/0.log" Dec 03 15:21:48 crc kubenswrapper[4751]: I1203 15:21:48.601227 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-rgl9w_216e104c-e7e3-4be4-972c-cd524973eaa6/operator/0.log" Dec 03 15:21:48 crc kubenswrapper[4751]: I1203 15:21:48.802743 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-hq59z_bb80946e-134f-4baa-b150-6004a9313de9/perses-operator/0.log" Dec 03 15:21:58 crc kubenswrapper[4751]: I1203 15:21:58.314146 4751 scope.go:117] "RemoveContainer" containerID="63ada114a3c0d13c65067734e76c0573932a816604c9a01a3d2887bb9ea85dda" Dec 03 15:21:58 crc kubenswrapper[4751]: E1203 15:21:58.314997 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:22:03 crc kubenswrapper[4751]: 
I1203 15:22:03.343052 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-766794d8b8-zzghr_b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238/kube-rbac-proxy/0.log" Dec 03 15:22:03 crc kubenswrapper[4751]: I1203 15:22:03.509388 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-766794d8b8-zzghr_b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238/manager/1.log" Dec 03 15:22:03 crc kubenswrapper[4751]: I1203 15:22:03.563874 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-766794d8b8-zzghr_b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238/manager/0.log" Dec 03 15:22:09 crc kubenswrapper[4751]: I1203 15:22:09.314916 4751 scope.go:117] "RemoveContainer" containerID="63ada114a3c0d13c65067734e76c0573932a816604c9a01a3d2887bb9ea85dda" Dec 03 15:22:09 crc kubenswrapper[4751]: E1203 15:22:09.315738 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:22:20 crc kubenswrapper[4751]: I1203 15:22:20.314217 4751 scope.go:117] "RemoveContainer" containerID="63ada114a3c0d13c65067734e76c0573932a816604c9a01a3d2887bb9ea85dda" Dec 03 15:22:20 crc kubenswrapper[4751]: E1203 15:22:20.315035 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:22:34 crc kubenswrapper[4751]: I1203 15:22:34.314580 4751 scope.go:117] "RemoveContainer" containerID="63ada114a3c0d13c65067734e76c0573932a816604c9a01a3d2887bb9ea85dda" Dec 03 15:22:34 crc kubenswrapper[4751]: E1203 15:22:34.315382 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:22:47 crc kubenswrapper[4751]: I1203 15:22:47.320265 4751 scope.go:117] "RemoveContainer" containerID="63ada114a3c0d13c65067734e76c0573932a816604c9a01a3d2887bb9ea85dda" Dec 03 15:22:48 crc kubenswrapper[4751]: I1203 15:22:48.358299 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerStarted","Data":"a0be2e6b634e60ad28b976fe586002efe94eaf26e24cfc54b5fd477b10cdbf97"} Dec 03 15:24:18 crc kubenswrapper[4751]: I1203 15:24:18.362073 4751 generic.go:334] "Generic (PLEG): container finished" podID="108aeb3c-d5f1-4cab-80d4-9c55592975d6" containerID="575faeab859105060ca850e0f055102c5bdbe3579fcd1bde1217cbe6b430e554" exitCode=0 Dec 03 15:24:18 crc kubenswrapper[4751]: I1203 15:24:18.362166 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fxk5b/must-gather-krlcs" event={"ID":"108aeb3c-d5f1-4cab-80d4-9c55592975d6","Type":"ContainerDied","Data":"575faeab859105060ca850e0f055102c5bdbe3579fcd1bde1217cbe6b430e554"} Dec 03 15:24:18 crc kubenswrapper[4751]: I1203 15:24:18.364176 4751 scope.go:117] "RemoveContainer" 
containerID="575faeab859105060ca850e0f055102c5bdbe3579fcd1bde1217cbe6b430e554" Dec 03 15:24:18 crc kubenswrapper[4751]: I1203 15:24:18.591490 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fxk5b_must-gather-krlcs_108aeb3c-d5f1-4cab-80d4-9c55592975d6/gather/0.log" Dec 03 15:24:27 crc kubenswrapper[4751]: I1203 15:24:27.273552 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fxk5b/must-gather-krlcs"] Dec 03 15:24:27 crc kubenswrapper[4751]: I1203 15:24:27.274406 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-fxk5b/must-gather-krlcs" podUID="108aeb3c-d5f1-4cab-80d4-9c55592975d6" containerName="copy" containerID="cri-o://515e4af780d2dc2ee62c0c2305d150b0f8f7e3846b72d153c0ca2219a888b465" gracePeriod=2 Dec 03 15:24:27 crc kubenswrapper[4751]: I1203 15:24:27.284350 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fxk5b/must-gather-krlcs"] Dec 03 15:24:27 crc kubenswrapper[4751]: I1203 15:24:27.474496 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fxk5b_must-gather-krlcs_108aeb3c-d5f1-4cab-80d4-9c55592975d6/copy/0.log" Dec 03 15:24:27 crc kubenswrapper[4751]: I1203 15:24:27.474893 4751 generic.go:334] "Generic (PLEG): container finished" podID="108aeb3c-d5f1-4cab-80d4-9c55592975d6" containerID="515e4af780d2dc2ee62c0c2305d150b0f8f7e3846b72d153c0ca2219a888b465" exitCode=143 Dec 03 15:24:27 crc kubenswrapper[4751]: I1203 15:24:27.908958 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fxk5b_must-gather-krlcs_108aeb3c-d5f1-4cab-80d4-9c55592975d6/copy/0.log" Dec 03 15:24:27 crc kubenswrapper[4751]: I1203 15:24:27.910265 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fxk5b/must-gather-krlcs" Dec 03 15:24:27 crc kubenswrapper[4751]: I1203 15:24:27.996595 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/108aeb3c-d5f1-4cab-80d4-9c55592975d6-must-gather-output\") pod \"108aeb3c-d5f1-4cab-80d4-9c55592975d6\" (UID: \"108aeb3c-d5f1-4cab-80d4-9c55592975d6\") " Dec 03 15:24:27 crc kubenswrapper[4751]: I1203 15:24:27.996810 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-988s8\" (UniqueName: \"kubernetes.io/projected/108aeb3c-d5f1-4cab-80d4-9c55592975d6-kube-api-access-988s8\") pod \"108aeb3c-d5f1-4cab-80d4-9c55592975d6\" (UID: \"108aeb3c-d5f1-4cab-80d4-9c55592975d6\") " Dec 03 15:24:28 crc kubenswrapper[4751]: I1203 15:24:28.009813 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/108aeb3c-d5f1-4cab-80d4-9c55592975d6-kube-api-access-988s8" (OuterVolumeSpecName: "kube-api-access-988s8") pod "108aeb3c-d5f1-4cab-80d4-9c55592975d6" (UID: "108aeb3c-d5f1-4cab-80d4-9c55592975d6"). InnerVolumeSpecName "kube-api-access-988s8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:24:28 crc kubenswrapper[4751]: I1203 15:24:28.099472 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-988s8\" (UniqueName: \"kubernetes.io/projected/108aeb3c-d5f1-4cab-80d4-9c55592975d6-kube-api-access-988s8\") on node \"crc\" DevicePath \"\"" Dec 03 15:24:28 crc kubenswrapper[4751]: I1203 15:24:28.196783 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/108aeb3c-d5f1-4cab-80d4-9c55592975d6-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "108aeb3c-d5f1-4cab-80d4-9c55592975d6" (UID: "108aeb3c-d5f1-4cab-80d4-9c55592975d6"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:24:28 crc kubenswrapper[4751]: I1203 15:24:28.201474 4751 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/108aeb3c-d5f1-4cab-80d4-9c55592975d6-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 03 15:24:28 crc kubenswrapper[4751]: I1203 15:24:28.486094 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fxk5b_must-gather-krlcs_108aeb3c-d5f1-4cab-80d4-9c55592975d6/copy/0.log" Dec 03 15:24:28 crc kubenswrapper[4751]: I1203 15:24:28.486766 4751 scope.go:117] "RemoveContainer" containerID="515e4af780d2dc2ee62c0c2305d150b0f8f7e3846b72d153c0ca2219a888b465" Dec 03 15:24:28 crc kubenswrapper[4751]: I1203 15:24:28.486838 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fxk5b/must-gather-krlcs" Dec 03 15:24:28 crc kubenswrapper[4751]: I1203 15:24:28.507158 4751 scope.go:117] "RemoveContainer" containerID="575faeab859105060ca850e0f055102c5bdbe3579fcd1bde1217cbe6b430e554" Dec 03 15:24:29 crc kubenswrapper[4751]: I1203 15:24:29.327123 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="108aeb3c-d5f1-4cab-80d4-9c55592975d6" path="/var/lib/kubelet/pods/108aeb3c-d5f1-4cab-80d4-9c55592975d6/volumes" Dec 03 15:25:05 crc kubenswrapper[4751]: I1203 15:25:05.819464 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 15:25:05 crc kubenswrapper[4751]: I1203 15:25:05.819996 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 15:25:35 crc kubenswrapper[4751]: I1203 15:25:35.819748 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 15:25:35 crc kubenswrapper[4751]: I1203 15:25:35.820391 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 15:26:04 crc kubenswrapper[4751]: I1203 15:26:04.044482 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lncx7"] Dec 03 15:26:04 crc kubenswrapper[4751]: E1203 15:26:04.045854 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="108aeb3c-d5f1-4cab-80d4-9c55592975d6" containerName="copy" Dec 03 15:26:04 crc kubenswrapper[4751]: I1203 15:26:04.045873 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="108aeb3c-d5f1-4cab-80d4-9c55592975d6" containerName="copy" Dec 03 15:26:04 crc kubenswrapper[4751]: E1203 15:26:04.045904 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0eb6cb0-2d2b-40d8-84e4-b65008a00cea" containerName="registry-server" Dec 03 15:26:04 crc kubenswrapper[4751]: I1203 15:26:04.045916 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0eb6cb0-2d2b-40d8-84e4-b65008a00cea" containerName="registry-server" Dec 03 15:26:04 crc kubenswrapper[4751]: E1203 15:26:04.045938 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0eb6cb0-2d2b-40d8-84e4-b65008a00cea" 
containerName="extract-content" Dec 03 15:26:04 crc kubenswrapper[4751]: I1203 15:26:04.045949 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0eb6cb0-2d2b-40d8-84e4-b65008a00cea" containerName="extract-content" Dec 03 15:26:04 crc kubenswrapper[4751]: E1203 15:26:04.045962 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="108aeb3c-d5f1-4cab-80d4-9c55592975d6" containerName="gather" Dec 03 15:26:04 crc kubenswrapper[4751]: I1203 15:26:04.045969 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="108aeb3c-d5f1-4cab-80d4-9c55592975d6" containerName="gather" Dec 03 15:26:04 crc kubenswrapper[4751]: E1203 15:26:04.045987 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69079e90-e430-4f1c-837d-e178404465f5" containerName="registry-server" Dec 03 15:26:04 crc kubenswrapper[4751]: I1203 15:26:04.045995 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="69079e90-e430-4f1c-837d-e178404465f5" containerName="registry-server" Dec 03 15:26:04 crc kubenswrapper[4751]: E1203 15:26:04.046009 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69079e90-e430-4f1c-837d-e178404465f5" containerName="extract-utilities" Dec 03 15:26:04 crc kubenswrapper[4751]: I1203 15:26:04.046016 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="69079e90-e430-4f1c-837d-e178404465f5" containerName="extract-utilities" Dec 03 15:26:04 crc kubenswrapper[4751]: E1203 15:26:04.046027 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0eb6cb0-2d2b-40d8-84e4-b65008a00cea" containerName="extract-utilities" Dec 03 15:26:04 crc kubenswrapper[4751]: I1203 15:26:04.046033 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0eb6cb0-2d2b-40d8-84e4-b65008a00cea" containerName="extract-utilities" Dec 03 15:26:04 crc kubenswrapper[4751]: E1203 15:26:04.046043 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69079e90-e430-4f1c-837d-e178404465f5" 
containerName="extract-content" Dec 03 15:26:04 crc kubenswrapper[4751]: I1203 15:26:04.046050 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="69079e90-e430-4f1c-837d-e178404465f5" containerName="extract-content" Dec 03 15:26:04 crc kubenswrapper[4751]: I1203 15:26:04.046303 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="69079e90-e430-4f1c-837d-e178404465f5" containerName="registry-server" Dec 03 15:26:04 crc kubenswrapper[4751]: I1203 15:26:04.046352 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0eb6cb0-2d2b-40d8-84e4-b65008a00cea" containerName="registry-server" Dec 03 15:26:04 crc kubenswrapper[4751]: I1203 15:26:04.046366 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="108aeb3c-d5f1-4cab-80d4-9c55592975d6" containerName="gather" Dec 03 15:26:04 crc kubenswrapper[4751]: I1203 15:26:04.046379 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="108aeb3c-d5f1-4cab-80d4-9c55592975d6" containerName="copy" Dec 03 15:26:04 crc kubenswrapper[4751]: I1203 15:26:04.048342 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lncx7" Dec 03 15:26:04 crc kubenswrapper[4751]: I1203 15:26:04.069863 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lncx7"] Dec 03 15:26:04 crc kubenswrapper[4751]: I1203 15:26:04.221179 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzhp8\" (UniqueName: \"kubernetes.io/projected/6b3d4224-93fd-4f2b-bbfb-b6478e59df96-kube-api-access-vzhp8\") pod \"redhat-marketplace-lncx7\" (UID: \"6b3d4224-93fd-4f2b-bbfb-b6478e59df96\") " pod="openshift-marketplace/redhat-marketplace-lncx7" Dec 03 15:26:04 crc kubenswrapper[4751]: I1203 15:26:04.221387 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b3d4224-93fd-4f2b-bbfb-b6478e59df96-utilities\") pod \"redhat-marketplace-lncx7\" (UID: \"6b3d4224-93fd-4f2b-bbfb-b6478e59df96\") " pod="openshift-marketplace/redhat-marketplace-lncx7" Dec 03 15:26:04 crc kubenswrapper[4751]: I1203 15:26:04.221425 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b3d4224-93fd-4f2b-bbfb-b6478e59df96-catalog-content\") pod \"redhat-marketplace-lncx7\" (UID: \"6b3d4224-93fd-4f2b-bbfb-b6478e59df96\") " pod="openshift-marketplace/redhat-marketplace-lncx7" Dec 03 15:26:04 crc kubenswrapper[4751]: I1203 15:26:04.323295 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzhp8\" (UniqueName: \"kubernetes.io/projected/6b3d4224-93fd-4f2b-bbfb-b6478e59df96-kube-api-access-vzhp8\") pod \"redhat-marketplace-lncx7\" (UID: \"6b3d4224-93fd-4f2b-bbfb-b6478e59df96\") " pod="openshift-marketplace/redhat-marketplace-lncx7" Dec 03 15:26:04 crc kubenswrapper[4751]: I1203 15:26:04.323448 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b3d4224-93fd-4f2b-bbfb-b6478e59df96-utilities\") pod \"redhat-marketplace-lncx7\" (UID: \"6b3d4224-93fd-4f2b-bbfb-b6478e59df96\") " pod="openshift-marketplace/redhat-marketplace-lncx7" Dec 03 15:26:04 crc kubenswrapper[4751]: I1203 15:26:04.323474 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b3d4224-93fd-4f2b-bbfb-b6478e59df96-catalog-content\") pod \"redhat-marketplace-lncx7\" (UID: \"6b3d4224-93fd-4f2b-bbfb-b6478e59df96\") " pod="openshift-marketplace/redhat-marketplace-lncx7" Dec 03 15:26:04 crc kubenswrapper[4751]: I1203 15:26:04.323951 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b3d4224-93fd-4f2b-bbfb-b6478e59df96-catalog-content\") pod \"redhat-marketplace-lncx7\" (UID: \"6b3d4224-93fd-4f2b-bbfb-b6478e59df96\") " pod="openshift-marketplace/redhat-marketplace-lncx7" Dec 03 15:26:04 crc kubenswrapper[4751]: I1203 15:26:04.324030 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b3d4224-93fd-4f2b-bbfb-b6478e59df96-utilities\") pod \"redhat-marketplace-lncx7\" (UID: \"6b3d4224-93fd-4f2b-bbfb-b6478e59df96\") " pod="openshift-marketplace/redhat-marketplace-lncx7" Dec 03 15:26:04 crc kubenswrapper[4751]: I1203 15:26:04.350303 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzhp8\" (UniqueName: \"kubernetes.io/projected/6b3d4224-93fd-4f2b-bbfb-b6478e59df96-kube-api-access-vzhp8\") pod \"redhat-marketplace-lncx7\" (UID: \"6b3d4224-93fd-4f2b-bbfb-b6478e59df96\") " pod="openshift-marketplace/redhat-marketplace-lncx7" Dec 03 15:26:04 crc kubenswrapper[4751]: I1203 15:26:04.387728 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lncx7" Dec 03 15:26:04 crc kubenswrapper[4751]: I1203 15:26:04.913922 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lncx7"] Dec 03 15:26:05 crc kubenswrapper[4751]: I1203 15:26:05.481521 4751 generic.go:334] "Generic (PLEG): container finished" podID="6b3d4224-93fd-4f2b-bbfb-b6478e59df96" containerID="4d725acde84c902b5eca9207784ab96ba49461c0e3216d21a18ab882d9fa9036" exitCode=0 Dec 03 15:26:05 crc kubenswrapper[4751]: I1203 15:26:05.481617 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lncx7" event={"ID":"6b3d4224-93fd-4f2b-bbfb-b6478e59df96","Type":"ContainerDied","Data":"4d725acde84c902b5eca9207784ab96ba49461c0e3216d21a18ab882d9fa9036"} Dec 03 15:26:05 crc kubenswrapper[4751]: I1203 15:26:05.481873 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lncx7" event={"ID":"6b3d4224-93fd-4f2b-bbfb-b6478e59df96","Type":"ContainerStarted","Data":"37824580b2213d629c32d1f80899d1066291c1c26dd6f77c3686cb06d7191602"} Dec 03 15:26:05 crc kubenswrapper[4751]: I1203 15:26:05.484498 4751 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 15:26:05 crc kubenswrapper[4751]: I1203 15:26:05.820032 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 15:26:05 crc kubenswrapper[4751]: I1203 15:26:05.820135 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 15:26:05 crc kubenswrapper[4751]: I1203 15:26:05.820198 4751 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-djf67" Dec 03 15:26:05 crc kubenswrapper[4751]: I1203 15:26:05.821069 4751 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a0be2e6b634e60ad28b976fe586002efe94eaf26e24cfc54b5fd477b10cdbf97"} pod="openshift-machine-config-operator/machine-config-daemon-djf67" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 15:26:05 crc kubenswrapper[4751]: I1203 15:26:05.821138 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" containerID="cri-o://a0be2e6b634e60ad28b976fe586002efe94eaf26e24cfc54b5fd477b10cdbf97" gracePeriod=600 Dec 03 15:26:06 crc kubenswrapper[4751]: I1203 15:26:06.496199 4751 generic.go:334] "Generic (PLEG): container finished" podID="385620eb-744d-423e-b02b-1274f3075689" containerID="a0be2e6b634e60ad28b976fe586002efe94eaf26e24cfc54b5fd477b10cdbf97" exitCode=0 Dec 03 15:26:06 crc kubenswrapper[4751]: I1203 15:26:06.496771 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerDied","Data":"a0be2e6b634e60ad28b976fe586002efe94eaf26e24cfc54b5fd477b10cdbf97"} Dec 03 15:26:06 crc kubenswrapper[4751]: I1203 15:26:06.497072 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" 
event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerStarted","Data":"1a7d3374c595269f4a9d24e338e5e9eaf99cda12b8a6b0ee51fd39d900f10642"} Dec 03 15:26:06 crc kubenswrapper[4751]: I1203 15:26:06.497100 4751 scope.go:117] "RemoveContainer" containerID="63ada114a3c0d13c65067734e76c0573932a816604c9a01a3d2887bb9ea85dda" Dec 03 15:26:06 crc kubenswrapper[4751]: I1203 15:26:06.514688 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lncx7" event={"ID":"6b3d4224-93fd-4f2b-bbfb-b6478e59df96","Type":"ContainerStarted","Data":"25302a198c99b001c962f68c17c00c29b74b059ffe77dbc15a8cb23b8561ae70"} Dec 03 15:26:07 crc kubenswrapper[4751]: I1203 15:26:07.532617 4751 generic.go:334] "Generic (PLEG): container finished" podID="6b3d4224-93fd-4f2b-bbfb-b6478e59df96" containerID="25302a198c99b001c962f68c17c00c29b74b059ffe77dbc15a8cb23b8561ae70" exitCode=0 Dec 03 15:26:07 crc kubenswrapper[4751]: I1203 15:26:07.532695 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lncx7" event={"ID":"6b3d4224-93fd-4f2b-bbfb-b6478e59df96","Type":"ContainerDied","Data":"25302a198c99b001c962f68c17c00c29b74b059ffe77dbc15a8cb23b8561ae70"} Dec 03 15:26:08 crc kubenswrapper[4751]: I1203 15:26:08.547159 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lncx7" event={"ID":"6b3d4224-93fd-4f2b-bbfb-b6478e59df96","Type":"ContainerStarted","Data":"e7df54ef6e76b27cde02baf43e858de7e5f8a1e23afd71a2dfafafe886566aaa"} Dec 03 15:26:08 crc kubenswrapper[4751]: I1203 15:26:08.575902 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lncx7" podStartSLOduration=2.136361906 podStartE2EDuration="4.575876929s" podCreationTimestamp="2025-12-03 15:26:04 +0000 UTC" firstStartedPulling="2025-12-03 15:26:05.484264478 +0000 UTC m=+4372.472619695" lastFinishedPulling="2025-12-03 15:26:07.923779501 +0000 
UTC m=+4374.912134718" observedRunningTime="2025-12-03 15:26:08.568276227 +0000 UTC m=+4375.556631464" watchObservedRunningTime="2025-12-03 15:26:08.575876929 +0000 UTC m=+4375.564232146" Dec 03 15:26:14 crc kubenswrapper[4751]: I1203 15:26:14.388064 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lncx7" Dec 03 15:26:14 crc kubenswrapper[4751]: I1203 15:26:14.389542 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lncx7" Dec 03 15:26:14 crc kubenswrapper[4751]: I1203 15:26:14.437342 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lncx7" Dec 03 15:26:14 crc kubenswrapper[4751]: I1203 15:26:14.676454 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lncx7" Dec 03 15:26:14 crc kubenswrapper[4751]: I1203 15:26:14.730452 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lncx7"] Dec 03 15:26:16 crc kubenswrapper[4751]: I1203 15:26:16.647440 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lncx7" podUID="6b3d4224-93fd-4f2b-bbfb-b6478e59df96" containerName="registry-server" containerID="cri-o://e7df54ef6e76b27cde02baf43e858de7e5f8a1e23afd71a2dfafafe886566aaa" gracePeriod=2 Dec 03 15:26:17 crc kubenswrapper[4751]: I1203 15:26:17.359602 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lncx7" Dec 03 15:26:17 crc kubenswrapper[4751]: I1203 15:26:17.531641 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b3d4224-93fd-4f2b-bbfb-b6478e59df96-catalog-content\") pod \"6b3d4224-93fd-4f2b-bbfb-b6478e59df96\" (UID: \"6b3d4224-93fd-4f2b-bbfb-b6478e59df96\") " Dec 03 15:26:17 crc kubenswrapper[4751]: I1203 15:26:17.531969 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzhp8\" (UniqueName: \"kubernetes.io/projected/6b3d4224-93fd-4f2b-bbfb-b6478e59df96-kube-api-access-vzhp8\") pod \"6b3d4224-93fd-4f2b-bbfb-b6478e59df96\" (UID: \"6b3d4224-93fd-4f2b-bbfb-b6478e59df96\") " Dec 03 15:26:17 crc kubenswrapper[4751]: I1203 15:26:17.532003 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b3d4224-93fd-4f2b-bbfb-b6478e59df96-utilities\") pod \"6b3d4224-93fd-4f2b-bbfb-b6478e59df96\" (UID: \"6b3d4224-93fd-4f2b-bbfb-b6478e59df96\") " Dec 03 15:26:17 crc kubenswrapper[4751]: I1203 15:26:17.533266 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b3d4224-93fd-4f2b-bbfb-b6478e59df96-utilities" (OuterVolumeSpecName: "utilities") pod "6b3d4224-93fd-4f2b-bbfb-b6478e59df96" (UID: "6b3d4224-93fd-4f2b-bbfb-b6478e59df96"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:26:17 crc kubenswrapper[4751]: I1203 15:26:17.546714 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b3d4224-93fd-4f2b-bbfb-b6478e59df96-kube-api-access-vzhp8" (OuterVolumeSpecName: "kube-api-access-vzhp8") pod "6b3d4224-93fd-4f2b-bbfb-b6478e59df96" (UID: "6b3d4224-93fd-4f2b-bbfb-b6478e59df96"). InnerVolumeSpecName "kube-api-access-vzhp8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:26:17 crc kubenswrapper[4751]: I1203 15:26:17.559799 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b3d4224-93fd-4f2b-bbfb-b6478e59df96-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b3d4224-93fd-4f2b-bbfb-b6478e59df96" (UID: "6b3d4224-93fd-4f2b-bbfb-b6478e59df96"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:26:17 crc kubenswrapper[4751]: I1203 15:26:17.635033 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b3d4224-93fd-4f2b-bbfb-b6478e59df96-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 15:26:17 crc kubenswrapper[4751]: I1203 15:26:17.635098 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzhp8\" (UniqueName: \"kubernetes.io/projected/6b3d4224-93fd-4f2b-bbfb-b6478e59df96-kube-api-access-vzhp8\") on node \"crc\" DevicePath \"\"" Dec 03 15:26:17 crc kubenswrapper[4751]: I1203 15:26:17.635113 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b3d4224-93fd-4f2b-bbfb-b6478e59df96-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 15:26:17 crc kubenswrapper[4751]: I1203 15:26:17.664616 4751 generic.go:334] "Generic (PLEG): container finished" podID="6b3d4224-93fd-4f2b-bbfb-b6478e59df96" containerID="e7df54ef6e76b27cde02baf43e858de7e5f8a1e23afd71a2dfafafe886566aaa" exitCode=0 Dec 03 15:26:17 crc kubenswrapper[4751]: I1203 15:26:17.664676 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lncx7" event={"ID":"6b3d4224-93fd-4f2b-bbfb-b6478e59df96","Type":"ContainerDied","Data":"e7df54ef6e76b27cde02baf43e858de7e5f8a1e23afd71a2dfafafe886566aaa"} Dec 03 15:26:17 crc kubenswrapper[4751]: I1203 15:26:17.664715 4751 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-lncx7" event={"ID":"6b3d4224-93fd-4f2b-bbfb-b6478e59df96","Type":"ContainerDied","Data":"37824580b2213d629c32d1f80899d1066291c1c26dd6f77c3686cb06d7191602"} Dec 03 15:26:17 crc kubenswrapper[4751]: I1203 15:26:17.664737 4751 scope.go:117] "RemoveContainer" containerID="e7df54ef6e76b27cde02baf43e858de7e5f8a1e23afd71a2dfafafe886566aaa" Dec 03 15:26:17 crc kubenswrapper[4751]: I1203 15:26:17.664769 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lncx7" Dec 03 15:26:17 crc kubenswrapper[4751]: I1203 15:26:17.697582 4751 scope.go:117] "RemoveContainer" containerID="25302a198c99b001c962f68c17c00c29b74b059ffe77dbc15a8cb23b8561ae70" Dec 03 15:26:17 crc kubenswrapper[4751]: I1203 15:26:17.710719 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lncx7"] Dec 03 15:26:17 crc kubenswrapper[4751]: I1203 15:26:17.723201 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lncx7"] Dec 03 15:26:17 crc kubenswrapper[4751]: I1203 15:26:17.732267 4751 scope.go:117] "RemoveContainer" containerID="4d725acde84c902b5eca9207784ab96ba49461c0e3216d21a18ab882d9fa9036" Dec 03 15:26:17 crc kubenswrapper[4751]: I1203 15:26:17.772252 4751 scope.go:117] "RemoveContainer" containerID="e7df54ef6e76b27cde02baf43e858de7e5f8a1e23afd71a2dfafafe886566aaa" Dec 03 15:26:17 crc kubenswrapper[4751]: E1203 15:26:17.773064 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7df54ef6e76b27cde02baf43e858de7e5f8a1e23afd71a2dfafafe886566aaa\": container with ID starting with e7df54ef6e76b27cde02baf43e858de7e5f8a1e23afd71a2dfafafe886566aaa not found: ID does not exist" containerID="e7df54ef6e76b27cde02baf43e858de7e5f8a1e23afd71a2dfafafe886566aaa" Dec 03 15:26:17 crc kubenswrapper[4751]: I1203 15:26:17.773122 4751 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7df54ef6e76b27cde02baf43e858de7e5f8a1e23afd71a2dfafafe886566aaa"} err="failed to get container status \"e7df54ef6e76b27cde02baf43e858de7e5f8a1e23afd71a2dfafafe886566aaa\": rpc error: code = NotFound desc = could not find container \"e7df54ef6e76b27cde02baf43e858de7e5f8a1e23afd71a2dfafafe886566aaa\": container with ID starting with e7df54ef6e76b27cde02baf43e858de7e5f8a1e23afd71a2dfafafe886566aaa not found: ID does not exist" Dec 03 15:26:17 crc kubenswrapper[4751]: I1203 15:26:17.773161 4751 scope.go:117] "RemoveContainer" containerID="25302a198c99b001c962f68c17c00c29b74b059ffe77dbc15a8cb23b8561ae70" Dec 03 15:26:17 crc kubenswrapper[4751]: E1203 15:26:17.773948 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25302a198c99b001c962f68c17c00c29b74b059ffe77dbc15a8cb23b8561ae70\": container with ID starting with 25302a198c99b001c962f68c17c00c29b74b059ffe77dbc15a8cb23b8561ae70 not found: ID does not exist" containerID="25302a198c99b001c962f68c17c00c29b74b059ffe77dbc15a8cb23b8561ae70" Dec 03 15:26:17 crc kubenswrapper[4751]: I1203 15:26:17.773990 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25302a198c99b001c962f68c17c00c29b74b059ffe77dbc15a8cb23b8561ae70"} err="failed to get container status \"25302a198c99b001c962f68c17c00c29b74b059ffe77dbc15a8cb23b8561ae70\": rpc error: code = NotFound desc = could not find container \"25302a198c99b001c962f68c17c00c29b74b059ffe77dbc15a8cb23b8561ae70\": container with ID starting with 25302a198c99b001c962f68c17c00c29b74b059ffe77dbc15a8cb23b8561ae70 not found: ID does not exist" Dec 03 15:26:17 crc kubenswrapper[4751]: I1203 15:26:17.774017 4751 scope.go:117] "RemoveContainer" containerID="4d725acde84c902b5eca9207784ab96ba49461c0e3216d21a18ab882d9fa9036" Dec 03 15:26:17 crc kubenswrapper[4751]: E1203 
15:26:17.774513 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d725acde84c902b5eca9207784ab96ba49461c0e3216d21a18ab882d9fa9036\": container with ID starting with 4d725acde84c902b5eca9207784ab96ba49461c0e3216d21a18ab882d9fa9036 not found: ID does not exist" containerID="4d725acde84c902b5eca9207784ab96ba49461c0e3216d21a18ab882d9fa9036" Dec 03 15:26:17 crc kubenswrapper[4751]: I1203 15:26:17.774549 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d725acde84c902b5eca9207784ab96ba49461c0e3216d21a18ab882d9fa9036"} err="failed to get container status \"4d725acde84c902b5eca9207784ab96ba49461c0e3216d21a18ab882d9fa9036\": rpc error: code = NotFound desc = could not find container \"4d725acde84c902b5eca9207784ab96ba49461c0e3216d21a18ab882d9fa9036\": container with ID starting with 4d725acde84c902b5eca9207784ab96ba49461c0e3216d21a18ab882d9fa9036 not found: ID does not exist" Dec 03 15:26:19 crc kubenswrapper[4751]: I1203 15:26:19.326872 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b3d4224-93fd-4f2b-bbfb-b6478e59df96" path="/var/lib/kubelet/pods/6b3d4224-93fd-4f2b-bbfb-b6478e59df96/volumes" Dec 03 15:27:32 crc kubenswrapper[4751]: I1203 15:27:32.058570 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jhrqk/must-gather-twrb7"] Dec 03 15:27:32 crc kubenswrapper[4751]: E1203 15:27:32.059645 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b3d4224-93fd-4f2b-bbfb-b6478e59df96" containerName="extract-content" Dec 03 15:27:32 crc kubenswrapper[4751]: I1203 15:27:32.059666 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b3d4224-93fd-4f2b-bbfb-b6478e59df96" containerName="extract-content" Dec 03 15:27:32 crc kubenswrapper[4751]: E1203 15:27:32.059722 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b3d4224-93fd-4f2b-bbfb-b6478e59df96" 
containerName="extract-utilities" Dec 03 15:27:32 crc kubenswrapper[4751]: I1203 15:27:32.059731 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b3d4224-93fd-4f2b-bbfb-b6478e59df96" containerName="extract-utilities" Dec 03 15:27:32 crc kubenswrapper[4751]: E1203 15:27:32.059747 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b3d4224-93fd-4f2b-bbfb-b6478e59df96" containerName="registry-server" Dec 03 15:27:32 crc kubenswrapper[4751]: I1203 15:27:32.059756 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b3d4224-93fd-4f2b-bbfb-b6478e59df96" containerName="registry-server" Dec 03 15:27:32 crc kubenswrapper[4751]: I1203 15:27:32.060039 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b3d4224-93fd-4f2b-bbfb-b6478e59df96" containerName="registry-server" Dec 03 15:27:32 crc kubenswrapper[4751]: I1203 15:27:32.061609 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jhrqk/must-gather-twrb7" Dec 03 15:27:32 crc kubenswrapper[4751]: I1203 15:27:32.072806 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jhrqk"/"kube-root-ca.crt" Dec 03 15:27:32 crc kubenswrapper[4751]: I1203 15:27:32.081117 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jhrqk"/"openshift-service-ca.crt" Dec 03 15:27:32 crc kubenswrapper[4751]: I1203 15:27:32.110087 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jhrqk/must-gather-twrb7"] Dec 03 15:27:32 crc kubenswrapper[4751]: I1203 15:27:32.179240 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnx6j\" (UniqueName: \"kubernetes.io/projected/70603fe9-149b-4400-b573-52e0a3ecc142-kube-api-access-lnx6j\") pod \"must-gather-twrb7\" (UID: \"70603fe9-149b-4400-b573-52e0a3ecc142\") " pod="openshift-must-gather-jhrqk/must-gather-twrb7" Dec 03 15:27:32 crc 
kubenswrapper[4751]: I1203 15:27:32.179438 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/70603fe9-149b-4400-b573-52e0a3ecc142-must-gather-output\") pod \"must-gather-twrb7\" (UID: \"70603fe9-149b-4400-b573-52e0a3ecc142\") " pod="openshift-must-gather-jhrqk/must-gather-twrb7" Dec 03 15:27:32 crc kubenswrapper[4751]: I1203 15:27:32.281623 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnx6j\" (UniqueName: \"kubernetes.io/projected/70603fe9-149b-4400-b573-52e0a3ecc142-kube-api-access-lnx6j\") pod \"must-gather-twrb7\" (UID: \"70603fe9-149b-4400-b573-52e0a3ecc142\") " pod="openshift-must-gather-jhrqk/must-gather-twrb7" Dec 03 15:27:32 crc kubenswrapper[4751]: I1203 15:27:32.281747 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/70603fe9-149b-4400-b573-52e0a3ecc142-must-gather-output\") pod \"must-gather-twrb7\" (UID: \"70603fe9-149b-4400-b573-52e0a3ecc142\") " pod="openshift-must-gather-jhrqk/must-gather-twrb7" Dec 03 15:27:32 crc kubenswrapper[4751]: I1203 15:27:32.282238 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/70603fe9-149b-4400-b573-52e0a3ecc142-must-gather-output\") pod \"must-gather-twrb7\" (UID: \"70603fe9-149b-4400-b573-52e0a3ecc142\") " pod="openshift-must-gather-jhrqk/must-gather-twrb7" Dec 03 15:27:32 crc kubenswrapper[4751]: I1203 15:27:32.957234 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnx6j\" (UniqueName: \"kubernetes.io/projected/70603fe9-149b-4400-b573-52e0a3ecc142-kube-api-access-lnx6j\") pod \"must-gather-twrb7\" (UID: \"70603fe9-149b-4400-b573-52e0a3ecc142\") " pod="openshift-must-gather-jhrqk/must-gather-twrb7" Dec 03 15:27:32 crc 
kubenswrapper[4751]: I1203 15:27:32.986218 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jhrqk/must-gather-twrb7" Dec 03 15:27:33 crc kubenswrapper[4751]: I1203 15:27:33.442465 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jhrqk/must-gather-twrb7"] Dec 03 15:27:33 crc kubenswrapper[4751]: I1203 15:27:33.517574 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jhrqk/must-gather-twrb7" event={"ID":"70603fe9-149b-4400-b573-52e0a3ecc142","Type":"ContainerStarted","Data":"e0ef5d67ee511029b9dfdcbc9d921b6e46383589564fb2b1f6495325aec17379"} Dec 03 15:27:34 crc kubenswrapper[4751]: I1203 15:27:34.533599 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jhrqk/must-gather-twrb7" event={"ID":"70603fe9-149b-4400-b573-52e0a3ecc142","Type":"ContainerStarted","Data":"12fb22ec0f386f7bf04cc4b943b3c082780e8a4acf223eb61261c9a61c1081c0"} Dec 03 15:27:34 crc kubenswrapper[4751]: I1203 15:27:34.533998 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jhrqk/must-gather-twrb7" event={"ID":"70603fe9-149b-4400-b573-52e0a3ecc142","Type":"ContainerStarted","Data":"881fe3a7497afbccc42395af2e03f4a616c0edb0e6699e38daebbc7ff55471e6"} Dec 03 15:27:34 crc kubenswrapper[4751]: I1203 15:27:34.577170 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jhrqk/must-gather-twrb7" podStartSLOduration=2.577147213 podStartE2EDuration="2.577147213s" podCreationTimestamp="2025-12-03 15:27:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 15:27:34.564198136 +0000 UTC m=+4461.552553363" watchObservedRunningTime="2025-12-03 15:27:34.577147213 +0000 UTC m=+4461.565502430" Dec 03 15:27:37 crc kubenswrapper[4751]: I1203 15:27:37.494251 4751 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-jhrqk/crc-debug-skcpd"] Dec 03 15:27:37 crc kubenswrapper[4751]: I1203 15:27:37.497592 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jhrqk/crc-debug-skcpd" Dec 03 15:27:37 crc kubenswrapper[4751]: I1203 15:27:37.500484 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-jhrqk"/"default-dockercfg-652g9" Dec 03 15:27:37 crc kubenswrapper[4751]: I1203 15:27:37.594020 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6xjf\" (UniqueName: \"kubernetes.io/projected/6bf3df1a-56a5-4601-8ee5-606c8b143268-kube-api-access-z6xjf\") pod \"crc-debug-skcpd\" (UID: \"6bf3df1a-56a5-4601-8ee5-606c8b143268\") " pod="openshift-must-gather-jhrqk/crc-debug-skcpd" Dec 03 15:27:37 crc kubenswrapper[4751]: I1203 15:27:37.594082 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6bf3df1a-56a5-4601-8ee5-606c8b143268-host\") pod \"crc-debug-skcpd\" (UID: \"6bf3df1a-56a5-4601-8ee5-606c8b143268\") " pod="openshift-must-gather-jhrqk/crc-debug-skcpd" Dec 03 15:27:37 crc kubenswrapper[4751]: I1203 15:27:37.697009 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6xjf\" (UniqueName: \"kubernetes.io/projected/6bf3df1a-56a5-4601-8ee5-606c8b143268-kube-api-access-z6xjf\") pod \"crc-debug-skcpd\" (UID: \"6bf3df1a-56a5-4601-8ee5-606c8b143268\") " pod="openshift-must-gather-jhrqk/crc-debug-skcpd" Dec 03 15:27:37 crc kubenswrapper[4751]: I1203 15:27:37.697100 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6bf3df1a-56a5-4601-8ee5-606c8b143268-host\") pod \"crc-debug-skcpd\" (UID: \"6bf3df1a-56a5-4601-8ee5-606c8b143268\") " pod="openshift-must-gather-jhrqk/crc-debug-skcpd" Dec 03 15:27:37 crc 
kubenswrapper[4751]: I1203 15:27:37.697539 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6bf3df1a-56a5-4601-8ee5-606c8b143268-host\") pod \"crc-debug-skcpd\" (UID: \"6bf3df1a-56a5-4601-8ee5-606c8b143268\") " pod="openshift-must-gather-jhrqk/crc-debug-skcpd" Dec 03 15:27:37 crc kubenswrapper[4751]: I1203 15:27:37.724087 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6xjf\" (UniqueName: \"kubernetes.io/projected/6bf3df1a-56a5-4601-8ee5-606c8b143268-kube-api-access-z6xjf\") pod \"crc-debug-skcpd\" (UID: \"6bf3df1a-56a5-4601-8ee5-606c8b143268\") " pod="openshift-must-gather-jhrqk/crc-debug-skcpd" Dec 03 15:27:37 crc kubenswrapper[4751]: I1203 15:27:37.827216 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jhrqk/crc-debug-skcpd" Dec 03 15:27:37 crc kubenswrapper[4751]: W1203 15:27:37.861501 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bf3df1a_56a5_4601_8ee5_606c8b143268.slice/crio-359e16733d41084e63cf9a4a6b98998b58dbed8987ad17e8cdb053a034333bf0 WatchSource:0}: Error finding container 359e16733d41084e63cf9a4a6b98998b58dbed8987ad17e8cdb053a034333bf0: Status 404 returned error can't find the container with id 359e16733d41084e63cf9a4a6b98998b58dbed8987ad17e8cdb053a034333bf0 Dec 03 15:27:38 crc kubenswrapper[4751]: I1203 15:27:38.571421 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jhrqk/crc-debug-skcpd" event={"ID":"6bf3df1a-56a5-4601-8ee5-606c8b143268","Type":"ContainerStarted","Data":"54b6194ac866a8d6a0f7810ce58fd1e1d781ee6bca81fe855009dae8acf16820"} Dec 03 15:27:38 crc kubenswrapper[4751]: I1203 15:27:38.572287 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jhrqk/crc-debug-skcpd" 
event={"ID":"6bf3df1a-56a5-4601-8ee5-606c8b143268","Type":"ContainerStarted","Data":"359e16733d41084e63cf9a4a6b98998b58dbed8987ad17e8cdb053a034333bf0"} Dec 03 15:27:38 crc kubenswrapper[4751]: I1203 15:27:38.602177 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jhrqk/crc-debug-skcpd" podStartSLOduration=1.6020983370000001 podStartE2EDuration="1.602098337s" podCreationTimestamp="2025-12-03 15:27:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 15:27:38.586674233 +0000 UTC m=+4465.575029460" watchObservedRunningTime="2025-12-03 15:27:38.602098337 +0000 UTC m=+4465.590453664" Dec 03 15:28:30 crc kubenswrapper[4751]: I1203 15:28:30.393020 4751 generic.go:334] "Generic (PLEG): container finished" podID="6bf3df1a-56a5-4601-8ee5-606c8b143268" containerID="54b6194ac866a8d6a0f7810ce58fd1e1d781ee6bca81fe855009dae8acf16820" exitCode=0 Dec 03 15:28:30 crc kubenswrapper[4751]: I1203 15:28:30.393105 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jhrqk/crc-debug-skcpd" event={"ID":"6bf3df1a-56a5-4601-8ee5-606c8b143268","Type":"ContainerDied","Data":"54b6194ac866a8d6a0f7810ce58fd1e1d781ee6bca81fe855009dae8acf16820"} Dec 03 15:28:31 crc kubenswrapper[4751]: I1203 15:28:31.539647 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jhrqk/crc-debug-skcpd" Dec 03 15:28:31 crc kubenswrapper[4751]: I1203 15:28:31.585755 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jhrqk/crc-debug-skcpd"] Dec 03 15:28:31 crc kubenswrapper[4751]: I1203 15:28:31.600586 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6bf3df1a-56a5-4601-8ee5-606c8b143268-host\") pod \"6bf3df1a-56a5-4601-8ee5-606c8b143268\" (UID: \"6bf3df1a-56a5-4601-8ee5-606c8b143268\") " Dec 03 15:28:31 crc kubenswrapper[4751]: I1203 15:28:31.600660 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6xjf\" (UniqueName: \"kubernetes.io/projected/6bf3df1a-56a5-4601-8ee5-606c8b143268-kube-api-access-z6xjf\") pod \"6bf3df1a-56a5-4601-8ee5-606c8b143268\" (UID: \"6bf3df1a-56a5-4601-8ee5-606c8b143268\") " Dec 03 15:28:31 crc kubenswrapper[4751]: I1203 15:28:31.600744 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6bf3df1a-56a5-4601-8ee5-606c8b143268-host" (OuterVolumeSpecName: "host") pod "6bf3df1a-56a5-4601-8ee5-606c8b143268" (UID: "6bf3df1a-56a5-4601-8ee5-606c8b143268"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 15:28:31 crc kubenswrapper[4751]: I1203 15:28:31.600771 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jhrqk/crc-debug-skcpd"] Dec 03 15:28:31 crc kubenswrapper[4751]: I1203 15:28:31.601306 4751 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6bf3df1a-56a5-4601-8ee5-606c8b143268-host\") on node \"crc\" DevicePath \"\"" Dec 03 15:28:31 crc kubenswrapper[4751]: I1203 15:28:31.609089 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bf3df1a-56a5-4601-8ee5-606c8b143268-kube-api-access-z6xjf" (OuterVolumeSpecName: "kube-api-access-z6xjf") pod "6bf3df1a-56a5-4601-8ee5-606c8b143268" (UID: "6bf3df1a-56a5-4601-8ee5-606c8b143268"). InnerVolumeSpecName "kube-api-access-z6xjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:28:31 crc kubenswrapper[4751]: I1203 15:28:31.703864 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6xjf\" (UniqueName: \"kubernetes.io/projected/6bf3df1a-56a5-4601-8ee5-606c8b143268-kube-api-access-z6xjf\") on node \"crc\" DevicePath \"\"" Dec 03 15:28:32 crc kubenswrapper[4751]: I1203 15:28:32.414045 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="359e16733d41084e63cf9a4a6b98998b58dbed8987ad17e8cdb053a034333bf0" Dec 03 15:28:32 crc kubenswrapper[4751]: I1203 15:28:32.414115 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jhrqk/crc-debug-skcpd" Dec 03 15:28:32 crc kubenswrapper[4751]: I1203 15:28:32.780830 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jhrqk/crc-debug-8dvh7"] Dec 03 15:28:32 crc kubenswrapper[4751]: E1203 15:28:32.782802 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bf3df1a-56a5-4601-8ee5-606c8b143268" containerName="container-00" Dec 03 15:28:32 crc kubenswrapper[4751]: I1203 15:28:32.782833 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bf3df1a-56a5-4601-8ee5-606c8b143268" containerName="container-00" Dec 03 15:28:32 crc kubenswrapper[4751]: I1203 15:28:32.783061 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bf3df1a-56a5-4601-8ee5-606c8b143268" containerName="container-00" Dec 03 15:28:32 crc kubenswrapper[4751]: I1203 15:28:32.783913 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jhrqk/crc-debug-8dvh7" Dec 03 15:28:32 crc kubenswrapper[4751]: I1203 15:28:32.786531 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-jhrqk"/"default-dockercfg-652g9" Dec 03 15:28:32 crc kubenswrapper[4751]: I1203 15:28:32.929235 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rqzl\" (UniqueName: \"kubernetes.io/projected/e1e33da4-efdf-43eb-9bcb-dad33839bba4-kube-api-access-5rqzl\") pod \"crc-debug-8dvh7\" (UID: \"e1e33da4-efdf-43eb-9bcb-dad33839bba4\") " pod="openshift-must-gather-jhrqk/crc-debug-8dvh7" Dec 03 15:28:32 crc kubenswrapper[4751]: I1203 15:28:32.929741 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e1e33da4-efdf-43eb-9bcb-dad33839bba4-host\") pod \"crc-debug-8dvh7\" (UID: \"e1e33da4-efdf-43eb-9bcb-dad33839bba4\") " 
pod="openshift-must-gather-jhrqk/crc-debug-8dvh7" Dec 03 15:28:33 crc kubenswrapper[4751]: I1203 15:28:33.032129 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rqzl\" (UniqueName: \"kubernetes.io/projected/e1e33da4-efdf-43eb-9bcb-dad33839bba4-kube-api-access-5rqzl\") pod \"crc-debug-8dvh7\" (UID: \"e1e33da4-efdf-43eb-9bcb-dad33839bba4\") " pod="openshift-must-gather-jhrqk/crc-debug-8dvh7" Dec 03 15:28:33 crc kubenswrapper[4751]: I1203 15:28:33.032288 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e1e33da4-efdf-43eb-9bcb-dad33839bba4-host\") pod \"crc-debug-8dvh7\" (UID: \"e1e33da4-efdf-43eb-9bcb-dad33839bba4\") " pod="openshift-must-gather-jhrqk/crc-debug-8dvh7" Dec 03 15:28:33 crc kubenswrapper[4751]: I1203 15:28:33.032503 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e1e33da4-efdf-43eb-9bcb-dad33839bba4-host\") pod \"crc-debug-8dvh7\" (UID: \"e1e33da4-efdf-43eb-9bcb-dad33839bba4\") " pod="openshift-must-gather-jhrqk/crc-debug-8dvh7" Dec 03 15:28:33 crc kubenswrapper[4751]: I1203 15:28:33.067305 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rqzl\" (UniqueName: \"kubernetes.io/projected/e1e33da4-efdf-43eb-9bcb-dad33839bba4-kube-api-access-5rqzl\") pod \"crc-debug-8dvh7\" (UID: \"e1e33da4-efdf-43eb-9bcb-dad33839bba4\") " pod="openshift-must-gather-jhrqk/crc-debug-8dvh7" Dec 03 15:28:33 crc kubenswrapper[4751]: I1203 15:28:33.104111 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jhrqk/crc-debug-8dvh7" Dec 03 15:28:33 crc kubenswrapper[4751]: I1203 15:28:33.328779 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bf3df1a-56a5-4601-8ee5-606c8b143268" path="/var/lib/kubelet/pods/6bf3df1a-56a5-4601-8ee5-606c8b143268/volumes" Dec 03 15:28:33 crc kubenswrapper[4751]: I1203 15:28:33.425944 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jhrqk/crc-debug-8dvh7" event={"ID":"e1e33da4-efdf-43eb-9bcb-dad33839bba4","Type":"ContainerStarted","Data":"09dce35f2e2b5c453fe157d4107c5b2e2715c79bb3837fe974e833288a83f6ee"} Dec 03 15:28:33 crc kubenswrapper[4751]: I1203 15:28:33.425993 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jhrqk/crc-debug-8dvh7" event={"ID":"e1e33da4-efdf-43eb-9bcb-dad33839bba4","Type":"ContainerStarted","Data":"2881837cb62a7ebed9bde6ada6a7fa67d37422925470395380b6ba130615ab1a"} Dec 03 15:28:33 crc kubenswrapper[4751]: I1203 15:28:33.454127 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jhrqk/crc-debug-8dvh7" podStartSLOduration=1.454107074 podStartE2EDuration="1.454107074s" podCreationTimestamp="2025-12-03 15:28:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 15:28:33.446030377 +0000 UTC m=+4520.434385594" watchObservedRunningTime="2025-12-03 15:28:33.454107074 +0000 UTC m=+4520.442462291" Dec 03 15:28:34 crc kubenswrapper[4751]: I1203 15:28:34.106714 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-shx2q"] Dec 03 15:28:34 crc kubenswrapper[4751]: I1203 15:28:34.109778 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-shx2q" Dec 03 15:28:34 crc kubenswrapper[4751]: I1203 15:28:34.138271 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-shx2q"] Dec 03 15:28:34 crc kubenswrapper[4751]: I1203 15:28:34.154035 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl8xn\" (UniqueName: \"kubernetes.io/projected/3cafb339-3c19-48b7-8012-27f57e539659-kube-api-access-nl8xn\") pod \"certified-operators-shx2q\" (UID: \"3cafb339-3c19-48b7-8012-27f57e539659\") " pod="openshift-marketplace/certified-operators-shx2q" Dec 03 15:28:34 crc kubenswrapper[4751]: I1203 15:28:34.155235 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cafb339-3c19-48b7-8012-27f57e539659-catalog-content\") pod \"certified-operators-shx2q\" (UID: \"3cafb339-3c19-48b7-8012-27f57e539659\") " pod="openshift-marketplace/certified-operators-shx2q" Dec 03 15:28:34 crc kubenswrapper[4751]: I1203 15:28:34.155412 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cafb339-3c19-48b7-8012-27f57e539659-utilities\") pod \"certified-operators-shx2q\" (UID: \"3cafb339-3c19-48b7-8012-27f57e539659\") " pod="openshift-marketplace/certified-operators-shx2q" Dec 03 15:28:34 crc kubenswrapper[4751]: I1203 15:28:34.257195 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cafb339-3c19-48b7-8012-27f57e539659-utilities\") pod \"certified-operators-shx2q\" (UID: \"3cafb339-3c19-48b7-8012-27f57e539659\") " pod="openshift-marketplace/certified-operators-shx2q" Dec 03 15:28:34 crc kubenswrapper[4751]: I1203 15:28:34.257682 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nl8xn\" (UniqueName: \"kubernetes.io/projected/3cafb339-3c19-48b7-8012-27f57e539659-kube-api-access-nl8xn\") pod \"certified-operators-shx2q\" (UID: \"3cafb339-3c19-48b7-8012-27f57e539659\") " pod="openshift-marketplace/certified-operators-shx2q" Dec 03 15:28:34 crc kubenswrapper[4751]: I1203 15:28:34.257866 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cafb339-3c19-48b7-8012-27f57e539659-catalog-content\") pod \"certified-operators-shx2q\" (UID: \"3cafb339-3c19-48b7-8012-27f57e539659\") " pod="openshift-marketplace/certified-operators-shx2q" Dec 03 15:28:34 crc kubenswrapper[4751]: I1203 15:28:34.258469 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cafb339-3c19-48b7-8012-27f57e539659-catalog-content\") pod \"certified-operators-shx2q\" (UID: \"3cafb339-3c19-48b7-8012-27f57e539659\") " pod="openshift-marketplace/certified-operators-shx2q" Dec 03 15:28:34 crc kubenswrapper[4751]: I1203 15:28:34.258742 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cafb339-3c19-48b7-8012-27f57e539659-utilities\") pod \"certified-operators-shx2q\" (UID: \"3cafb339-3c19-48b7-8012-27f57e539659\") " pod="openshift-marketplace/certified-operators-shx2q" Dec 03 15:28:34 crc kubenswrapper[4751]: I1203 15:28:34.288310 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl8xn\" (UniqueName: \"kubernetes.io/projected/3cafb339-3c19-48b7-8012-27f57e539659-kube-api-access-nl8xn\") pod \"certified-operators-shx2q\" (UID: \"3cafb339-3c19-48b7-8012-27f57e539659\") " pod="openshift-marketplace/certified-operators-shx2q" Dec 03 15:28:34 crc kubenswrapper[4751]: I1203 15:28:34.441669 4751 generic.go:334] "Generic (PLEG): container finished" 
podID="e1e33da4-efdf-43eb-9bcb-dad33839bba4" containerID="09dce35f2e2b5c453fe157d4107c5b2e2715c79bb3837fe974e833288a83f6ee" exitCode=0 Dec 03 15:28:34 crc kubenswrapper[4751]: I1203 15:28:34.441723 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jhrqk/crc-debug-8dvh7" event={"ID":"e1e33da4-efdf-43eb-9bcb-dad33839bba4","Type":"ContainerDied","Data":"09dce35f2e2b5c453fe157d4107c5b2e2715c79bb3837fe974e833288a83f6ee"} Dec 03 15:28:34 crc kubenswrapper[4751]: I1203 15:28:34.450540 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-shx2q" Dec 03 15:28:35 crc kubenswrapper[4751]: I1203 15:28:35.160190 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-shx2q"] Dec 03 15:28:35 crc kubenswrapper[4751]: I1203 15:28:35.461049 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shx2q" event={"ID":"3cafb339-3c19-48b7-8012-27f57e539659","Type":"ContainerStarted","Data":"73cda4cb002cda260cd58a825666a61d0cd878bea7f9364c551aea7a4150aca1"} Dec 03 15:28:35 crc kubenswrapper[4751]: I1203 15:28:35.820499 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 15:28:35 crc kubenswrapper[4751]: I1203 15:28:35.820577 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 15:28:36 crc kubenswrapper[4751]: I1203 15:28:36.340398 4751 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-must-gather-jhrqk/crc-debug-8dvh7" Dec 03 15:28:36 crc kubenswrapper[4751]: I1203 15:28:36.389309 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jhrqk/crc-debug-8dvh7"] Dec 03 15:28:36 crc kubenswrapper[4751]: I1203 15:28:36.401895 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jhrqk/crc-debug-8dvh7"] Dec 03 15:28:36 crc kubenswrapper[4751]: I1203 15:28:36.415685 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rqzl\" (UniqueName: \"kubernetes.io/projected/e1e33da4-efdf-43eb-9bcb-dad33839bba4-kube-api-access-5rqzl\") pod \"e1e33da4-efdf-43eb-9bcb-dad33839bba4\" (UID: \"e1e33da4-efdf-43eb-9bcb-dad33839bba4\") " Dec 03 15:28:36 crc kubenswrapper[4751]: I1203 15:28:36.416355 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1e33da4-efdf-43eb-9bcb-dad33839bba4-host" (OuterVolumeSpecName: "host") pod "e1e33da4-efdf-43eb-9bcb-dad33839bba4" (UID: "e1e33da4-efdf-43eb-9bcb-dad33839bba4"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 15:28:36 crc kubenswrapper[4751]: I1203 15:28:36.415846 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e1e33da4-efdf-43eb-9bcb-dad33839bba4-host\") pod \"e1e33da4-efdf-43eb-9bcb-dad33839bba4\" (UID: \"e1e33da4-efdf-43eb-9bcb-dad33839bba4\") " Dec 03 15:28:36 crc kubenswrapper[4751]: I1203 15:28:36.417395 4751 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e1e33da4-efdf-43eb-9bcb-dad33839bba4-host\") on node \"crc\" DevicePath \"\"" Dec 03 15:28:36 crc kubenswrapper[4751]: I1203 15:28:36.431642 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1e33da4-efdf-43eb-9bcb-dad33839bba4-kube-api-access-5rqzl" (OuterVolumeSpecName: "kube-api-access-5rqzl") pod "e1e33da4-efdf-43eb-9bcb-dad33839bba4" (UID: "e1e33da4-efdf-43eb-9bcb-dad33839bba4"). InnerVolumeSpecName "kube-api-access-5rqzl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:28:36 crc kubenswrapper[4751]: I1203 15:28:36.473077 4751 generic.go:334] "Generic (PLEG): container finished" podID="3cafb339-3c19-48b7-8012-27f57e539659" containerID="298e6f85c75f8cbad272c099c7d9b4909a4738f18b72f637b04c5723029341e9" exitCode=0 Dec 03 15:28:36 crc kubenswrapper[4751]: I1203 15:28:36.473174 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shx2q" event={"ID":"3cafb339-3c19-48b7-8012-27f57e539659","Type":"ContainerDied","Data":"298e6f85c75f8cbad272c099c7d9b4909a4738f18b72f637b04c5723029341e9"} Dec 03 15:28:36 crc kubenswrapper[4751]: I1203 15:28:36.476714 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2881837cb62a7ebed9bde6ada6a7fa67d37422925470395380b6ba130615ab1a" Dec 03 15:28:36 crc kubenswrapper[4751]: I1203 15:28:36.476764 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jhrqk/crc-debug-8dvh7" Dec 03 15:28:36 crc kubenswrapper[4751]: I1203 15:28:36.519036 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rqzl\" (UniqueName: \"kubernetes.io/projected/e1e33da4-efdf-43eb-9bcb-dad33839bba4-kube-api-access-5rqzl\") on node \"crc\" DevicePath \"\"" Dec 03 15:28:37 crc kubenswrapper[4751]: I1203 15:28:37.325380 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1e33da4-efdf-43eb-9bcb-dad33839bba4" path="/var/lib/kubelet/pods/e1e33da4-efdf-43eb-9bcb-dad33839bba4/volumes" Dec 03 15:28:37 crc kubenswrapper[4751]: I1203 15:28:37.702752 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jhrqk/crc-debug-94c2w"] Dec 03 15:28:37 crc kubenswrapper[4751]: E1203 15:28:37.703553 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1e33da4-efdf-43eb-9bcb-dad33839bba4" containerName="container-00" Dec 03 15:28:37 crc kubenswrapper[4751]: I1203 
15:28:37.703567 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1e33da4-efdf-43eb-9bcb-dad33839bba4" containerName="container-00" Dec 03 15:28:37 crc kubenswrapper[4751]: I1203 15:28:37.703768 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1e33da4-efdf-43eb-9bcb-dad33839bba4" containerName="container-00" Dec 03 15:28:37 crc kubenswrapper[4751]: I1203 15:28:37.704601 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jhrqk/crc-debug-94c2w" Dec 03 15:28:37 crc kubenswrapper[4751]: I1203 15:28:37.708375 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-jhrqk"/"default-dockercfg-652g9" Dec 03 15:28:37 crc kubenswrapper[4751]: I1203 15:28:37.743816 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v49kx\" (UniqueName: \"kubernetes.io/projected/2f051a78-7f78-4b65-a884-09efdf2e8089-kube-api-access-v49kx\") pod \"crc-debug-94c2w\" (UID: \"2f051a78-7f78-4b65-a884-09efdf2e8089\") " pod="openshift-must-gather-jhrqk/crc-debug-94c2w" Dec 03 15:28:37 crc kubenswrapper[4751]: I1203 15:28:37.743915 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f051a78-7f78-4b65-a884-09efdf2e8089-host\") pod \"crc-debug-94c2w\" (UID: \"2f051a78-7f78-4b65-a884-09efdf2e8089\") " pod="openshift-must-gather-jhrqk/crc-debug-94c2w" Dec 03 15:28:37 crc kubenswrapper[4751]: I1203 15:28:37.846080 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v49kx\" (UniqueName: \"kubernetes.io/projected/2f051a78-7f78-4b65-a884-09efdf2e8089-kube-api-access-v49kx\") pod \"crc-debug-94c2w\" (UID: \"2f051a78-7f78-4b65-a884-09efdf2e8089\") " pod="openshift-must-gather-jhrqk/crc-debug-94c2w" Dec 03 15:28:37 crc kubenswrapper[4751]: I1203 15:28:37.846193 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f051a78-7f78-4b65-a884-09efdf2e8089-host\") pod \"crc-debug-94c2w\" (UID: \"2f051a78-7f78-4b65-a884-09efdf2e8089\") " pod="openshift-must-gather-jhrqk/crc-debug-94c2w" Dec 03 15:28:37 crc kubenswrapper[4751]: I1203 15:28:37.846352 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f051a78-7f78-4b65-a884-09efdf2e8089-host\") pod \"crc-debug-94c2w\" (UID: \"2f051a78-7f78-4b65-a884-09efdf2e8089\") " pod="openshift-must-gather-jhrqk/crc-debug-94c2w" Dec 03 15:28:38 crc kubenswrapper[4751]: I1203 15:28:38.351665 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v49kx\" (UniqueName: \"kubernetes.io/projected/2f051a78-7f78-4b65-a884-09efdf2e8089-kube-api-access-v49kx\") pod \"crc-debug-94c2w\" (UID: \"2f051a78-7f78-4b65-a884-09efdf2e8089\") " pod="openshift-must-gather-jhrqk/crc-debug-94c2w" Dec 03 15:28:38 crc kubenswrapper[4751]: I1203 15:28:38.626234 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jhrqk/crc-debug-94c2w" Dec 03 15:28:38 crc kubenswrapper[4751]: W1203 15:28:38.675986 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f051a78_7f78_4b65_a884_09efdf2e8089.slice/crio-074faf1932760897d1588c92257671f40aacad5803a974b295559bbac7d1d6e6 WatchSource:0}: Error finding container 074faf1932760897d1588c92257671f40aacad5803a974b295559bbac7d1d6e6: Status 404 returned error can't find the container with id 074faf1932760897d1588c92257671f40aacad5803a974b295559bbac7d1d6e6 Dec 03 15:28:39 crc kubenswrapper[4751]: I1203 15:28:39.529698 4751 generic.go:334] "Generic (PLEG): container finished" podID="3cafb339-3c19-48b7-8012-27f57e539659" containerID="4bca68312a961fccd5a644a50862e5ff2d89e6717e7d3b578e382df210045d8c" exitCode=0 Dec 03 15:28:39 crc kubenswrapper[4751]: I1203 15:28:39.530974 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shx2q" event={"ID":"3cafb339-3c19-48b7-8012-27f57e539659","Type":"ContainerDied","Data":"4bca68312a961fccd5a644a50862e5ff2d89e6717e7d3b578e382df210045d8c"} Dec 03 15:28:39 crc kubenswrapper[4751]: I1203 15:28:39.541165 4751 generic.go:334] "Generic (PLEG): container finished" podID="2f051a78-7f78-4b65-a884-09efdf2e8089" containerID="78496d5398fa63fde55fced1b9580943fb35baa18cd65bbbb7dc12bc068ed2e6" exitCode=0 Dec 03 15:28:39 crc kubenswrapper[4751]: I1203 15:28:39.541252 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jhrqk/crc-debug-94c2w" event={"ID":"2f051a78-7f78-4b65-a884-09efdf2e8089","Type":"ContainerDied","Data":"78496d5398fa63fde55fced1b9580943fb35baa18cd65bbbb7dc12bc068ed2e6"} Dec 03 15:28:39 crc kubenswrapper[4751]: I1203 15:28:39.541296 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jhrqk/crc-debug-94c2w" 
event={"ID":"2f051a78-7f78-4b65-a884-09efdf2e8089","Type":"ContainerStarted","Data":"074faf1932760897d1588c92257671f40aacad5803a974b295559bbac7d1d6e6"} Dec 03 15:28:39 crc kubenswrapper[4751]: I1203 15:28:39.614659 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jhrqk/crc-debug-94c2w"] Dec 03 15:28:39 crc kubenswrapper[4751]: I1203 15:28:39.629706 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jhrqk/crc-debug-94c2w"] Dec 03 15:28:40 crc kubenswrapper[4751]: I1203 15:28:40.557198 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shx2q" event={"ID":"3cafb339-3c19-48b7-8012-27f57e539659","Type":"ContainerStarted","Data":"612fb5217394652644add39f7bd329c7a1174b23d2ff3be9e3b7a7c451dae0b3"} Dec 03 15:28:40 crc kubenswrapper[4751]: I1203 15:28:40.578772 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-shx2q" podStartSLOduration=3.078771569 podStartE2EDuration="6.578750438s" podCreationTimestamp="2025-12-03 15:28:34 +0000 UTC" firstStartedPulling="2025-12-03 15:28:36.476121789 +0000 UTC m=+4523.464477006" lastFinishedPulling="2025-12-03 15:28:39.976100658 +0000 UTC m=+4526.964455875" observedRunningTime="2025-12-03 15:28:40.575253864 +0000 UTC m=+4527.563609081" watchObservedRunningTime="2025-12-03 15:28:40.578750438 +0000 UTC m=+4527.567105655" Dec 03 15:28:40 crc kubenswrapper[4751]: I1203 15:28:40.690170 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jhrqk/crc-debug-94c2w" Dec 03 15:28:40 crc kubenswrapper[4751]: I1203 15:28:40.815881 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f051a78-7f78-4b65-a884-09efdf2e8089-host\") pod \"2f051a78-7f78-4b65-a884-09efdf2e8089\" (UID: \"2f051a78-7f78-4b65-a884-09efdf2e8089\") " Dec 03 15:28:40 crc kubenswrapper[4751]: I1203 15:28:40.816011 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f051a78-7f78-4b65-a884-09efdf2e8089-host" (OuterVolumeSpecName: "host") pod "2f051a78-7f78-4b65-a884-09efdf2e8089" (UID: "2f051a78-7f78-4b65-a884-09efdf2e8089"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 15:28:40 crc kubenswrapper[4751]: I1203 15:28:40.816059 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v49kx\" (UniqueName: \"kubernetes.io/projected/2f051a78-7f78-4b65-a884-09efdf2e8089-kube-api-access-v49kx\") pod \"2f051a78-7f78-4b65-a884-09efdf2e8089\" (UID: \"2f051a78-7f78-4b65-a884-09efdf2e8089\") " Dec 03 15:28:40 crc kubenswrapper[4751]: I1203 15:28:40.816742 4751 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f051a78-7f78-4b65-a884-09efdf2e8089-host\") on node \"crc\" DevicePath \"\"" Dec 03 15:28:40 crc kubenswrapper[4751]: I1203 15:28:40.825317 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f051a78-7f78-4b65-a884-09efdf2e8089-kube-api-access-v49kx" (OuterVolumeSpecName: "kube-api-access-v49kx") pod "2f051a78-7f78-4b65-a884-09efdf2e8089" (UID: "2f051a78-7f78-4b65-a884-09efdf2e8089"). InnerVolumeSpecName "kube-api-access-v49kx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:28:40 crc kubenswrapper[4751]: I1203 15:28:40.918417 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v49kx\" (UniqueName: \"kubernetes.io/projected/2f051a78-7f78-4b65-a884-09efdf2e8089-kube-api-access-v49kx\") on node \"crc\" DevicePath \"\"" Dec 03 15:28:41 crc kubenswrapper[4751]: I1203 15:28:41.332100 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f051a78-7f78-4b65-a884-09efdf2e8089" path="/var/lib/kubelet/pods/2f051a78-7f78-4b65-a884-09efdf2e8089/volumes" Dec 03 15:28:41 crc kubenswrapper[4751]: I1203 15:28:41.567366 4751 scope.go:117] "RemoveContainer" containerID="78496d5398fa63fde55fced1b9580943fb35baa18cd65bbbb7dc12bc068ed2e6" Dec 03 15:28:41 crc kubenswrapper[4751]: I1203 15:28:41.567376 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jhrqk/crc-debug-94c2w" Dec 03 15:28:44 crc kubenswrapper[4751]: I1203 15:28:44.451268 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-shx2q" Dec 03 15:28:44 crc kubenswrapper[4751]: I1203 15:28:44.451760 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-shx2q" Dec 03 15:28:44 crc kubenswrapper[4751]: I1203 15:28:44.504196 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-shx2q" Dec 03 15:28:54 crc kubenswrapper[4751]: I1203 15:28:54.503581 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-shx2q" Dec 03 15:28:54 crc kubenswrapper[4751]: I1203 15:28:54.561180 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-shx2q"] Dec 03 15:28:54 crc kubenswrapper[4751]: I1203 15:28:54.727506 4751 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/certified-operators-shx2q" podUID="3cafb339-3c19-48b7-8012-27f57e539659" containerName="registry-server" containerID="cri-o://612fb5217394652644add39f7bd329c7a1174b23d2ff3be9e3b7a7c451dae0b3" gracePeriod=2 Dec 03 15:28:55 crc kubenswrapper[4751]: I1203 15:28:55.318555 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-shx2q" Dec 03 15:28:55 crc kubenswrapper[4751]: I1203 15:28:55.364024 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cafb339-3c19-48b7-8012-27f57e539659-utilities\") pod \"3cafb339-3c19-48b7-8012-27f57e539659\" (UID: \"3cafb339-3c19-48b7-8012-27f57e539659\") " Dec 03 15:28:55 crc kubenswrapper[4751]: I1203 15:28:55.364290 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cafb339-3c19-48b7-8012-27f57e539659-catalog-content\") pod \"3cafb339-3c19-48b7-8012-27f57e539659\" (UID: \"3cafb339-3c19-48b7-8012-27f57e539659\") " Dec 03 15:28:55 crc kubenswrapper[4751]: I1203 15:28:55.364388 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nl8xn\" (UniqueName: \"kubernetes.io/projected/3cafb339-3c19-48b7-8012-27f57e539659-kube-api-access-nl8xn\") pod \"3cafb339-3c19-48b7-8012-27f57e539659\" (UID: \"3cafb339-3c19-48b7-8012-27f57e539659\") " Dec 03 15:28:55 crc kubenswrapper[4751]: I1203 15:28:55.365779 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cafb339-3c19-48b7-8012-27f57e539659-utilities" (OuterVolumeSpecName: "utilities") pod "3cafb339-3c19-48b7-8012-27f57e539659" (UID: "3cafb339-3c19-48b7-8012-27f57e539659"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:28:55 crc kubenswrapper[4751]: I1203 15:28:55.373244 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cafb339-3c19-48b7-8012-27f57e539659-kube-api-access-nl8xn" (OuterVolumeSpecName: "kube-api-access-nl8xn") pod "3cafb339-3c19-48b7-8012-27f57e539659" (UID: "3cafb339-3c19-48b7-8012-27f57e539659"). InnerVolumeSpecName "kube-api-access-nl8xn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:28:55 crc kubenswrapper[4751]: I1203 15:28:55.424713 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cafb339-3c19-48b7-8012-27f57e539659-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3cafb339-3c19-48b7-8012-27f57e539659" (UID: "3cafb339-3c19-48b7-8012-27f57e539659"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:28:55 crc kubenswrapper[4751]: I1203 15:28:55.467666 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nl8xn\" (UniqueName: \"kubernetes.io/projected/3cafb339-3c19-48b7-8012-27f57e539659-kube-api-access-nl8xn\") on node \"crc\" DevicePath \"\"" Dec 03 15:28:55 crc kubenswrapper[4751]: I1203 15:28:55.467712 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cafb339-3c19-48b7-8012-27f57e539659-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 15:28:55 crc kubenswrapper[4751]: I1203 15:28:55.467726 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cafb339-3c19-48b7-8012-27f57e539659-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 15:28:55 crc kubenswrapper[4751]: I1203 15:28:55.750292 4751 generic.go:334] "Generic (PLEG): container finished" podID="3cafb339-3c19-48b7-8012-27f57e539659" 
containerID="612fb5217394652644add39f7bd329c7a1174b23d2ff3be9e3b7a7c451dae0b3" exitCode=0 Dec 03 15:28:55 crc kubenswrapper[4751]: I1203 15:28:55.750725 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-shx2q" Dec 03 15:28:55 crc kubenswrapper[4751]: I1203 15:28:55.750749 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shx2q" event={"ID":"3cafb339-3c19-48b7-8012-27f57e539659","Type":"ContainerDied","Data":"612fb5217394652644add39f7bd329c7a1174b23d2ff3be9e3b7a7c451dae0b3"} Dec 03 15:28:55 crc kubenswrapper[4751]: I1203 15:28:55.751032 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shx2q" event={"ID":"3cafb339-3c19-48b7-8012-27f57e539659","Type":"ContainerDied","Data":"73cda4cb002cda260cd58a825666a61d0cd878bea7f9364c551aea7a4150aca1"} Dec 03 15:28:55 crc kubenswrapper[4751]: I1203 15:28:55.751053 4751 scope.go:117] "RemoveContainer" containerID="612fb5217394652644add39f7bd329c7a1174b23d2ff3be9e3b7a7c451dae0b3" Dec 03 15:28:55 crc kubenswrapper[4751]: I1203 15:28:55.777587 4751 scope.go:117] "RemoveContainer" containerID="4bca68312a961fccd5a644a50862e5ff2d89e6717e7d3b578e382df210045d8c" Dec 03 15:28:55 crc kubenswrapper[4751]: I1203 15:28:55.797656 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-shx2q"] Dec 03 15:28:55 crc kubenswrapper[4751]: I1203 15:28:55.813130 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-shx2q"] Dec 03 15:28:55 crc kubenswrapper[4751]: I1203 15:28:55.815755 4751 scope.go:117] "RemoveContainer" containerID="298e6f85c75f8cbad272c099c7d9b4909a4738f18b72f637b04c5723029341e9" Dec 03 15:28:55 crc kubenswrapper[4751]: I1203 15:28:55.887300 4751 scope.go:117] "RemoveContainer" containerID="612fb5217394652644add39f7bd329c7a1174b23d2ff3be9e3b7a7c451dae0b3" Dec 03 
15:28:55 crc kubenswrapper[4751]: E1203 15:28:55.897523 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"612fb5217394652644add39f7bd329c7a1174b23d2ff3be9e3b7a7c451dae0b3\": container with ID starting with 612fb5217394652644add39f7bd329c7a1174b23d2ff3be9e3b7a7c451dae0b3 not found: ID does not exist" containerID="612fb5217394652644add39f7bd329c7a1174b23d2ff3be9e3b7a7c451dae0b3" Dec 03 15:28:55 crc kubenswrapper[4751]: I1203 15:28:55.897574 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"612fb5217394652644add39f7bd329c7a1174b23d2ff3be9e3b7a7c451dae0b3"} err="failed to get container status \"612fb5217394652644add39f7bd329c7a1174b23d2ff3be9e3b7a7c451dae0b3\": rpc error: code = NotFound desc = could not find container \"612fb5217394652644add39f7bd329c7a1174b23d2ff3be9e3b7a7c451dae0b3\": container with ID starting with 612fb5217394652644add39f7bd329c7a1174b23d2ff3be9e3b7a7c451dae0b3 not found: ID does not exist" Dec 03 15:28:55 crc kubenswrapper[4751]: I1203 15:28:55.897604 4751 scope.go:117] "RemoveContainer" containerID="4bca68312a961fccd5a644a50862e5ff2d89e6717e7d3b578e382df210045d8c" Dec 03 15:28:55 crc kubenswrapper[4751]: E1203 15:28:55.903472 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bca68312a961fccd5a644a50862e5ff2d89e6717e7d3b578e382df210045d8c\": container with ID starting with 4bca68312a961fccd5a644a50862e5ff2d89e6717e7d3b578e382df210045d8c not found: ID does not exist" containerID="4bca68312a961fccd5a644a50862e5ff2d89e6717e7d3b578e382df210045d8c" Dec 03 15:28:55 crc kubenswrapper[4751]: I1203 15:28:55.903534 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bca68312a961fccd5a644a50862e5ff2d89e6717e7d3b578e382df210045d8c"} err="failed to get container status 
\"4bca68312a961fccd5a644a50862e5ff2d89e6717e7d3b578e382df210045d8c\": rpc error: code = NotFound desc = could not find container \"4bca68312a961fccd5a644a50862e5ff2d89e6717e7d3b578e382df210045d8c\": container with ID starting with 4bca68312a961fccd5a644a50862e5ff2d89e6717e7d3b578e382df210045d8c not found: ID does not exist" Dec 03 15:28:55 crc kubenswrapper[4751]: I1203 15:28:55.903567 4751 scope.go:117] "RemoveContainer" containerID="298e6f85c75f8cbad272c099c7d9b4909a4738f18b72f637b04c5723029341e9" Dec 03 15:28:55 crc kubenswrapper[4751]: E1203 15:28:55.908662 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"298e6f85c75f8cbad272c099c7d9b4909a4738f18b72f637b04c5723029341e9\": container with ID starting with 298e6f85c75f8cbad272c099c7d9b4909a4738f18b72f637b04c5723029341e9 not found: ID does not exist" containerID="298e6f85c75f8cbad272c099c7d9b4909a4738f18b72f637b04c5723029341e9" Dec 03 15:28:55 crc kubenswrapper[4751]: I1203 15:28:55.908712 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"298e6f85c75f8cbad272c099c7d9b4909a4738f18b72f637b04c5723029341e9"} err="failed to get container status \"298e6f85c75f8cbad272c099c7d9b4909a4738f18b72f637b04c5723029341e9\": rpc error: code = NotFound desc = could not find container \"298e6f85c75f8cbad272c099c7d9b4909a4738f18b72f637b04c5723029341e9\": container with ID starting with 298e6f85c75f8cbad272c099c7d9b4909a4738f18b72f637b04c5723029341e9 not found: ID does not exist" Dec 03 15:28:56 crc kubenswrapper[4751]: E1203 15:28:56.131552 4751 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cafb339_3c19_48b7_8012_27f57e539659.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cafb339_3c19_48b7_8012_27f57e539659.slice/crio-73cda4cb002cda260cd58a825666a61d0cd878bea7f9364c551aea7a4150aca1\": RecentStats: unable to find data in memory cache]" Dec 03 15:28:57 crc kubenswrapper[4751]: I1203 15:28:57.330926 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cafb339-3c19-48b7-8012-27f57e539659" path="/var/lib/kubelet/pods/3cafb339-3c19-48b7-8012-27f57e539659/volumes" Dec 03 15:29:05 crc kubenswrapper[4751]: I1203 15:29:05.819848 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 15:29:05 crc kubenswrapper[4751]: I1203 15:29:05.820417 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 15:29:16 crc kubenswrapper[4751]: I1203 15:29:16.806139 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_a8a7a20c-88be-4cca-a10d-8ac9a898f090/init-config-reloader/0.log" Dec 03 15:29:16 crc kubenswrapper[4751]: I1203 15:29:16.961140 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_a8a7a20c-88be-4cca-a10d-8ac9a898f090/init-config-reloader/0.log" Dec 03 15:29:16 crc kubenswrapper[4751]: I1203 15:29:16.975030 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_a8a7a20c-88be-4cca-a10d-8ac9a898f090/config-reloader/0.log" Dec 03 15:29:17 crc kubenswrapper[4751]: I1203 15:29:17.062180 4751 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_a8a7a20c-88be-4cca-a10d-8ac9a898f090/alertmanager/0.log" Dec 03 15:29:17 crc kubenswrapper[4751]: I1203 15:29:17.185513 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-b567465d6-ch8tf_1440ca21-e220-4178-b44c-06672479bc7c/barbican-api/0.log" Dec 03 15:29:17 crc kubenswrapper[4751]: I1203 15:29:17.326728 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-b567465d6-ch8tf_1440ca21-e220-4178-b44c-06672479bc7c/barbican-api-log/0.log" Dec 03 15:29:17 crc kubenswrapper[4751]: I1203 15:29:17.587858 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-c469d65bd-rdqn6_58307992-3054-4b05-b7c6-f768c2a1e849/barbican-keystone-listener/0.log" Dec 03 15:29:17 crc kubenswrapper[4751]: I1203 15:29:17.736811 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-c469d65bd-rdqn6_58307992-3054-4b05-b7c6-f768c2a1e849/barbican-keystone-listener-log/0.log" Dec 03 15:29:17 crc kubenswrapper[4751]: I1203 15:29:17.768512 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5dcd655495-dj2gs_4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f/barbican-worker/0.log" Dec 03 15:29:17 crc kubenswrapper[4751]: I1203 15:29:17.860216 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5dcd655495-dj2gs_4cb2ff1e-e8e7-4af2-b8fa-073de9b9613f/barbican-worker-log/0.log" Dec 03 15:29:18 crc kubenswrapper[4751]: I1203 15:29:18.017600 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-klp6k_cd7a02af-abd1-4669-88f3-7e1d1117d8e9/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:29:18 crc kubenswrapper[4751]: I1203 15:29:18.242336 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991/ceilometer-central-agent/1.log" Dec 03 15:29:18 crc kubenswrapper[4751]: I1203 15:29:18.316120 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991/ceilometer-central-agent/0.log" Dec 03 15:29:19 crc kubenswrapper[4751]: I1203 15:29:19.038107 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991/proxy-httpd/0.log" Dec 03 15:29:19 crc kubenswrapper[4751]: I1203 15:29:19.085538 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991/ceilometer-notification-agent/0.log" Dec 03 15:29:19 crc kubenswrapper[4751]: I1203 15:29:19.097129 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991/sg-core/0.log" Dec 03 15:29:19 crc kubenswrapper[4751]: I1203 15:29:19.128645 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ed7b699d-a3d7-4f1c-ad1f-6f0bc5194991/ceilometer-notification-agent/1.log" Dec 03 15:29:19 crc kubenswrapper[4751]: I1203 15:29:19.298121 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_afa6e4a5-811b-43db-868b-66a71bff4830/cinder-api-log/0.log" Dec 03 15:29:19 crc kubenswrapper[4751]: I1203 15:29:19.372023 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_afa6e4a5-811b-43db-868b-66a71bff4830/cinder-api/0.log" Dec 03 15:29:19 crc kubenswrapper[4751]: I1203 15:29:19.584229 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_28a23243-c107-4c51-96f0-82db8946b245/probe/0.log" Dec 03 15:29:19 crc kubenswrapper[4751]: I1203 15:29:19.603989 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-scheduler-0_28a23243-c107-4c51-96f0-82db8946b245/cinder-scheduler/0.log" Dec 03 15:29:19 crc kubenswrapper[4751]: I1203 15:29:19.753035 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_b5b84593-a4d7-4b1c-843a-feb9273afbf4/cloudkitty-api/0.log" Dec 03 15:29:19 crc kubenswrapper[4751]: I1203 15:29:19.811828 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_b5b84593-a4d7-4b1c-843a-feb9273afbf4/cloudkitty-api-log/0.log" Dec 03 15:29:19 crc kubenswrapper[4751]: I1203 15:29:19.923819 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-compactor-0_34351ff3-ea5e-403c-9d04-ca6777287cff/loki-compactor/0.log" Dec 03 15:29:20 crc kubenswrapper[4751]: I1203 15:29:20.045903 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-distributor-56cd74f89f-xg9ml_e2d4448e-9181-494b-bec0-12da338b184d/loki-distributor/0.log" Dec 03 15:29:20 crc kubenswrapper[4751]: I1203 15:29:20.145676 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-76cc998948-8d88b_5a964492-a736-427e-b81a-d6d863d0eaaf/gateway/0.log" Dec 03 15:29:20 crc kubenswrapper[4751]: I1203 15:29:20.336339 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-76cc998948-bl8ws_c1c24fdf-0c9e-458f-9803-87e9d6c3161f/gateway/0.log" Dec 03 15:29:21 crc kubenswrapper[4751]: I1203 15:29:21.056451 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-index-gateway-0_85623735-2d6a-4d53-ac14-e4cd714ecc7b/loki-index-gateway/0.log" Dec 03 15:29:21 crc kubenswrapper[4751]: I1203 15:29:21.438070 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-query-frontend-779849886d-r2j44_ab871b52-8ec3-4a23-bce3-ee6e2e8c21fa/loki-query-frontend/0.log" Dec 03 15:29:21 crc 
kubenswrapper[4751]: I1203 15:29:21.642763 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-ingester-0_04053d51-dddf-43e3-a230-9ac729dec435/loki-ingester/0.log" Dec 03 15:29:21 crc kubenswrapper[4751]: I1203 15:29:21.962431 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-6wwg6_d5de9d69-621e-4336-bd1d-e29c27d29430/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:29:22 crc kubenswrapper[4751]: I1203 15:29:22.319499 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-j2qfs_233a8db3-fc65-4c75-81d4-552f44ee95c2/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:29:22 crc kubenswrapper[4751]: I1203 15:29:22.390264 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-c4b758ff5-hjq5l_941e6cf3-002b-476c-8347-dfc11a32b067/init/0.log" Dec 03 15:29:22 crc kubenswrapper[4751]: I1203 15:29:22.590696 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-querier-548665d79b-8226l_4797e85e-ad67-454b-b210-25f5481780c5/loki-querier/0.log" Dec 03 15:29:22 crc kubenswrapper[4751]: I1203 15:29:22.692955 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-c4b758ff5-hjq5l_941e6cf3-002b-476c-8347-dfc11a32b067/init/0.log" Dec 03 15:29:22 crc kubenswrapper[4751]: I1203 15:29:22.773691 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-c4b758ff5-hjq5l_941e6cf3-002b-476c-8347-dfc11a32b067/dnsmasq-dns/0.log" Dec 03 15:29:22 crc kubenswrapper[4751]: I1203 15:29:22.842313 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-proc-0_9ae01ad1-cb7e-4bad-9505-57b4f85d7d3a/cloudkitty-proc/0.log" Dec 03 15:29:22 crc kubenswrapper[4751]: I1203 15:29:22.931217 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-rt2j2_4f6373bc-f6a3-478f-92f5-8e311a5fd86c/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:29:22 crc kubenswrapper[4751]: I1203 15:29:22.982709 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ac247305-666d-4241-b756-88499fd359ad/glance-httpd/0.log" Dec 03 15:29:23 crc kubenswrapper[4751]: I1203 15:29:23.085187 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ac247305-666d-4241-b756-88499fd359ad/glance-log/0.log" Dec 03 15:29:23 crc kubenswrapper[4751]: I1203 15:29:23.190256 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_15b1a633-766f-41f8-b9e8-22acc97bf4c8/glance-httpd/0.log" Dec 03 15:29:23 crc kubenswrapper[4751]: I1203 15:29:23.277971 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_15b1a633-766f-41f8-b9e8-22acc97bf4c8/glance-log/0.log" Dec 03 15:29:23 crc kubenswrapper[4751]: I1203 15:29:23.385030 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-tk2xg_c1564210-8ace-4588-a706-3c7583ea0568/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:29:23 crc kubenswrapper[4751]: I1203 15:29:23.483925 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-blkvd_8eeba461-aadb-44d9-ac60-9413a2c70e6d/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:29:23 crc kubenswrapper[4751]: I1203 15:29:23.691201 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29412901-h5p7l_bab120ba-b67e-46bf-9d23-359d3119b904/keystone-cron/0.log" Dec 03 15:29:23 crc kubenswrapper[4751]: I1203 15:29:23.906998 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_kube-state-metrics-0_b1ba2fcd-b1a4-42c8-a3a1-f84f3e198de2/kube-state-metrics/0.log" Dec 03 15:29:23 crc kubenswrapper[4751]: I1203 15:29:23.971428 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7f9cb9cd-blmhg_491c0713-5024-484d-921d-387200cb08b2/keystone-api/0.log" Dec 03 15:29:24 crc kubenswrapper[4751]: I1203 15:29:24.033626 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-j6cvn_607ac64e-604b-407d-9939-b8f2ba0832c5/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:29:24 crc kubenswrapper[4751]: I1203 15:29:24.440057 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7bc6669df7-xxpxz_97f090c9-1ba2-45b8-9f01-c8372381b095/neutron-httpd/0.log" Dec 03 15:29:24 crc kubenswrapper[4751]: I1203 15:29:24.500804 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-8bbll_521c9f69-c59e-4b93-a1a2-ab687b7ee6eb/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:29:24 crc kubenswrapper[4751]: I1203 15:29:24.609034 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7bc6669df7-xxpxz_97f090c9-1ba2-45b8-9f01-c8372381b095/neutron-api/0.log" Dec 03 15:29:25 crc kubenswrapper[4751]: I1203 15:29:25.215515 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d/nova-api-log/0.log" Dec 03 15:29:25 crc kubenswrapper[4751]: I1203 15:29:25.287681 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_bac48c68-2c8f-47ff-8f11-7974913dbac1/nova-cell0-conductor-conductor/0.log" Dec 03 15:29:25 crc kubenswrapper[4751]: I1203 15:29:25.693566 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-conductor-0_730eedc8-ac64-4f53-80d0-ec824459f08c/nova-cell1-conductor-conductor/0.log" Dec 03 15:29:25 crc kubenswrapper[4751]: I1203 15:29:25.742450 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_d1b0dd29-cb5d-46d1-b089-e95e3bb77a5d/nova-api-api/0.log" Dec 03 15:29:25 crc kubenswrapper[4751]: I1203 15:29:25.751240 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_fa35ea33-1dc0-4569-9052-36e722f491c1/nova-cell1-novncproxy-novncproxy/0.log" Dec 03 15:29:26 crc kubenswrapper[4751]: I1203 15:29:26.030871 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-vgvng_a2deeaad-edf9-4d9c-b116-9a31587b1b2a/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:29:26 crc kubenswrapper[4751]: I1203 15:29:26.225530 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_47692780-6643-491b-8d92-c181c82d4ce6/nova-metadata-log/0.log" Dec 03 15:29:26 crc kubenswrapper[4751]: I1203 15:29:26.548950 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_dbe3db84-a6ac-4b03-999b-1d2663641afa/nova-scheduler-scheduler/0.log" Dec 03 15:29:26 crc kubenswrapper[4751]: I1203 15:29:26.584946 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_3dc63449-cac9-48bc-abb7-3ff350a408cf/mysql-bootstrap/0.log" Dec 03 15:29:26 crc kubenswrapper[4751]: I1203 15:29:26.782423 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_3dc63449-cac9-48bc-abb7-3ff350a408cf/mysql-bootstrap/0.log" Dec 03 15:29:26 crc kubenswrapper[4751]: I1203 15:29:26.845917 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_3dc63449-cac9-48bc-abb7-3ff350a408cf/galera/0.log" Dec 03 15:29:27 crc kubenswrapper[4751]: I1203 15:29:27.033049 
4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a45965be-01f0-4c6d-9db8-08b5e5564c5a/mysql-bootstrap/0.log" Dec 03 15:29:27 crc kubenswrapper[4751]: I1203 15:29:27.203938 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a45965be-01f0-4c6d-9db8-08b5e5564c5a/mysql-bootstrap/0.log" Dec 03 15:29:27 crc kubenswrapper[4751]: I1203 15:29:27.276120 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a45965be-01f0-4c6d-9db8-08b5e5564c5a/galera/0.log" Dec 03 15:29:27 crc kubenswrapper[4751]: I1203 15:29:27.429746 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_033a5c7c-11ef-4610-ac41-aa8471a9f0b4/openstackclient/0.log" Dec 03 15:29:27 crc kubenswrapper[4751]: I1203 15:29:27.585130 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-lqzrd_7ab1fa90-b8eb-405d-803d-b9fd84939289/ovn-controller/0.log" Dec 03 15:29:27 crc kubenswrapper[4751]: I1203 15:29:27.975905 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-98hst_aa72a067-0544-4a0c-8750-c3d76221d4f2/openstack-network-exporter/0.log" Dec 03 15:29:28 crc kubenswrapper[4751]: I1203 15:29:28.041860 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_47692780-6643-491b-8d92-c181c82d4ce6/nova-metadata-metadata/0.log" Dec 03 15:29:28 crc kubenswrapper[4751]: I1203 15:29:28.158231 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dzz9c_3faee7be-8b53-42b6-90fd-ba62998f9ced/ovsdb-server-init/0.log" Dec 03 15:29:28 crc kubenswrapper[4751]: I1203 15:29:28.320135 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dzz9c_3faee7be-8b53-42b6-90fd-ba62998f9ced/ovs-vswitchd/0.log" Dec 03 15:29:28 crc kubenswrapper[4751]: I1203 15:29:28.350864 4751 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dzz9c_3faee7be-8b53-42b6-90fd-ba62998f9ced/ovsdb-server-init/0.log" Dec 03 15:29:28 crc kubenswrapper[4751]: I1203 15:29:28.480813 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dzz9c_3faee7be-8b53-42b6-90fd-ba62998f9ced/ovsdb-server/0.log" Dec 03 15:29:28 crc kubenswrapper[4751]: I1203 15:29:28.696930 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-h82lf_5d9c6feb-6018-476a-b029-e4df05b4566d/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:29:28 crc kubenswrapper[4751]: I1203 15:29:28.774363 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e7ffebbc-a033-4a04-a133-d90456a57881/openstack-network-exporter/0.log" Dec 03 15:29:28 crc kubenswrapper[4751]: I1203 15:29:28.801243 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e7ffebbc-a033-4a04-a133-d90456a57881/ovn-northd/0.log" Dec 03 15:29:29 crc kubenswrapper[4751]: I1203 15:29:29.054917 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b/openstack-network-exporter/0.log" Dec 03 15:29:29 crc kubenswrapper[4751]: I1203 15:29:29.072882 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c0a4db08-d9ca-4d0a-aad6-33dd6f300c3b/ovsdbserver-nb/0.log" Dec 03 15:29:29 crc kubenswrapper[4751]: I1203 15:29:29.585954 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_0046f111-cf94-402b-8981-659978aace04/openstack-network-exporter/0.log" Dec 03 15:29:29 crc kubenswrapper[4751]: I1203 15:29:29.613344 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_0046f111-cf94-402b-8981-659978aace04/ovsdbserver-sb/0.log" Dec 03 15:29:29 crc kubenswrapper[4751]: I1203 15:29:29.716763 4751 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-cfc86c59b-x4m2l_bfdc9703-a6a9-4a1d-81f4-852aa9167a17/placement-api/0.log" Dec 03 15:29:29 crc kubenswrapper[4751]: I1203 15:29:29.872743 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b40b7285-42c6-4278-8d86-69847e549907/init-config-reloader/0.log" Dec 03 15:29:30 crc kubenswrapper[4751]: I1203 15:29:30.048414 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-cfc86c59b-x4m2l_bfdc9703-a6a9-4a1d-81f4-852aa9167a17/placement-log/0.log" Dec 03 15:29:30 crc kubenswrapper[4751]: I1203 15:29:30.213087 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b40b7285-42c6-4278-8d86-69847e549907/config-reloader/0.log" Dec 03 15:29:30 crc kubenswrapper[4751]: I1203 15:29:30.264621 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b40b7285-42c6-4278-8d86-69847e549907/init-config-reloader/0.log" Dec 03 15:29:30 crc kubenswrapper[4751]: I1203 15:29:30.309729 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b40b7285-42c6-4278-8d86-69847e549907/prometheus/0.log" Dec 03 15:29:30 crc kubenswrapper[4751]: I1203 15:29:30.323418 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b40b7285-42c6-4278-8d86-69847e549907/prometheus/1.log" Dec 03 15:29:30 crc kubenswrapper[4751]: I1203 15:29:30.477320 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b40b7285-42c6-4278-8d86-69847e549907/thanos-sidecar/0.log" Dec 03 15:29:30 crc kubenswrapper[4751]: I1203 15:29:30.510957 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_d5fd5425-70e4-4a79-8ea7-3326cae3908d/setup-container/0.log" Dec 03 15:29:30 crc kubenswrapper[4751]: I1203 
15:29:30.827856 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_d5fd5425-70e4-4a79-8ea7-3326cae3908d/setup-container/0.log" Dec 03 15:29:30 crc kubenswrapper[4751]: I1203 15:29:30.847845 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4760c776-9212-42af-8bf2-928c79417922/setup-container/0.log" Dec 03 15:29:30 crc kubenswrapper[4751]: I1203 15:29:30.856918 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_d5fd5425-70e4-4a79-8ea7-3326cae3908d/rabbitmq/0.log" Dec 03 15:29:31 crc kubenswrapper[4751]: I1203 15:29:31.599436 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4760c776-9212-42af-8bf2-928c79417922/setup-container/0.log" Dec 03 15:29:31 crc kubenswrapper[4751]: I1203 15:29:31.797549 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4760c776-9212-42af-8bf2-928c79417922/rabbitmq/0.log" Dec 03 15:29:31 crc kubenswrapper[4751]: I1203 15:29:31.852279 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-8cqwk_a08bb04e-0d05-4153-ab50-9fde15bb421b/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:29:32 crc kubenswrapper[4751]: I1203 15:29:32.035870 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-w96tn_228ac9f7-0635-4a38-8d51-038e9a588a7d/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:29:32 crc kubenswrapper[4751]: I1203 15:29:32.096479 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-bhdkt_f38ae118-11a0-4c72-9d0b-750762779ee7/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:29:32 crc kubenswrapper[4751]: I1203 15:29:32.331758 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-2gj64_478701d4-170a-4043-97d6-6b54b753a72a/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:29:32 crc kubenswrapper[4751]: I1203 15:29:32.466749 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-6xdd9_b8584e24-f4eb-400e-a73c-610ba6fe3a41/ssh-known-hosts-edpm-deployment/0.log" Dec 03 15:29:32 crc kubenswrapper[4751]: I1203 15:29:32.710344 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-84ff798d87-5c96l_9c7e0fc7-03ed-4002-b460-df87d151f563/proxy-server/0.log" Dec 03 15:29:32 crc kubenswrapper[4751]: I1203 15:29:32.864390 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-68rzl_0fedaa81-0c36-44fa-ab7b-b712759fc8d4/swift-ring-rebalance/0.log" Dec 03 15:29:32 crc kubenswrapper[4751]: I1203 15:29:32.894493 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-84ff798d87-5c96l_9c7e0fc7-03ed-4002-b460-df87d151f563/proxy-httpd/0.log" Dec 03 15:29:33 crc kubenswrapper[4751]: I1203 15:29:33.049910 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2/account-auditor/0.log" Dec 03 15:29:33 crc kubenswrapper[4751]: I1203 15:29:33.124154 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2/account-reaper/0.log" Dec 03 15:29:33 crc kubenswrapper[4751]: I1203 15:29:33.174162 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2/account-replicator/0.log" Dec 03 15:29:33 crc kubenswrapper[4751]: I1203 15:29:33.256003 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2/account-server/0.log" Dec 03 15:29:33 crc kubenswrapper[4751]: I1203 
15:29:33.260585 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2/container-auditor/0.log" Dec 03 15:29:33 crc kubenswrapper[4751]: I1203 15:29:33.414880 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2/container-server/0.log" Dec 03 15:29:33 crc kubenswrapper[4751]: I1203 15:29:33.416502 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2/container-replicator/0.log" Dec 03 15:29:33 crc kubenswrapper[4751]: I1203 15:29:33.533549 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2/container-updater/0.log" Dec 03 15:29:33 crc kubenswrapper[4751]: I1203 15:29:33.572647 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2/object-auditor/0.log" Dec 03 15:29:33 crc kubenswrapper[4751]: I1203 15:29:33.647816 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2/object-expirer/0.log" Dec 03 15:29:33 crc kubenswrapper[4751]: I1203 15:29:33.737306 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2/object-replicator/0.log" Dec 03 15:29:33 crc kubenswrapper[4751]: I1203 15:29:33.750117 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2/object-server/0.log" Dec 03 15:29:33 crc kubenswrapper[4751]: I1203 15:29:33.829547 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2/object-updater/0.log" Dec 03 15:29:33 crc kubenswrapper[4751]: I1203 15:29:33.905164 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2/rsync/0.log" Dec 03 15:29:33 crc kubenswrapper[4751]: I1203 15:29:33.983067 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a5f48177-6ece-47b6-9c2c-2dd1e6a2dfd2/swift-recon-cron/0.log" Dec 03 15:29:34 crc kubenswrapper[4751]: I1203 15:29:34.205142 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-xcn2l_052552e4-436a-4f3e-a7cd-cacb72ff4f16/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:29:34 crc kubenswrapper[4751]: I1203 15:29:34.342791 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_aa32bbce-059c-46e3-a8d7-f737d93e394e/tempest-tests-tempest-tests-runner/0.log" Dec 03 15:29:34 crc kubenswrapper[4751]: I1203 15:29:34.473734 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_14cd5bf7-18fc-450b-b4b1-f87bf154efeb/test-operator-logs-container/0.log" Dec 03 15:29:34 crc kubenswrapper[4751]: I1203 15:29:34.622885 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-g2m8z_ae9b241a-73d5-4f1c-b14d-f7b44cc008f1/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:29:35 crc kubenswrapper[4751]: I1203 15:29:35.820016 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 15:29:35 crc kubenswrapper[4751]: I1203 15:29:35.820104 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 15:29:35 crc kubenswrapper[4751]: I1203 15:29:35.820142 4751 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-djf67" Dec 03 15:29:35 crc kubenswrapper[4751]: I1203 15:29:35.820697 4751 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1a7d3374c595269f4a9d24e338e5e9eaf99cda12b8a6b0ee51fd39d900f10642"} pod="openshift-machine-config-operator/machine-config-daemon-djf67" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 15:29:35 crc kubenswrapper[4751]: I1203 15:29:35.820762 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" containerID="cri-o://1a7d3374c595269f4a9d24e338e5e9eaf99cda12b8a6b0ee51fd39d900f10642" gracePeriod=600 Dec 03 15:29:36 crc kubenswrapper[4751]: I1203 15:29:36.209943 4751 generic.go:334] "Generic (PLEG): container finished" podID="385620eb-744d-423e-b02b-1274f3075689" containerID="1a7d3374c595269f4a9d24e338e5e9eaf99cda12b8a6b0ee51fd39d900f10642" exitCode=0 Dec 03 15:29:36 crc kubenswrapper[4751]: I1203 15:29:36.210223 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerDied","Data":"1a7d3374c595269f4a9d24e338e5e9eaf99cda12b8a6b0ee51fd39d900f10642"} Dec 03 15:29:36 crc kubenswrapper[4751]: I1203 15:29:36.210256 4751 scope.go:117] "RemoveContainer" containerID="a0be2e6b634e60ad28b976fe586002efe94eaf26e24cfc54b5fd477b10cdbf97" Dec 03 15:29:36 crc 
kubenswrapper[4751]: E1203 15:29:36.445508 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:29:37 crc kubenswrapper[4751]: I1203 15:29:37.244432 4751 scope.go:117] "RemoveContainer" containerID="1a7d3374c595269f4a9d24e338e5e9eaf99cda12b8a6b0ee51fd39d900f10642" Dec 03 15:29:37 crc kubenswrapper[4751]: E1203 15:29:37.244697 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:29:44 crc kubenswrapper[4751]: I1203 15:29:44.075028 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_05d18e1b-04cd-4b4a-a728-bdbc9c2ab713/memcached/0.log" Dec 03 15:29:49 crc kubenswrapper[4751]: I1203 15:29:49.314000 4751 scope.go:117] "RemoveContainer" containerID="1a7d3374c595269f4a9d24e338e5e9eaf99cda12b8a6b0ee51fd39d900f10642" Dec 03 15:29:49 crc kubenswrapper[4751]: E1203 15:29:49.315987 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" 
podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:30:00 crc kubenswrapper[4751]: I1203 15:30:00.147784 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412930-m6fvr"] Dec 03 15:30:00 crc kubenswrapper[4751]: E1203 15:30:00.149932 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f051a78-7f78-4b65-a884-09efdf2e8089" containerName="container-00" Dec 03 15:30:00 crc kubenswrapper[4751]: I1203 15:30:00.150028 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f051a78-7f78-4b65-a884-09efdf2e8089" containerName="container-00" Dec 03 15:30:00 crc kubenswrapper[4751]: E1203 15:30:00.150092 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cafb339-3c19-48b7-8012-27f57e539659" containerName="extract-utilities" Dec 03 15:30:00 crc kubenswrapper[4751]: I1203 15:30:00.150151 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cafb339-3c19-48b7-8012-27f57e539659" containerName="extract-utilities" Dec 03 15:30:00 crc kubenswrapper[4751]: E1203 15:30:00.150215 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cafb339-3c19-48b7-8012-27f57e539659" containerName="extract-content" Dec 03 15:30:00 crc kubenswrapper[4751]: I1203 15:30:00.150273 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cafb339-3c19-48b7-8012-27f57e539659" containerName="extract-content" Dec 03 15:30:00 crc kubenswrapper[4751]: E1203 15:30:00.150377 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cafb339-3c19-48b7-8012-27f57e539659" containerName="registry-server" Dec 03 15:30:00 crc kubenswrapper[4751]: I1203 15:30:00.150462 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cafb339-3c19-48b7-8012-27f57e539659" containerName="registry-server" Dec 03 15:30:00 crc kubenswrapper[4751]: I1203 15:30:00.150758 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f051a78-7f78-4b65-a884-09efdf2e8089" 
containerName="container-00" Dec 03 15:30:00 crc kubenswrapper[4751]: I1203 15:30:00.150836 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cafb339-3c19-48b7-8012-27f57e539659" containerName="registry-server" Dec 03 15:30:00 crc kubenswrapper[4751]: I1203 15:30:00.151659 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412930-m6fvr" Dec 03 15:30:00 crc kubenswrapper[4751]: I1203 15:30:00.153636 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 15:30:00 crc kubenswrapper[4751]: I1203 15:30:00.153830 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 15:30:00 crc kubenswrapper[4751]: I1203 15:30:00.167085 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412930-m6fvr"] Dec 03 15:30:00 crc kubenswrapper[4751]: I1203 15:30:00.199743 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0e48920f-198b-4af0-aff3-4cb332b6f350-secret-volume\") pod \"collect-profiles-29412930-m6fvr\" (UID: \"0e48920f-198b-4af0-aff3-4cb332b6f350\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412930-m6fvr" Dec 03 15:30:00 crc kubenswrapper[4751]: I1203 15:30:00.200053 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x94p7\" (UniqueName: \"kubernetes.io/projected/0e48920f-198b-4af0-aff3-4cb332b6f350-kube-api-access-x94p7\") pod \"collect-profiles-29412930-m6fvr\" (UID: \"0e48920f-198b-4af0-aff3-4cb332b6f350\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412930-m6fvr" Dec 03 15:30:00 crc kubenswrapper[4751]: I1203 15:30:00.200356 
4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0e48920f-198b-4af0-aff3-4cb332b6f350-config-volume\") pod \"collect-profiles-29412930-m6fvr\" (UID: \"0e48920f-198b-4af0-aff3-4cb332b6f350\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412930-m6fvr" Dec 03 15:30:00 crc kubenswrapper[4751]: I1203 15:30:00.302442 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0e48920f-198b-4af0-aff3-4cb332b6f350-secret-volume\") pod \"collect-profiles-29412930-m6fvr\" (UID: \"0e48920f-198b-4af0-aff3-4cb332b6f350\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412930-m6fvr" Dec 03 15:30:00 crc kubenswrapper[4751]: I1203 15:30:00.302537 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x94p7\" (UniqueName: \"kubernetes.io/projected/0e48920f-198b-4af0-aff3-4cb332b6f350-kube-api-access-x94p7\") pod \"collect-profiles-29412930-m6fvr\" (UID: \"0e48920f-198b-4af0-aff3-4cb332b6f350\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412930-m6fvr" Dec 03 15:30:00 crc kubenswrapper[4751]: I1203 15:30:00.302625 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0e48920f-198b-4af0-aff3-4cb332b6f350-config-volume\") pod \"collect-profiles-29412930-m6fvr\" (UID: \"0e48920f-198b-4af0-aff3-4cb332b6f350\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412930-m6fvr" Dec 03 15:30:00 crc kubenswrapper[4751]: I1203 15:30:00.303999 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0e48920f-198b-4af0-aff3-4cb332b6f350-config-volume\") pod \"collect-profiles-29412930-m6fvr\" (UID: \"0e48920f-198b-4af0-aff3-4cb332b6f350\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412930-m6fvr" Dec 03 15:30:00 crc kubenswrapper[4751]: I1203 15:30:00.310009 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0e48920f-198b-4af0-aff3-4cb332b6f350-secret-volume\") pod \"collect-profiles-29412930-m6fvr\" (UID: \"0e48920f-198b-4af0-aff3-4cb332b6f350\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412930-m6fvr" Dec 03 15:30:00 crc kubenswrapper[4751]: I1203 15:30:00.332686 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x94p7\" (UniqueName: \"kubernetes.io/projected/0e48920f-198b-4af0-aff3-4cb332b6f350-kube-api-access-x94p7\") pod \"collect-profiles-29412930-m6fvr\" (UID: \"0e48920f-198b-4af0-aff3-4cb332b6f350\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412930-m6fvr" Dec 03 15:30:00 crc kubenswrapper[4751]: I1203 15:30:00.484976 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412930-m6fvr" Dec 03 15:30:01 crc kubenswrapper[4751]: I1203 15:30:01.376991 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412930-m6fvr"] Dec 03 15:30:01 crc kubenswrapper[4751]: I1203 15:30:01.516602 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412930-m6fvr" event={"ID":"0e48920f-198b-4af0-aff3-4cb332b6f350","Type":"ContainerStarted","Data":"d56e0ff672c49118000bb2cade6764a921d48c72238a897cef9b4f45275fbcb0"} Dec 03 15:30:02 crc kubenswrapper[4751]: I1203 15:30:02.528638 4751 generic.go:334] "Generic (PLEG): container finished" podID="0e48920f-198b-4af0-aff3-4cb332b6f350" containerID="46490b222299987dc5311aaa4c2d13d8e2c82553b37a439e7fe89c8dce9c91cc" exitCode=0 Dec 03 15:30:02 crc kubenswrapper[4751]: I1203 15:30:02.528707 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412930-m6fvr" event={"ID":"0e48920f-198b-4af0-aff3-4cb332b6f350","Type":"ContainerDied","Data":"46490b222299987dc5311aaa4c2d13d8e2c82553b37a439e7fe89c8dce9c91cc"} Dec 03 15:30:03 crc kubenswrapper[4751]: I1203 15:30:03.326728 4751 scope.go:117] "RemoveContainer" containerID="1a7d3374c595269f4a9d24e338e5e9eaf99cda12b8a6b0ee51fd39d900f10642" Dec 03 15:30:03 crc kubenswrapper[4751]: E1203 15:30:03.327296 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:30:04 crc kubenswrapper[4751]: I1203 15:30:04.072397 4751 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412930-m6fvr" Dec 03 15:30:04 crc kubenswrapper[4751]: I1203 15:30:04.209843 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0e48920f-198b-4af0-aff3-4cb332b6f350-secret-volume\") pod \"0e48920f-198b-4af0-aff3-4cb332b6f350\" (UID: \"0e48920f-198b-4af0-aff3-4cb332b6f350\") " Dec 03 15:30:04 crc kubenswrapper[4751]: I1203 15:30:04.209967 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x94p7\" (UniqueName: \"kubernetes.io/projected/0e48920f-198b-4af0-aff3-4cb332b6f350-kube-api-access-x94p7\") pod \"0e48920f-198b-4af0-aff3-4cb332b6f350\" (UID: \"0e48920f-198b-4af0-aff3-4cb332b6f350\") " Dec 03 15:30:04 crc kubenswrapper[4751]: I1203 15:30:04.210041 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0e48920f-198b-4af0-aff3-4cb332b6f350-config-volume\") pod \"0e48920f-198b-4af0-aff3-4cb332b6f350\" (UID: \"0e48920f-198b-4af0-aff3-4cb332b6f350\") " Dec 03 15:30:04 crc kubenswrapper[4751]: I1203 15:30:04.211228 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e48920f-198b-4af0-aff3-4cb332b6f350-config-volume" (OuterVolumeSpecName: "config-volume") pod "0e48920f-198b-4af0-aff3-4cb332b6f350" (UID: "0e48920f-198b-4af0-aff3-4cb332b6f350"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 15:30:04 crc kubenswrapper[4751]: I1203 15:30:04.217489 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e48920f-198b-4af0-aff3-4cb332b6f350-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0e48920f-198b-4af0-aff3-4cb332b6f350" (UID: "0e48920f-198b-4af0-aff3-4cb332b6f350"). 
InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 15:30:04 crc kubenswrapper[4751]: I1203 15:30:04.219286 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e48920f-198b-4af0-aff3-4cb332b6f350-kube-api-access-x94p7" (OuterVolumeSpecName: "kube-api-access-x94p7") pod "0e48920f-198b-4af0-aff3-4cb332b6f350" (UID: "0e48920f-198b-4af0-aff3-4cb332b6f350"). InnerVolumeSpecName "kube-api-access-x94p7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:30:04 crc kubenswrapper[4751]: I1203 15:30:04.312579 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x94p7\" (UniqueName: \"kubernetes.io/projected/0e48920f-198b-4af0-aff3-4cb332b6f350-kube-api-access-x94p7\") on node \"crc\" DevicePath \"\"" Dec 03 15:30:04 crc kubenswrapper[4751]: I1203 15:30:04.312629 4751 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0e48920f-198b-4af0-aff3-4cb332b6f350-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 15:30:04 crc kubenswrapper[4751]: I1203 15:30:04.312644 4751 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0e48920f-198b-4af0-aff3-4cb332b6f350-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 15:30:04 crc kubenswrapper[4751]: I1203 15:30:04.563996 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412930-m6fvr" event={"ID":"0e48920f-198b-4af0-aff3-4cb332b6f350","Type":"ContainerDied","Data":"d56e0ff672c49118000bb2cade6764a921d48c72238a897cef9b4f45275fbcb0"} Dec 03 15:30:04 crc kubenswrapper[4751]: I1203 15:30:04.564073 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d56e0ff672c49118000bb2cade6764a921d48c72238a897cef9b4f45275fbcb0" Dec 03 15:30:04 crc kubenswrapper[4751]: I1203 15:30:04.564101 4751 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412930-m6fvr" Dec 03 15:30:05 crc kubenswrapper[4751]: I1203 15:30:05.174237 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412885-mhqpt"] Dec 03 15:30:05 crc kubenswrapper[4751]: I1203 15:30:05.183408 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412885-mhqpt"] Dec 03 15:30:05 crc kubenswrapper[4751]: I1203 15:30:05.328140 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4277c05-1792-41f8-af0f-3403799bb1e5" path="/var/lib/kubelet/pods/c4277c05-1792-41f8-af0f-3403799bb1e5/volumes" Dec 03 15:30:08 crc kubenswrapper[4751]: I1203 15:30:08.878035 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg_86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0/util/0.log" Dec 03 15:30:09 crc kubenswrapper[4751]: I1203 15:30:09.117216 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg_86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0/util/0.log" Dec 03 15:30:09 crc kubenswrapper[4751]: I1203 15:30:09.117630 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg_86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0/pull/0.log" Dec 03 15:30:09 crc kubenswrapper[4751]: I1203 15:30:09.159590 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg_86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0/pull/0.log" Dec 03 15:30:09 crc kubenswrapper[4751]: I1203 15:30:09.571466 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg_86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0/extract/0.log" Dec 03 15:30:09 crc kubenswrapper[4751]: I1203 15:30:09.598917 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg_86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0/util/0.log" Dec 03 15:30:09 crc kubenswrapper[4751]: I1203 15:30:09.610292 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a8af69667a7a737f651a971b3e5784a713ec4a7a6c6070f3b32abb0530lcgjg_86ffc8f7-ced7-45c6-8c51-3f8301a0e8d0/pull/0.log" Dec 03 15:30:09 crc kubenswrapper[4751]: I1203 15:30:09.859475 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-qswcx_26689286-a791-485e-b442-9e399ae7a79b/kube-rbac-proxy/0.log" Dec 03 15:30:09 crc kubenswrapper[4751]: I1203 15:30:09.908847 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-qswcx_26689286-a791-485e-b442-9e399ae7a79b/manager/0.log" Dec 03 15:30:10 crc kubenswrapper[4751]: I1203 15:30:10.028132 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-7422h_b112bf8e-175b-4bc3-9840-6d134b4a1bce/kube-rbac-proxy/0.log" Dec 03 15:30:10 crc kubenswrapper[4751]: I1203 15:30:10.125316 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-7422h_b112bf8e-175b-4bc3-9840-6d134b4a1bce/manager/0.log" Dec 03 15:30:10 crc kubenswrapper[4751]: I1203 15:30:10.234565 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-n8td8_618af04c-a37d-4d21-bdba-345c9a63be07/kube-rbac-proxy/0.log" Dec 03 15:30:10 crc kubenswrapper[4751]: 
I1203 15:30:10.305076 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-n8td8_618af04c-a37d-4d21-bdba-345c9a63be07/manager/0.log" Dec 03 15:30:10 crc kubenswrapper[4751]: I1203 15:30:10.475806 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-6shnx_a975003d-b7d2-4a95-8571-571bc082021d/kube-rbac-proxy/0.log" Dec 03 15:30:10 crc kubenswrapper[4751]: I1203 15:30:10.590266 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-6shnx_a975003d-b7d2-4a95-8571-571bc082021d/manager/0.log" Dec 03 15:30:10 crc kubenswrapper[4751]: I1203 15:30:10.648549 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-m4q9z_ff012f7f-3431-472a-8b44-1fa7a47e74e1/kube-rbac-proxy/0.log" Dec 03 15:30:10 crc kubenswrapper[4751]: I1203 15:30:10.746636 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-m4q9z_ff012f7f-3431-472a-8b44-1fa7a47e74e1/manager/0.log" Dec 03 15:30:10 crc kubenswrapper[4751]: I1203 15:30:10.838312 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-8hhdk_b5d6b394-fe97-4e70-9916-9c6791379931/kube-rbac-proxy/0.log" Dec 03 15:30:10 crc kubenswrapper[4751]: I1203 15:30:10.905485 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-8hhdk_b5d6b394-fe97-4e70-9916-9c6791379931/manager/1.log" Dec 03 15:30:11 crc kubenswrapper[4751]: I1203 15:30:11.025607 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-8hhdk_b5d6b394-fe97-4e70-9916-9c6791379931/manager/0.log" Dec 03 
15:30:11 crc kubenswrapper[4751]: I1203 15:30:11.087233 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-ppb75_a54985ea-4d23-4a65-bd1a-1c9d059ea206/kube-rbac-proxy/0.log" Dec 03 15:30:11 crc kubenswrapper[4751]: I1203 15:30:11.351349 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-ppb75_a54985ea-4d23-4a65-bd1a-1c9d059ea206/manager/0.log" Dec 03 15:30:12 crc kubenswrapper[4751]: I1203 15:30:12.053162 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-pxjk5_2f31e262-8f03-4689-bc29-5d9d8b33a2cc/kube-rbac-proxy/0.log" Dec 03 15:30:12 crc kubenswrapper[4751]: I1203 15:30:12.089387 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-pxjk5_2f31e262-8f03-4689-bc29-5d9d8b33a2cc/manager/1.log" Dec 03 15:30:12 crc kubenswrapper[4751]: I1203 15:30:12.124238 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-pxjk5_2f31e262-8f03-4689-bc29-5d9d8b33a2cc/manager/0.log" Dec 03 15:30:12 crc kubenswrapper[4751]: I1203 15:30:12.311159 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-wgjr8_7f29786e-1f3c-4c92-81ac-4b6110cf03a3/kube-rbac-proxy/0.log" Dec 03 15:30:12 crc kubenswrapper[4751]: I1203 15:30:12.430058 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-wgjr8_7f29786e-1f3c-4c92-81ac-4b6110cf03a3/manager/0.log" Dec 03 15:30:12 crc kubenswrapper[4751]: I1203 15:30:12.554710 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-zjnwr_8adbadf1-f21d-4a09-acf7-d44a87bee356/kube-rbac-proxy/0.log" Dec 03 15:30:12 crc kubenswrapper[4751]: I1203 15:30:12.565725 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-zjnwr_8adbadf1-f21d-4a09-acf7-d44a87bee356/manager/0.log" Dec 03 15:30:12 crc kubenswrapper[4751]: I1203 15:30:12.830164 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-4nvpk_b4cb50e3-a93e-49b0-ac9c-6551046dc0be/kube-rbac-proxy/0.log" Dec 03 15:30:12 crc kubenswrapper[4751]: I1203 15:30:12.842207 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-4nvpk_b4cb50e3-a93e-49b0-ac9c-6551046dc0be/manager/0.log" Dec 03 15:30:12 crc kubenswrapper[4751]: I1203 15:30:12.855042 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-sxl98_4cd34243-8404-4cf7-9185-c012700b5814/kube-rbac-proxy/0.log" Dec 03 15:30:13 crc kubenswrapper[4751]: I1203 15:30:13.207308 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-sxl98_4cd34243-8404-4cf7-9185-c012700b5814/manager/0.log" Dec 03 15:30:13 crc kubenswrapper[4751]: I1203 15:30:13.305248 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-7tsfx_cbafd52a-d603-4b8c-a056-9a2a749bee21/kube-rbac-proxy/0.log" Dec 03 15:30:13 crc kubenswrapper[4751]: I1203 15:30:13.394592 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-7tsfx_cbafd52a-d603-4b8c-a056-9a2a749bee21/manager/0.log" Dec 03 15:30:14 crc kubenswrapper[4751]: I1203 15:30:14.022215 
4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-8h62v_5c3add92-6cee-4980-903f-692cfd4cf87c/kube-rbac-proxy/0.log" Dec 03 15:30:14 crc kubenswrapper[4751]: I1203 15:30:14.063544 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-8h62v_5c3add92-6cee-4980-903f-692cfd4cf87c/manager/1.log" Dec 03 15:30:14 crc kubenswrapper[4751]: I1203 15:30:14.105119 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-8h62v_5c3add92-6cee-4980-903f-692cfd4cf87c/manager/0.log" Dec 03 15:30:14 crc kubenswrapper[4751]: I1203 15:30:14.226851 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b_17b09c23-21ca-4060-840d-acbf71e22d55/kube-rbac-proxy/0.log" Dec 03 15:30:14 crc kubenswrapper[4751]: I1203 15:30:14.276867 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b_17b09c23-21ca-4060-840d-acbf71e22d55/manager/1.log" Dec 03 15:30:14 crc kubenswrapper[4751]: I1203 15:30:14.360603 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4kd62b_17b09c23-21ca-4060-840d-acbf71e22d55/manager/0.log" Dec 03 15:30:14 crc kubenswrapper[4751]: I1203 15:30:14.668754 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-jqt9t_8a1c208b-28d2-4d51-a98e-ffece8c3d11e/registry-server/0.log" Dec 03 15:30:14 crc kubenswrapper[4751]: I1203 15:30:14.755296 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-698cb7586c-qft9p_6aeb43b5-b817-4d39-81de-bc6f27afb55b/operator/0.log" Dec 03 
15:30:14 crc kubenswrapper[4751]: I1203 15:30:14.977742 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-k5j8b_75a825ba-e08d-440f-866d-d32d2ae812f1/kube-rbac-proxy/0.log" Dec 03 15:30:15 crc kubenswrapper[4751]: I1203 15:30:15.000316 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-gsp46_5434f233-b204-4db9-a93d-93d4342e4514/kube-rbac-proxy/0.log" Dec 03 15:30:15 crc kubenswrapper[4751]: I1203 15:30:15.054042 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-k5j8b_75a825ba-e08d-440f-866d-d32d2ae812f1/manager/0.log" Dec 03 15:30:15 crc kubenswrapper[4751]: I1203 15:30:15.253218 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-gsp46_5434f233-b204-4db9-a93d-93d4342e4514/manager/0.log" Dec 03 15:30:15 crc kubenswrapper[4751]: I1203 15:30:15.337499 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-4thwz_ac5fb8ca-3372-4c92-a4d2-9ff4b543f94d/operator/0.log" Dec 03 15:30:15 crc kubenswrapper[4751]: I1203 15:30:15.504887 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-w9fcj_81e287a7-6973-4561-a67a-a8783b0cedf5/kube-rbac-proxy/0.log" Dec 03 15:30:15 crc kubenswrapper[4751]: I1203 15:30:15.622179 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-w9fcj_81e287a7-6973-4561-a67a-a8783b0cedf5/manager/0.log" Dec 03 15:30:15 crc kubenswrapper[4751]: I1203 15:30:15.626957 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-9598fff97-l6gxb_2011fe70-e44a-4b63-8064-e3234a639fb8/kube-rbac-proxy/0.log" Dec 03 15:30:15 crc kubenswrapper[4751]: I1203 15:30:15.816645 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7f9b9ccb84-v5t4x_7a4eb3e2-25fa-43e4-9e49-135b5c087014/manager/0.log" Dec 03 15:30:15 crc kubenswrapper[4751]: I1203 15:30:15.906215 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-lgjvh_adff5e75-192d-4a27-a477-aa74dab8dd95/manager/1.log" Dec 03 15:30:15 crc kubenswrapper[4751]: I1203 15:30:15.906735 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-lgjvh_adff5e75-192d-4a27-a477-aa74dab8dd95/kube-rbac-proxy/0.log" Dec 03 15:30:16 crc kubenswrapper[4751]: I1203 15:30:16.120020 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-lgjvh_adff5e75-192d-4a27-a477-aa74dab8dd95/manager/0.log" Dec 03 15:30:16 crc kubenswrapper[4751]: I1203 15:30:16.172944 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-qgvqn_5950fcf6-2983-4341-ba48-12c27801a57e/kube-rbac-proxy/0.log" Dec 03 15:30:16 crc kubenswrapper[4751]: I1203 15:30:16.188315 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-9598fff97-l6gxb_2011fe70-e44a-4b63-8064-e3234a639fb8/manager/0.log" Dec 03 15:30:16 crc kubenswrapper[4751]: I1203 15:30:16.219103 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-qgvqn_5950fcf6-2983-4341-ba48-12c27801a57e/manager/0.log" Dec 03 15:30:18 crc kubenswrapper[4751]: I1203 15:30:18.314743 4751 
scope.go:117] "RemoveContainer" containerID="1a7d3374c595269f4a9d24e338e5e9eaf99cda12b8a6b0ee51fd39d900f10642" Dec 03 15:30:18 crc kubenswrapper[4751]: E1203 15:30:18.315448 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:30:30 crc kubenswrapper[4751]: I1203 15:30:30.007228 4751 scope.go:117] "RemoveContainer" containerID="d020baba10dfc88815a4f4ead45aec684e30d56934bac92135269776acd1226f" Dec 03 15:30:30 crc kubenswrapper[4751]: I1203 15:30:30.899355 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gst75"] Dec 03 15:30:30 crc kubenswrapper[4751]: E1203 15:30:30.900286 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e48920f-198b-4af0-aff3-4cb332b6f350" containerName="collect-profiles" Dec 03 15:30:30 crc kubenswrapper[4751]: I1203 15:30:30.900312 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e48920f-198b-4af0-aff3-4cb332b6f350" containerName="collect-profiles" Dec 03 15:30:30 crc kubenswrapper[4751]: I1203 15:30:30.900650 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e48920f-198b-4af0-aff3-4cb332b6f350" containerName="collect-profiles" Dec 03 15:30:30 crc kubenswrapper[4751]: I1203 15:30:30.902725 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gst75" Dec 03 15:30:30 crc kubenswrapper[4751]: I1203 15:30:30.912307 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gst75"] Dec 03 15:30:31 crc kubenswrapper[4751]: I1203 15:30:31.025024 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f550ad55-18d0-438c-8519-27ae87d3bf4c-utilities\") pod \"redhat-operators-gst75\" (UID: \"f550ad55-18d0-438c-8519-27ae87d3bf4c\") " pod="openshift-marketplace/redhat-operators-gst75" Dec 03 15:30:31 crc kubenswrapper[4751]: I1203 15:30:31.025240 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s245\" (UniqueName: \"kubernetes.io/projected/f550ad55-18d0-438c-8519-27ae87d3bf4c-kube-api-access-6s245\") pod \"redhat-operators-gst75\" (UID: \"f550ad55-18d0-438c-8519-27ae87d3bf4c\") " pod="openshift-marketplace/redhat-operators-gst75" Dec 03 15:30:31 crc kubenswrapper[4751]: I1203 15:30:31.025463 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f550ad55-18d0-438c-8519-27ae87d3bf4c-catalog-content\") pod \"redhat-operators-gst75\" (UID: \"f550ad55-18d0-438c-8519-27ae87d3bf4c\") " pod="openshift-marketplace/redhat-operators-gst75" Dec 03 15:30:31 crc kubenswrapper[4751]: I1203 15:30:31.128121 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f550ad55-18d0-438c-8519-27ae87d3bf4c-utilities\") pod \"redhat-operators-gst75\" (UID: \"f550ad55-18d0-438c-8519-27ae87d3bf4c\") " pod="openshift-marketplace/redhat-operators-gst75" Dec 03 15:30:31 crc kubenswrapper[4751]: I1203 15:30:31.128294 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6s245\" (UniqueName: \"kubernetes.io/projected/f550ad55-18d0-438c-8519-27ae87d3bf4c-kube-api-access-6s245\") pod \"redhat-operators-gst75\" (UID: \"f550ad55-18d0-438c-8519-27ae87d3bf4c\") " pod="openshift-marketplace/redhat-operators-gst75" Dec 03 15:30:31 crc kubenswrapper[4751]: I1203 15:30:31.128456 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f550ad55-18d0-438c-8519-27ae87d3bf4c-catalog-content\") pod \"redhat-operators-gst75\" (UID: \"f550ad55-18d0-438c-8519-27ae87d3bf4c\") " pod="openshift-marketplace/redhat-operators-gst75" Dec 03 15:30:31 crc kubenswrapper[4751]: I1203 15:30:31.128773 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f550ad55-18d0-438c-8519-27ae87d3bf4c-utilities\") pod \"redhat-operators-gst75\" (UID: \"f550ad55-18d0-438c-8519-27ae87d3bf4c\") " pod="openshift-marketplace/redhat-operators-gst75" Dec 03 15:30:31 crc kubenswrapper[4751]: I1203 15:30:31.128859 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f550ad55-18d0-438c-8519-27ae87d3bf4c-catalog-content\") pod \"redhat-operators-gst75\" (UID: \"f550ad55-18d0-438c-8519-27ae87d3bf4c\") " pod="openshift-marketplace/redhat-operators-gst75" Dec 03 15:30:31 crc kubenswrapper[4751]: I1203 15:30:31.158520 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s245\" (UniqueName: \"kubernetes.io/projected/f550ad55-18d0-438c-8519-27ae87d3bf4c-kube-api-access-6s245\") pod \"redhat-operators-gst75\" (UID: \"f550ad55-18d0-438c-8519-27ae87d3bf4c\") " pod="openshift-marketplace/redhat-operators-gst75" Dec 03 15:30:31 crc kubenswrapper[4751]: I1203 15:30:31.233883 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gst75" Dec 03 15:30:31 crc kubenswrapper[4751]: I1203 15:30:31.314658 4751 scope.go:117] "RemoveContainer" containerID="1a7d3374c595269f4a9d24e338e5e9eaf99cda12b8a6b0ee51fd39d900f10642" Dec 03 15:30:31 crc kubenswrapper[4751]: E1203 15:30:31.314905 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:30:31 crc kubenswrapper[4751]: I1203 15:30:31.830640 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gst75"] Dec 03 15:30:32 crc kubenswrapper[4751]: W1203 15:30:32.551818 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf550ad55_18d0_438c_8519_27ae87d3bf4c.slice/crio-6951a67d47697000ebe59342e5c6d1b667614910b74b7fe4c34cea4c938b3490 WatchSource:0}: Error finding container 6951a67d47697000ebe59342e5c6d1b667614910b74b7fe4c34cea4c938b3490: Status 404 returned error can't find the container with id 6951a67d47697000ebe59342e5c6d1b667614910b74b7fe4c34cea4c938b3490 Dec 03 15:30:32 crc kubenswrapper[4751]: I1203 15:30:32.855432 4751 generic.go:334] "Generic (PLEG): container finished" podID="f550ad55-18d0-438c-8519-27ae87d3bf4c" containerID="6bc87d8ae62dc955cbbca23d5722cfd487472f5699289db434b257d1753cac6d" exitCode=0 Dec 03 15:30:32 crc kubenswrapper[4751]: I1203 15:30:32.855488 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gst75" 
event={"ID":"f550ad55-18d0-438c-8519-27ae87d3bf4c","Type":"ContainerDied","Data":"6bc87d8ae62dc955cbbca23d5722cfd487472f5699289db434b257d1753cac6d"} Dec 03 15:30:32 crc kubenswrapper[4751]: I1203 15:30:32.855517 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gst75" event={"ID":"f550ad55-18d0-438c-8519-27ae87d3bf4c","Type":"ContainerStarted","Data":"6951a67d47697000ebe59342e5c6d1b667614910b74b7fe4c34cea4c938b3490"} Dec 03 15:30:33 crc kubenswrapper[4751]: I1203 15:30:33.890075 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gst75" event={"ID":"f550ad55-18d0-438c-8519-27ae87d3bf4c","Type":"ContainerStarted","Data":"4e8c0be2f78cf767b731278c5fa878148832244d6f29752a48e1ae1833388921"} Dec 03 15:30:37 crc kubenswrapper[4751]: I1203 15:30:37.933164 4751 generic.go:334] "Generic (PLEG): container finished" podID="f550ad55-18d0-438c-8519-27ae87d3bf4c" containerID="4e8c0be2f78cf767b731278c5fa878148832244d6f29752a48e1ae1833388921" exitCode=0 Dec 03 15:30:37 crc kubenswrapper[4751]: I1203 15:30:37.933320 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gst75" event={"ID":"f550ad55-18d0-438c-8519-27ae87d3bf4c","Type":"ContainerDied","Data":"4e8c0be2f78cf767b731278c5fa878148832244d6f29752a48e1ae1833388921"} Dec 03 15:30:38 crc kubenswrapper[4751]: I1203 15:30:38.947774 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gst75" event={"ID":"f550ad55-18d0-438c-8519-27ae87d3bf4c","Type":"ContainerStarted","Data":"613e3da38951b1ed47e8927524e9be9560d46c395afab0d4cb5320d8cb8e798d"} Dec 03 15:30:38 crc kubenswrapper[4751]: I1203 15:30:38.985092 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gst75" podStartSLOduration=3.493964668 podStartE2EDuration="8.985069542s" podCreationTimestamp="2025-12-03 15:30:30 +0000 UTC" 
firstStartedPulling="2025-12-03 15:30:32.864204271 +0000 UTC m=+4639.852559488" lastFinishedPulling="2025-12-03 15:30:38.355309145 +0000 UTC m=+4645.343664362" observedRunningTime="2025-12-03 15:30:38.979630166 +0000 UTC m=+4645.967985403" watchObservedRunningTime="2025-12-03 15:30:38.985069542 +0000 UTC m=+4645.973424759" Dec 03 15:30:40 crc kubenswrapper[4751]: I1203 15:30:40.041872 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-pppn7_89c33472-8c62-4a71-9b17-697f9a0bbc65/control-plane-machine-set-operator/0.log" Dec 03 15:30:40 crc kubenswrapper[4751]: I1203 15:30:40.287214 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-2n4v9_ab66f440-24ed-4244-a972-63eee27b67b1/kube-rbac-proxy/0.log" Dec 03 15:30:40 crc kubenswrapper[4751]: I1203 15:30:40.330513 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-2n4v9_ab66f440-24ed-4244-a972-63eee27b67b1/machine-api-operator/0.log" Dec 03 15:30:41 crc kubenswrapper[4751]: I1203 15:30:41.235092 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gst75" Dec 03 15:30:41 crc kubenswrapper[4751]: I1203 15:30:41.235185 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gst75" Dec 03 15:30:42 crc kubenswrapper[4751]: I1203 15:30:42.296831 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gst75" podUID="f550ad55-18d0-438c-8519-27ae87d3bf4c" containerName="registry-server" probeResult="failure" output=< Dec 03 15:30:42 crc kubenswrapper[4751]: timeout: failed to connect service ":50051" within 1s Dec 03 15:30:42 crc kubenswrapper[4751]: > Dec 03 15:30:46 crc kubenswrapper[4751]: I1203 15:30:46.313843 4751 scope.go:117] "RemoveContainer" 
containerID="1a7d3374c595269f4a9d24e338e5e9eaf99cda12b8a6b0ee51fd39d900f10642" Dec 03 15:30:46 crc kubenswrapper[4751]: E1203 15:30:46.314575 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:30:48 crc kubenswrapper[4751]: I1203 15:30:48.683703 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6cmx5"] Dec 03 15:30:48 crc kubenswrapper[4751]: I1203 15:30:48.687248 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6cmx5" Dec 03 15:30:48 crc kubenswrapper[4751]: I1203 15:30:48.696594 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6cmx5"] Dec 03 15:30:48 crc kubenswrapper[4751]: I1203 15:30:48.807612 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba94b979-b398-4fad-92ec-0d073d32f036-utilities\") pod \"community-operators-6cmx5\" (UID: \"ba94b979-b398-4fad-92ec-0d073d32f036\") " pod="openshift-marketplace/community-operators-6cmx5" Dec 03 15:30:48 crc kubenswrapper[4751]: I1203 15:30:48.807723 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lzqj\" (UniqueName: \"kubernetes.io/projected/ba94b979-b398-4fad-92ec-0d073d32f036-kube-api-access-7lzqj\") pod \"community-operators-6cmx5\" (UID: \"ba94b979-b398-4fad-92ec-0d073d32f036\") " pod="openshift-marketplace/community-operators-6cmx5" Dec 03 15:30:48 crc kubenswrapper[4751]: I1203 
15:30:48.808235 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba94b979-b398-4fad-92ec-0d073d32f036-catalog-content\") pod \"community-operators-6cmx5\" (UID: \"ba94b979-b398-4fad-92ec-0d073d32f036\") " pod="openshift-marketplace/community-operators-6cmx5" Dec 03 15:30:48 crc kubenswrapper[4751]: I1203 15:30:48.910150 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba94b979-b398-4fad-92ec-0d073d32f036-catalog-content\") pod \"community-operators-6cmx5\" (UID: \"ba94b979-b398-4fad-92ec-0d073d32f036\") " pod="openshift-marketplace/community-operators-6cmx5" Dec 03 15:30:48 crc kubenswrapper[4751]: I1203 15:30:48.910212 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba94b979-b398-4fad-92ec-0d073d32f036-utilities\") pod \"community-operators-6cmx5\" (UID: \"ba94b979-b398-4fad-92ec-0d073d32f036\") " pod="openshift-marketplace/community-operators-6cmx5" Dec 03 15:30:48 crc kubenswrapper[4751]: I1203 15:30:48.910270 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lzqj\" (UniqueName: \"kubernetes.io/projected/ba94b979-b398-4fad-92ec-0d073d32f036-kube-api-access-7lzqj\") pod \"community-operators-6cmx5\" (UID: \"ba94b979-b398-4fad-92ec-0d073d32f036\") " pod="openshift-marketplace/community-operators-6cmx5" Dec 03 15:30:48 crc kubenswrapper[4751]: I1203 15:30:48.910889 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba94b979-b398-4fad-92ec-0d073d32f036-catalog-content\") pod \"community-operators-6cmx5\" (UID: \"ba94b979-b398-4fad-92ec-0d073d32f036\") " pod="openshift-marketplace/community-operators-6cmx5" Dec 03 15:30:48 crc kubenswrapper[4751]: I1203 
15:30:48.910906 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba94b979-b398-4fad-92ec-0d073d32f036-utilities\") pod \"community-operators-6cmx5\" (UID: \"ba94b979-b398-4fad-92ec-0d073d32f036\") " pod="openshift-marketplace/community-operators-6cmx5" Dec 03 15:30:48 crc kubenswrapper[4751]: I1203 15:30:48.931860 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lzqj\" (UniqueName: \"kubernetes.io/projected/ba94b979-b398-4fad-92ec-0d073d32f036-kube-api-access-7lzqj\") pod \"community-operators-6cmx5\" (UID: \"ba94b979-b398-4fad-92ec-0d073d32f036\") " pod="openshift-marketplace/community-operators-6cmx5" Dec 03 15:30:49 crc kubenswrapper[4751]: I1203 15:30:49.021457 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6cmx5" Dec 03 15:30:49 crc kubenswrapper[4751]: I1203 15:30:49.546792 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6cmx5"] Dec 03 15:30:50 crc kubenswrapper[4751]: I1203 15:30:50.072454 4751 generic.go:334] "Generic (PLEG): container finished" podID="ba94b979-b398-4fad-92ec-0d073d32f036" containerID="55936728ab22c5d7a5edc88cd7c6fe4c65756bc99da29af7689b3d6ec08c94a4" exitCode=0 Dec 03 15:30:50 crc kubenswrapper[4751]: I1203 15:30:50.072529 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6cmx5" event={"ID":"ba94b979-b398-4fad-92ec-0d073d32f036","Type":"ContainerDied","Data":"55936728ab22c5d7a5edc88cd7c6fe4c65756bc99da29af7689b3d6ec08c94a4"} Dec 03 15:30:50 crc kubenswrapper[4751]: I1203 15:30:50.072837 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6cmx5" event={"ID":"ba94b979-b398-4fad-92ec-0d073d32f036","Type":"ContainerStarted","Data":"b20a280176d730515168d50fde220e36f3bc1a0b6dd4316f1c654317b1d09aa2"} 
Dec 03 15:30:51 crc kubenswrapper[4751]: I1203 15:30:51.085770 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6cmx5" event={"ID":"ba94b979-b398-4fad-92ec-0d073d32f036","Type":"ContainerStarted","Data":"b0e75f3603754c3f66010e2eb7d841518f6e7223e6ec9b0342e21e9b645fac26"} Dec 03 15:30:51 crc kubenswrapper[4751]: I1203 15:30:51.287783 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gst75" Dec 03 15:30:51 crc kubenswrapper[4751]: I1203 15:30:51.341417 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gst75" Dec 03 15:30:53 crc kubenswrapper[4751]: I1203 15:30:53.104984 4751 generic.go:334] "Generic (PLEG): container finished" podID="ba94b979-b398-4fad-92ec-0d073d32f036" containerID="b0e75f3603754c3f66010e2eb7d841518f6e7223e6ec9b0342e21e9b645fac26" exitCode=0 Dec 03 15:30:53 crc kubenswrapper[4751]: I1203 15:30:53.105094 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6cmx5" event={"ID":"ba94b979-b398-4fad-92ec-0d073d32f036","Type":"ContainerDied","Data":"b0e75f3603754c3f66010e2eb7d841518f6e7223e6ec9b0342e21e9b645fac26"} Dec 03 15:30:53 crc kubenswrapper[4751]: I1203 15:30:53.675603 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gst75"] Dec 03 15:30:53 crc kubenswrapper[4751]: I1203 15:30:53.676266 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gst75" podUID="f550ad55-18d0-438c-8519-27ae87d3bf4c" containerName="registry-server" containerID="cri-o://613e3da38951b1ed47e8927524e9be9560d46c395afab0d4cb5320d8cb8e798d" gracePeriod=2 Dec 03 15:30:54 crc kubenswrapper[4751]: I1203 15:30:54.124865 4751 generic.go:334] "Generic (PLEG): container finished" podID="f550ad55-18d0-438c-8519-27ae87d3bf4c" 
containerID="613e3da38951b1ed47e8927524e9be9560d46c395afab0d4cb5320d8cb8e798d" exitCode=0 Dec 03 15:30:54 crc kubenswrapper[4751]: I1203 15:30:54.125472 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gst75" event={"ID":"f550ad55-18d0-438c-8519-27ae87d3bf4c","Type":"ContainerDied","Data":"613e3da38951b1ed47e8927524e9be9560d46c395afab0d4cb5320d8cb8e798d"} Dec 03 15:30:54 crc kubenswrapper[4751]: I1203 15:30:54.136765 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6cmx5" event={"ID":"ba94b979-b398-4fad-92ec-0d073d32f036","Type":"ContainerStarted","Data":"fdd8675b24444bc6dd2db5ecfeab2ec74fca7d4b726d4a012297b27e6a7a036b"} Dec 03 15:30:54 crc kubenswrapper[4751]: I1203 15:30:54.289591 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gst75" Dec 03 15:30:54 crc kubenswrapper[4751]: I1203 15:30:54.321188 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6cmx5" podStartSLOduration=2.864022591 podStartE2EDuration="6.321162401s" podCreationTimestamp="2025-12-03 15:30:48 +0000 UTC" firstStartedPulling="2025-12-03 15:30:50.074593019 +0000 UTC m=+4657.062948236" lastFinishedPulling="2025-12-03 15:30:53.531732829 +0000 UTC m=+4660.520088046" observedRunningTime="2025-12-03 15:30:54.17056139 +0000 UTC m=+4661.158916627" watchObservedRunningTime="2025-12-03 15:30:54.321162401 +0000 UTC m=+4661.309517618" Dec 03 15:30:54 crc kubenswrapper[4751]: I1203 15:30:54.454498 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f550ad55-18d0-438c-8519-27ae87d3bf4c-utilities\") pod \"f550ad55-18d0-438c-8519-27ae87d3bf4c\" (UID: \"f550ad55-18d0-438c-8519-27ae87d3bf4c\") " Dec 03 15:30:54 crc kubenswrapper[4751]: I1203 15:30:54.454678 4751 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-6s245\" (UniqueName: \"kubernetes.io/projected/f550ad55-18d0-438c-8519-27ae87d3bf4c-kube-api-access-6s245\") pod \"f550ad55-18d0-438c-8519-27ae87d3bf4c\" (UID: \"f550ad55-18d0-438c-8519-27ae87d3bf4c\") " Dec 03 15:30:54 crc kubenswrapper[4751]: I1203 15:30:54.454902 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f550ad55-18d0-438c-8519-27ae87d3bf4c-catalog-content\") pod \"f550ad55-18d0-438c-8519-27ae87d3bf4c\" (UID: \"f550ad55-18d0-438c-8519-27ae87d3bf4c\") " Dec 03 15:30:54 crc kubenswrapper[4751]: I1203 15:30:54.455249 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f550ad55-18d0-438c-8519-27ae87d3bf4c-utilities" (OuterVolumeSpecName: "utilities") pod "f550ad55-18d0-438c-8519-27ae87d3bf4c" (UID: "f550ad55-18d0-438c-8519-27ae87d3bf4c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:30:54 crc kubenswrapper[4751]: I1203 15:30:54.455636 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f550ad55-18d0-438c-8519-27ae87d3bf4c-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 15:30:54 crc kubenswrapper[4751]: I1203 15:30:54.460524 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f550ad55-18d0-438c-8519-27ae87d3bf4c-kube-api-access-6s245" (OuterVolumeSpecName: "kube-api-access-6s245") pod "f550ad55-18d0-438c-8519-27ae87d3bf4c" (UID: "f550ad55-18d0-438c-8519-27ae87d3bf4c"). InnerVolumeSpecName "kube-api-access-6s245". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:30:54 crc kubenswrapper[4751]: I1203 15:30:54.557999 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6s245\" (UniqueName: \"kubernetes.io/projected/f550ad55-18d0-438c-8519-27ae87d3bf4c-kube-api-access-6s245\") on node \"crc\" DevicePath \"\"" Dec 03 15:30:54 crc kubenswrapper[4751]: I1203 15:30:54.579642 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f550ad55-18d0-438c-8519-27ae87d3bf4c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f550ad55-18d0-438c-8519-27ae87d3bf4c" (UID: "f550ad55-18d0-438c-8519-27ae87d3bf4c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:30:54 crc kubenswrapper[4751]: I1203 15:30:54.659804 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f550ad55-18d0-438c-8519-27ae87d3bf4c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 15:30:55 crc kubenswrapper[4751]: I1203 15:30:55.179538 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gst75" event={"ID":"f550ad55-18d0-438c-8519-27ae87d3bf4c","Type":"ContainerDied","Data":"6951a67d47697000ebe59342e5c6d1b667614910b74b7fe4c34cea4c938b3490"} Dec 03 15:30:55 crc kubenswrapper[4751]: I1203 15:30:55.181433 4751 scope.go:117] "RemoveContainer" containerID="613e3da38951b1ed47e8927524e9be9560d46c395afab0d4cb5320d8cb8e798d" Dec 03 15:30:55 crc kubenswrapper[4751]: I1203 15:30:55.180643 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gst75" Dec 03 15:30:55 crc kubenswrapper[4751]: I1203 15:30:55.206230 4751 scope.go:117] "RemoveContainer" containerID="4e8c0be2f78cf767b731278c5fa878148832244d6f29752a48e1ae1833388921" Dec 03 15:30:55 crc kubenswrapper[4751]: I1203 15:30:55.240205 4751 scope.go:117] "RemoveContainer" containerID="6bc87d8ae62dc955cbbca23d5722cfd487472f5699289db434b257d1753cac6d" Dec 03 15:30:55 crc kubenswrapper[4751]: I1203 15:30:55.246358 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gst75"] Dec 03 15:30:55 crc kubenswrapper[4751]: I1203 15:30:55.258741 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gst75"] Dec 03 15:30:55 crc kubenswrapper[4751]: I1203 15:30:55.330610 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f550ad55-18d0-438c-8519-27ae87d3bf4c" path="/var/lib/kubelet/pods/f550ad55-18d0-438c-8519-27ae87d3bf4c/volumes" Dec 03 15:30:55 crc kubenswrapper[4751]: I1203 15:30:55.419088 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-wb55f_6bc5bb21-6b5f-4b06-a96a-2e5883752c9a/cert-manager-controller/0.log" Dec 03 15:30:55 crc kubenswrapper[4751]: I1203 15:30:55.625778 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-rwllx_c09475fb-946d-45c4-8482-4db508ae7459/cert-manager-cainjector/0.log" Dec 03 15:30:55 crc kubenswrapper[4751]: I1203 15:30:55.717750 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-7jwb7_d42519d9-b7e8-4c0b-bcac-b4269faf605a/cert-manager-webhook/0.log" Dec 03 15:30:57 crc kubenswrapper[4751]: I1203 15:30:57.314650 4751 scope.go:117] "RemoveContainer" containerID="1a7d3374c595269f4a9d24e338e5e9eaf99cda12b8a6b0ee51fd39d900f10642" Dec 03 15:30:57 crc kubenswrapper[4751]: E1203 15:30:57.314913 
4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:30:59 crc kubenswrapper[4751]: I1203 15:30:59.021727 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6cmx5" Dec 03 15:30:59 crc kubenswrapper[4751]: I1203 15:30:59.022539 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6cmx5" Dec 03 15:30:59 crc kubenswrapper[4751]: I1203 15:30:59.081305 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6cmx5" Dec 03 15:30:59 crc kubenswrapper[4751]: I1203 15:30:59.276783 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6cmx5" Dec 03 15:30:59 crc kubenswrapper[4751]: I1203 15:30:59.475549 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6cmx5"] Dec 03 15:31:01 crc kubenswrapper[4751]: I1203 15:31:01.234115 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6cmx5" podUID="ba94b979-b398-4fad-92ec-0d073d32f036" containerName="registry-server" containerID="cri-o://fdd8675b24444bc6dd2db5ecfeab2ec74fca7d4b726d4a012297b27e6a7a036b" gracePeriod=2 Dec 03 15:31:01 crc kubenswrapper[4751]: I1203 15:31:01.978771 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6cmx5" Dec 03 15:31:02 crc kubenswrapper[4751]: I1203 15:31:02.113590 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba94b979-b398-4fad-92ec-0d073d32f036-utilities\") pod \"ba94b979-b398-4fad-92ec-0d073d32f036\" (UID: \"ba94b979-b398-4fad-92ec-0d073d32f036\") " Dec 03 15:31:02 crc kubenswrapper[4751]: I1203 15:31:02.113823 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba94b979-b398-4fad-92ec-0d073d32f036-catalog-content\") pod \"ba94b979-b398-4fad-92ec-0d073d32f036\" (UID: \"ba94b979-b398-4fad-92ec-0d073d32f036\") " Dec 03 15:31:02 crc kubenswrapper[4751]: I1203 15:31:02.113934 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lzqj\" (UniqueName: \"kubernetes.io/projected/ba94b979-b398-4fad-92ec-0d073d32f036-kube-api-access-7lzqj\") pod \"ba94b979-b398-4fad-92ec-0d073d32f036\" (UID: \"ba94b979-b398-4fad-92ec-0d073d32f036\") " Dec 03 15:31:02 crc kubenswrapper[4751]: I1203 15:31:02.114447 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba94b979-b398-4fad-92ec-0d073d32f036-utilities" (OuterVolumeSpecName: "utilities") pod "ba94b979-b398-4fad-92ec-0d073d32f036" (UID: "ba94b979-b398-4fad-92ec-0d073d32f036"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:31:02 crc kubenswrapper[4751]: I1203 15:31:02.114832 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba94b979-b398-4fad-92ec-0d073d32f036-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 15:31:02 crc kubenswrapper[4751]: I1203 15:31:02.120088 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba94b979-b398-4fad-92ec-0d073d32f036-kube-api-access-7lzqj" (OuterVolumeSpecName: "kube-api-access-7lzqj") pod "ba94b979-b398-4fad-92ec-0d073d32f036" (UID: "ba94b979-b398-4fad-92ec-0d073d32f036"). InnerVolumeSpecName "kube-api-access-7lzqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:31:02 crc kubenswrapper[4751]: I1203 15:31:02.178931 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba94b979-b398-4fad-92ec-0d073d32f036-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ba94b979-b398-4fad-92ec-0d073d32f036" (UID: "ba94b979-b398-4fad-92ec-0d073d32f036"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:31:02 crc kubenswrapper[4751]: I1203 15:31:02.216835 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba94b979-b398-4fad-92ec-0d073d32f036-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 15:31:02 crc kubenswrapper[4751]: I1203 15:31:02.216872 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lzqj\" (UniqueName: \"kubernetes.io/projected/ba94b979-b398-4fad-92ec-0d073d32f036-kube-api-access-7lzqj\") on node \"crc\" DevicePath \"\"" Dec 03 15:31:02 crc kubenswrapper[4751]: I1203 15:31:02.247431 4751 generic.go:334] "Generic (PLEG): container finished" podID="ba94b979-b398-4fad-92ec-0d073d32f036" containerID="fdd8675b24444bc6dd2db5ecfeab2ec74fca7d4b726d4a012297b27e6a7a036b" exitCode=0 Dec 03 15:31:02 crc kubenswrapper[4751]: I1203 15:31:02.247482 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6cmx5" Dec 03 15:31:02 crc kubenswrapper[4751]: I1203 15:31:02.247482 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6cmx5" event={"ID":"ba94b979-b398-4fad-92ec-0d073d32f036","Type":"ContainerDied","Data":"fdd8675b24444bc6dd2db5ecfeab2ec74fca7d4b726d4a012297b27e6a7a036b"} Dec 03 15:31:02 crc kubenswrapper[4751]: I1203 15:31:02.247604 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6cmx5" event={"ID":"ba94b979-b398-4fad-92ec-0d073d32f036","Type":"ContainerDied","Data":"b20a280176d730515168d50fde220e36f3bc1a0b6dd4316f1c654317b1d09aa2"} Dec 03 15:31:02 crc kubenswrapper[4751]: I1203 15:31:02.247627 4751 scope.go:117] "RemoveContainer" containerID="fdd8675b24444bc6dd2db5ecfeab2ec74fca7d4b726d4a012297b27e6a7a036b" Dec 03 15:31:02 crc kubenswrapper[4751]: I1203 15:31:02.285028 4751 scope.go:117] "RemoveContainer" 
containerID="b0e75f3603754c3f66010e2eb7d841518f6e7223e6ec9b0342e21e9b645fac26" Dec 03 15:31:02 crc kubenswrapper[4751]: I1203 15:31:02.291051 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6cmx5"] Dec 03 15:31:02 crc kubenswrapper[4751]: I1203 15:31:02.302682 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6cmx5"] Dec 03 15:31:02 crc kubenswrapper[4751]: I1203 15:31:02.304293 4751 scope.go:117] "RemoveContainer" containerID="55936728ab22c5d7a5edc88cd7c6fe4c65756bc99da29af7689b3d6ec08c94a4" Dec 03 15:31:02 crc kubenswrapper[4751]: I1203 15:31:02.370257 4751 scope.go:117] "RemoveContainer" containerID="fdd8675b24444bc6dd2db5ecfeab2ec74fca7d4b726d4a012297b27e6a7a036b" Dec 03 15:31:02 crc kubenswrapper[4751]: E1203 15:31:02.370931 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdd8675b24444bc6dd2db5ecfeab2ec74fca7d4b726d4a012297b27e6a7a036b\": container with ID starting with fdd8675b24444bc6dd2db5ecfeab2ec74fca7d4b726d4a012297b27e6a7a036b not found: ID does not exist" containerID="fdd8675b24444bc6dd2db5ecfeab2ec74fca7d4b726d4a012297b27e6a7a036b" Dec 03 15:31:02 crc kubenswrapper[4751]: I1203 15:31:02.370969 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdd8675b24444bc6dd2db5ecfeab2ec74fca7d4b726d4a012297b27e6a7a036b"} err="failed to get container status \"fdd8675b24444bc6dd2db5ecfeab2ec74fca7d4b726d4a012297b27e6a7a036b\": rpc error: code = NotFound desc = could not find container \"fdd8675b24444bc6dd2db5ecfeab2ec74fca7d4b726d4a012297b27e6a7a036b\": container with ID starting with fdd8675b24444bc6dd2db5ecfeab2ec74fca7d4b726d4a012297b27e6a7a036b not found: ID does not exist" Dec 03 15:31:02 crc kubenswrapper[4751]: I1203 15:31:02.370997 4751 scope.go:117] "RemoveContainer" 
containerID="b0e75f3603754c3f66010e2eb7d841518f6e7223e6ec9b0342e21e9b645fac26" Dec 03 15:31:02 crc kubenswrapper[4751]: E1203 15:31:02.371387 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0e75f3603754c3f66010e2eb7d841518f6e7223e6ec9b0342e21e9b645fac26\": container with ID starting with b0e75f3603754c3f66010e2eb7d841518f6e7223e6ec9b0342e21e9b645fac26 not found: ID does not exist" containerID="b0e75f3603754c3f66010e2eb7d841518f6e7223e6ec9b0342e21e9b645fac26" Dec 03 15:31:02 crc kubenswrapper[4751]: I1203 15:31:02.371408 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0e75f3603754c3f66010e2eb7d841518f6e7223e6ec9b0342e21e9b645fac26"} err="failed to get container status \"b0e75f3603754c3f66010e2eb7d841518f6e7223e6ec9b0342e21e9b645fac26\": rpc error: code = NotFound desc = could not find container \"b0e75f3603754c3f66010e2eb7d841518f6e7223e6ec9b0342e21e9b645fac26\": container with ID starting with b0e75f3603754c3f66010e2eb7d841518f6e7223e6ec9b0342e21e9b645fac26 not found: ID does not exist" Dec 03 15:31:02 crc kubenswrapper[4751]: I1203 15:31:02.371422 4751 scope.go:117] "RemoveContainer" containerID="55936728ab22c5d7a5edc88cd7c6fe4c65756bc99da29af7689b3d6ec08c94a4" Dec 03 15:31:02 crc kubenswrapper[4751]: E1203 15:31:02.371674 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55936728ab22c5d7a5edc88cd7c6fe4c65756bc99da29af7689b3d6ec08c94a4\": container with ID starting with 55936728ab22c5d7a5edc88cd7c6fe4c65756bc99da29af7689b3d6ec08c94a4 not found: ID does not exist" containerID="55936728ab22c5d7a5edc88cd7c6fe4c65756bc99da29af7689b3d6ec08c94a4" Dec 03 15:31:02 crc kubenswrapper[4751]: I1203 15:31:02.371701 4751 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"55936728ab22c5d7a5edc88cd7c6fe4c65756bc99da29af7689b3d6ec08c94a4"} err="failed to get container status \"55936728ab22c5d7a5edc88cd7c6fe4c65756bc99da29af7689b3d6ec08c94a4\": rpc error: code = NotFound desc = could not find container \"55936728ab22c5d7a5edc88cd7c6fe4c65756bc99da29af7689b3d6ec08c94a4\": container with ID starting with 55936728ab22c5d7a5edc88cd7c6fe4c65756bc99da29af7689b3d6ec08c94a4 not found: ID does not exist" Dec 03 15:31:03 crc kubenswrapper[4751]: I1203 15:31:03.327404 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba94b979-b398-4fad-92ec-0d073d32f036" path="/var/lib/kubelet/pods/ba94b979-b398-4fad-92ec-0d073d32f036/volumes" Dec 03 15:31:09 crc kubenswrapper[4751]: I1203 15:31:09.688318 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-bjml6_40a56b56-06b1-4640-b817-4b22a08cdfea/nmstate-console-plugin/0.log" Dec 03 15:31:09 crc kubenswrapper[4751]: I1203 15:31:09.864752 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-znb2g_f41e25fd-38de-45a7-95dd-d0172caa1353/nmstate-handler/0.log" Dec 03 15:31:09 crc kubenswrapper[4751]: I1203 15:31:09.923209 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-zmfgm_2dfea938-9795-4c3d-a42c-f4c7cbe57dae/kube-rbac-proxy/0.log" Dec 03 15:31:10 crc kubenswrapper[4751]: I1203 15:31:10.074801 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-zmfgm_2dfea938-9795-4c3d-a42c-f4c7cbe57dae/nmstate-metrics/0.log" Dec 03 15:31:10 crc kubenswrapper[4751]: I1203 15:31:10.151206 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-4xzfk_65d126c6-b570-490c-bae2-71a3a7fa0832/nmstate-operator/0.log" Dec 03 15:31:10 crc kubenswrapper[4751]: I1203 15:31:10.314506 4751 scope.go:117] 
"RemoveContainer" containerID="1a7d3374c595269f4a9d24e338e5e9eaf99cda12b8a6b0ee51fd39d900f10642" Dec 03 15:31:10 crc kubenswrapper[4751]: E1203 15:31:10.314949 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:31:10 crc kubenswrapper[4751]: I1203 15:31:10.340786 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-7hdqr_d0b754bd-d0ac-42e5-87a1-6f4132d926a9/nmstate-webhook/0.log" Dec 03 15:31:24 crc kubenswrapper[4751]: I1203 15:31:24.868612 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-766794d8b8-zzghr_b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238/kube-rbac-proxy/0.log" Dec 03 15:31:24 crc kubenswrapper[4751]: I1203 15:31:24.872110 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-766794d8b8-zzghr_b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238/manager/1.log" Dec 03 15:31:25 crc kubenswrapper[4751]: I1203 15:31:25.096792 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-766794d8b8-zzghr_b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238/manager/0.log" Dec 03 15:31:25 crc kubenswrapper[4751]: I1203 15:31:25.313962 4751 scope.go:117] "RemoveContainer" containerID="1a7d3374c595269f4a9d24e338e5e9eaf99cda12b8a6b0ee51fd39d900f10642" Dec 03 15:31:25 crc kubenswrapper[4751]: E1203 15:31:25.314303 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:31:40 crc kubenswrapper[4751]: I1203 15:31:40.313591 4751 scope.go:117] "RemoveContainer" containerID="1a7d3374c595269f4a9d24e338e5e9eaf99cda12b8a6b0ee51fd39d900f10642" Dec 03 15:31:40 crc kubenswrapper[4751]: E1203 15:31:40.314296 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:31:40 crc kubenswrapper[4751]: I1203 15:31:40.937231 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-92ghf_4cd8b40c-f374-4b29-96a6-94137d11fe90/kube-rbac-proxy/0.log" Dec 03 15:31:41 crc kubenswrapper[4751]: I1203 15:31:41.170116 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-92ghf_4cd8b40c-f374-4b29-96a6-94137d11fe90/controller/0.log" Dec 03 15:31:41 crc kubenswrapper[4751]: I1203 15:31:41.220443 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/cp-frr-files/0.log" Dec 03 15:31:41 crc kubenswrapper[4751]: I1203 15:31:41.503712 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/cp-metrics/0.log" Dec 03 15:31:41 crc kubenswrapper[4751]: I1203 15:31:41.549990 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/cp-reloader/0.log" Dec 03 15:31:41 crc kubenswrapper[4751]: I1203 15:31:41.552167 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/cp-frr-files/0.log" Dec 03 15:31:41 crc kubenswrapper[4751]: I1203 15:31:41.555267 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/cp-reloader/0.log" Dec 03 15:31:41 crc kubenswrapper[4751]: I1203 15:31:41.716846 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/cp-reloader/0.log" Dec 03 15:31:41 crc kubenswrapper[4751]: I1203 15:31:41.722761 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/cp-frr-files/0.log" Dec 03 15:31:41 crc kubenswrapper[4751]: I1203 15:31:41.760846 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/cp-metrics/0.log" Dec 03 15:31:41 crc kubenswrapper[4751]: I1203 15:31:41.774899 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/cp-metrics/0.log" Dec 03 15:31:41 crc kubenswrapper[4751]: I1203 15:31:41.956193 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/cp-frr-files/0.log" Dec 03 15:31:41 crc kubenswrapper[4751]: I1203 15:31:41.971362 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/cp-reloader/0.log" Dec 03 15:31:42 crc kubenswrapper[4751]: I1203 15:31:42.205094 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/cp-metrics/0.log" Dec 03 15:31:42 crc kubenswrapper[4751]: I1203 15:31:42.210033 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/controller/0.log" Dec 03 15:31:42 crc kubenswrapper[4751]: I1203 15:31:42.451481 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/kube-rbac-proxy-frr/0.log" Dec 03 15:31:42 crc kubenswrapper[4751]: I1203 15:31:42.475256 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/frr-metrics/0.log" Dec 03 15:31:42 crc kubenswrapper[4751]: I1203 15:31:42.481969 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/kube-rbac-proxy/0.log" Dec 03 15:31:42 crc kubenswrapper[4751]: I1203 15:31:42.709793 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-9zqk5_0d711f1f-ab39-4d20-951b-398bd5c7226c/frr-k8s-webhook-server/0.log" Dec 03 15:31:42 crc kubenswrapper[4751]: I1203 15:31:42.737177 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/reloader/0.log" Dec 03 15:31:43 crc kubenswrapper[4751]: I1203 15:31:43.011406 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-75cdb5998d-hbntt_14960a87-3612-433e-bd1e-b548b0118a2c/manager/1.log" Dec 03 15:31:43 crc kubenswrapper[4751]: I1203 15:31:43.041824 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-75cdb5998d-hbntt_14960a87-3612-433e-bd1e-b548b0118a2c/manager/0.log" Dec 03 15:31:43 crc kubenswrapper[4751]: I1203 15:31:43.259350 4751 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7cf457fc-2z2zk_ab9be016-ca60-474a-85d3-7c3ca149e87d/webhook-server/0.log" Dec 03 15:31:43 crc kubenswrapper[4751]: I1203 15:31:43.537254 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-rgj2q_3afabeea-33c4-4bed-a2ca-440c78ff75ad/kube-rbac-proxy/0.log" Dec 03 15:31:43 crc kubenswrapper[4751]: I1203 15:31:43.935981 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4fx8c_20dc27ab-4ebf-46d7-8f6e-8c703e66fa92/frr/0.log" Dec 03 15:31:44 crc kubenswrapper[4751]: I1203 15:31:44.088525 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-rgj2q_3afabeea-33c4-4bed-a2ca-440c78ff75ad/speaker/0.log" Dec 03 15:31:53 crc kubenswrapper[4751]: I1203 15:31:53.322271 4751 scope.go:117] "RemoveContainer" containerID="1a7d3374c595269f4a9d24e338e5e9eaf99cda12b8a6b0ee51fd39d900f10642" Dec 03 15:31:53 crc kubenswrapper[4751]: E1203 15:31:53.323113 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:31:58 crc kubenswrapper[4751]: I1203 15:31:58.572944 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct_9a93d622-3d27-473f-92bb-ffe7b9ec4239/util/0.log" Dec 03 15:31:59 crc kubenswrapper[4751]: I1203 15:31:59.392490 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct_9a93d622-3d27-473f-92bb-ffe7b9ec4239/util/0.log" Dec 03 
15:31:59 crc kubenswrapper[4751]: I1203 15:31:59.435443 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct_9a93d622-3d27-473f-92bb-ffe7b9ec4239/pull/0.log" Dec 03 15:31:59 crc kubenswrapper[4751]: I1203 15:31:59.450930 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct_9a93d622-3d27-473f-92bb-ffe7b9ec4239/pull/0.log" Dec 03 15:31:59 crc kubenswrapper[4751]: I1203 15:31:59.615618 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct_9a93d622-3d27-473f-92bb-ffe7b9ec4239/util/0.log" Dec 03 15:31:59 crc kubenswrapper[4751]: I1203 15:31:59.663533 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct_9a93d622-3d27-473f-92bb-ffe7b9ec4239/extract/0.log" Dec 03 15:31:59 crc kubenswrapper[4751]: I1203 15:31:59.697397 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab6942rjct_9a93d622-3d27-473f-92bb-ffe7b9ec4239/pull/0.log" Dec 03 15:31:59 crc kubenswrapper[4751]: I1203 15:31:59.842300 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6_5be1a950-9285-46c8-af53-976abeddd5fb/util/0.log" Dec 03 15:32:00 crc kubenswrapper[4751]: I1203 15:32:00.014724 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6_5be1a950-9285-46c8-af53-976abeddd5fb/util/0.log" Dec 03 15:32:00 crc kubenswrapper[4751]: I1203 15:32:00.033972 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6_5be1a950-9285-46c8-af53-976abeddd5fb/pull/0.log" Dec 03 15:32:00 crc kubenswrapper[4751]: I1203 15:32:00.070471 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6_5be1a950-9285-46c8-af53-976abeddd5fb/pull/0.log" Dec 03 15:32:00 crc kubenswrapper[4751]: I1203 15:32:00.214209 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6_5be1a950-9285-46c8-af53-976abeddd5fb/util/0.log" Dec 03 15:32:00 crc kubenswrapper[4751]: I1203 15:32:00.225283 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6_5be1a950-9285-46c8-af53-976abeddd5fb/pull/0.log" Dec 03 15:32:00 crc kubenswrapper[4751]: I1203 15:32:00.251821 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303pxsp6_5be1a950-9285-46c8-af53-976abeddd5fb/extract/0.log" Dec 03 15:32:00 crc kubenswrapper[4751]: I1203 15:32:00.412597 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9_b30158de-69d4-4a93-9952-9b61fd08e5cd/util/0.log" Dec 03 15:32:00 crc kubenswrapper[4751]: I1203 15:32:00.616540 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9_b30158de-69d4-4a93-9952-9b61fd08e5cd/pull/0.log" Dec 03 15:32:00 crc kubenswrapper[4751]: I1203 15:32:00.622839 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9_b30158de-69d4-4a93-9952-9b61fd08e5cd/util/0.log" Dec 03 
15:32:00 crc kubenswrapper[4751]: I1203 15:32:00.657905 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9_b30158de-69d4-4a93-9952-9b61fd08e5cd/pull/0.log" Dec 03 15:32:00 crc kubenswrapper[4751]: I1203 15:32:00.823680 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9_b30158de-69d4-4a93-9952-9b61fd08e5cd/util/0.log" Dec 03 15:32:01 crc kubenswrapper[4751]: I1203 15:32:01.450162 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9_b30158de-69d4-4a93-9952-9b61fd08e5cd/pull/0.log" Dec 03 15:32:01 crc kubenswrapper[4751]: I1203 15:32:01.463876 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwv4r9_b30158de-69d4-4a93-9952-9b61fd08e5cd/extract/0.log" Dec 03 15:32:01 crc kubenswrapper[4751]: I1203 15:32:01.528853 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj_f6516bb2-c6eb-464d-a533-03917cbf52e4/util/0.log" Dec 03 15:32:01 crc kubenswrapper[4751]: I1203 15:32:01.701961 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj_f6516bb2-c6eb-464d-a533-03917cbf52e4/pull/0.log" Dec 03 15:32:01 crc kubenswrapper[4751]: I1203 15:32:01.720148 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj_f6516bb2-c6eb-464d-a533-03917cbf52e4/util/0.log" Dec 03 15:32:01 crc kubenswrapper[4751]: I1203 15:32:01.781015 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj_f6516bb2-c6eb-464d-a533-03917cbf52e4/pull/0.log" Dec 03 15:32:01 crc kubenswrapper[4751]: I1203 15:32:01.990221 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj_f6516bb2-c6eb-464d-a533-03917cbf52e4/util/0.log" Dec 03 15:32:01 crc kubenswrapper[4751]: I1203 15:32:01.991213 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj_f6516bb2-c6eb-464d-a533-03917cbf52e4/pull/0.log" Dec 03 15:32:02 crc kubenswrapper[4751]: I1203 15:32:02.053058 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92109rkpj_f6516bb2-c6eb-464d-a533-03917cbf52e4/extract/0.log" Dec 03 15:32:02 crc kubenswrapper[4751]: I1203 15:32:02.207619 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9_abecd931-d6b1-4ee6-83dc-eb78d75c076c/util/0.log" Dec 03 15:32:02 crc kubenswrapper[4751]: I1203 15:32:02.343605 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9_abecd931-d6b1-4ee6-83dc-eb78d75c076c/util/0.log" Dec 03 15:32:02 crc kubenswrapper[4751]: I1203 15:32:02.405560 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9_abecd931-d6b1-4ee6-83dc-eb78d75c076c/pull/0.log" Dec 03 15:32:02 crc kubenswrapper[4751]: I1203 15:32:02.439813 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9_abecd931-d6b1-4ee6-83dc-eb78d75c076c/pull/0.log" Dec 03 
15:32:02 crc kubenswrapper[4751]: I1203 15:32:02.615749 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9_abecd931-d6b1-4ee6-83dc-eb78d75c076c/util/0.log" Dec 03 15:32:02 crc kubenswrapper[4751]: I1203 15:32:02.655347 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p6v9f_78e1107f-c2a3-4dd7-b6f9-af9729fea0a3/extract-utilities/0.log" Dec 03 15:32:02 crc kubenswrapper[4751]: I1203 15:32:02.665074 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9_abecd931-d6b1-4ee6-83dc-eb78d75c076c/pull/0.log" Dec 03 15:32:02 crc kubenswrapper[4751]: I1203 15:32:02.670090 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83m55n9_abecd931-d6b1-4ee6-83dc-eb78d75c076c/extract/0.log" Dec 03 15:32:02 crc kubenswrapper[4751]: I1203 15:32:02.848092 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p6v9f_78e1107f-c2a3-4dd7-b6f9-af9729fea0a3/extract-utilities/0.log" Dec 03 15:32:02 crc kubenswrapper[4751]: I1203 15:32:02.869830 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p6v9f_78e1107f-c2a3-4dd7-b6f9-af9729fea0a3/extract-content/0.log" Dec 03 15:32:02 crc kubenswrapper[4751]: I1203 15:32:02.919309 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p6v9f_78e1107f-c2a3-4dd7-b6f9-af9729fea0a3/extract-content/0.log" Dec 03 15:32:03 crc kubenswrapper[4751]: I1203 15:32:03.379568 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p6v9f_78e1107f-c2a3-4dd7-b6f9-af9729fea0a3/extract-content/0.log" Dec 03 15:32:03 crc 
kubenswrapper[4751]: I1203 15:32:03.432449 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p6v9f_78e1107f-c2a3-4dd7-b6f9-af9729fea0a3/extract-utilities/0.log" Dec 03 15:32:03 crc kubenswrapper[4751]: I1203 15:32:03.466194 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dn4vn_4c16a038-8221-4d8e-b455-e02c4be1c751/extract-utilities/0.log" Dec 03 15:32:03 crc kubenswrapper[4751]: I1203 15:32:03.677488 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dn4vn_4c16a038-8221-4d8e-b455-e02c4be1c751/extract-utilities/0.log" Dec 03 15:32:03 crc kubenswrapper[4751]: I1203 15:32:03.717946 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dn4vn_4c16a038-8221-4d8e-b455-e02c4be1c751/extract-content/0.log" Dec 03 15:32:03 crc kubenswrapper[4751]: I1203 15:32:03.746193 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dn4vn_4c16a038-8221-4d8e-b455-e02c4be1c751/extract-content/0.log" Dec 03 15:32:03 crc kubenswrapper[4751]: I1203 15:32:03.949240 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p6v9f_78e1107f-c2a3-4dd7-b6f9-af9729fea0a3/registry-server/0.log" Dec 03 15:32:03 crc kubenswrapper[4751]: I1203 15:32:03.967075 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dn4vn_4c16a038-8221-4d8e-b455-e02c4be1c751/extract-content/0.log" Dec 03 15:32:03 crc kubenswrapper[4751]: I1203 15:32:03.998567 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dn4vn_4c16a038-8221-4d8e-b455-e02c4be1c751/extract-utilities/0.log" Dec 03 15:32:04 crc kubenswrapper[4751]: I1203 15:32:04.284087 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-sq9k5_224b9e4a-5a71-4559-84b6-9599c2dfd321/marketplace-operator/0.log" Dec 03 15:32:04 crc kubenswrapper[4751]: I1203 15:32:04.330934 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-z8ldt_d9f96a5f-adfc-467c-91e7-631517b599a2/extract-utilities/0.log" Dec 03 15:32:04 crc kubenswrapper[4751]: I1203 15:32:04.634571 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-z8ldt_d9f96a5f-adfc-467c-91e7-631517b599a2/extract-content/0.log" Dec 03 15:32:04 crc kubenswrapper[4751]: I1203 15:32:04.650646 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-z8ldt_d9f96a5f-adfc-467c-91e7-631517b599a2/extract-utilities/0.log" Dec 03 15:32:04 crc kubenswrapper[4751]: I1203 15:32:04.657469 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-z8ldt_d9f96a5f-adfc-467c-91e7-631517b599a2/extract-content/0.log" Dec 03 15:32:04 crc kubenswrapper[4751]: I1203 15:32:04.802718 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dn4vn_4c16a038-8221-4d8e-b455-e02c4be1c751/registry-server/0.log" Dec 03 15:32:04 crc kubenswrapper[4751]: I1203 15:32:04.938539 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-z8ldt_d9f96a5f-adfc-467c-91e7-631517b599a2/extract-content/0.log" Dec 03 15:32:04 crc kubenswrapper[4751]: I1203 15:32:04.979456 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-z8ldt_d9f96a5f-adfc-467c-91e7-631517b599a2/extract-utilities/0.log" Dec 03 15:32:05 crc kubenswrapper[4751]: I1203 15:32:05.046367 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-bjcgn_fd5bafe9-858d-4112-ae58-8ad005161e3d/extract-utilities/0.log" Dec 03 15:32:05 crc kubenswrapper[4751]: I1203 15:32:05.141212 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-z8ldt_d9f96a5f-adfc-467c-91e7-631517b599a2/registry-server/0.log" Dec 03 15:32:05 crc kubenswrapper[4751]: I1203 15:32:05.237617 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bjcgn_fd5bafe9-858d-4112-ae58-8ad005161e3d/extract-utilities/0.log" Dec 03 15:32:05 crc kubenswrapper[4751]: I1203 15:32:05.267880 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bjcgn_fd5bafe9-858d-4112-ae58-8ad005161e3d/extract-content/0.log" Dec 03 15:32:05 crc kubenswrapper[4751]: I1203 15:32:05.278309 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bjcgn_fd5bafe9-858d-4112-ae58-8ad005161e3d/extract-content/0.log" Dec 03 15:32:05 crc kubenswrapper[4751]: I1203 15:32:05.486833 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bjcgn_fd5bafe9-858d-4112-ae58-8ad005161e3d/extract-utilities/0.log" Dec 03 15:32:05 crc kubenswrapper[4751]: I1203 15:32:05.505984 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bjcgn_fd5bafe9-858d-4112-ae58-8ad005161e3d/extract-content/0.log" Dec 03 15:32:06 crc kubenswrapper[4751]: I1203 15:32:06.141654 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bjcgn_fd5bafe9-858d-4112-ae58-8ad005161e3d/registry-server/0.log" Dec 03 15:32:07 crc kubenswrapper[4751]: I1203 15:32:07.314742 4751 scope.go:117] "RemoveContainer" containerID="1a7d3374c595269f4a9d24e338e5e9eaf99cda12b8a6b0ee51fd39d900f10642" Dec 03 15:32:07 crc kubenswrapper[4751]: E1203 15:32:07.315088 
4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:32:18 crc kubenswrapper[4751]: I1203 15:32:18.269888 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-96kch_2cc6a6aa-cb86-448e-8fae-8e1dc45c89ca/prometheus-operator/0.log" Dec 03 15:32:18 crc kubenswrapper[4751]: I1203 15:32:18.458837 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-57bb6f4767-9frfm_7e248019-bf73-4c6d-a551-6c62dcf6ec11/prometheus-operator-admission-webhook/0.log" Dec 03 15:32:18 crc kubenswrapper[4751]: I1203 15:32:18.533357 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-57bb6f4767-g2nlm_01521e70-1366-4e52-9f9a-885522387a0e/prometheus-operator-admission-webhook/0.log" Dec 03 15:32:18 crc kubenswrapper[4751]: I1203 15:32:18.733099 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-hq59z_bb80946e-134f-4baa-b150-6004a9313de9/perses-operator/0.log" Dec 03 15:32:18 crc kubenswrapper[4751]: I1203 15:32:18.743402 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-rgl9w_216e104c-e7e3-4be4-972c-cd524973eaa6/operator/0.log" Dec 03 15:32:19 crc kubenswrapper[4751]: I1203 15:32:19.314853 4751 scope.go:117] "RemoveContainer" containerID="1a7d3374c595269f4a9d24e338e5e9eaf99cda12b8a6b0ee51fd39d900f10642" Dec 03 15:32:19 crc kubenswrapper[4751]: E1203 15:32:19.315308 4751 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:32:32 crc kubenswrapper[4751]: I1203 15:32:32.315787 4751 scope.go:117] "RemoveContainer" containerID="1a7d3374c595269f4a9d24e338e5e9eaf99cda12b8a6b0ee51fd39d900f10642" Dec 03 15:32:32 crc kubenswrapper[4751]: E1203 15:32:32.316975 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:32:32 crc kubenswrapper[4751]: I1203 15:32:32.829671 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-766794d8b8-zzghr_b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238/manager/0.log" Dec 03 15:32:32 crc kubenswrapper[4751]: I1203 15:32:32.920854 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-766794d8b8-zzghr_b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238/kube-rbac-proxy/0.log" Dec 03 15:32:32 crc kubenswrapper[4751]: I1203 15:32:32.941553 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-766794d8b8-zzghr_b44b61b1-5d61-4f3c-b4bc-a1a1d9c44238/manager/1.log" Dec 03 15:32:47 crc kubenswrapper[4751]: I1203 15:32:47.314196 4751 scope.go:117] "RemoveContainer" 
containerID="1a7d3374c595269f4a9d24e338e5e9eaf99cda12b8a6b0ee51fd39d900f10642" Dec 03 15:32:47 crc kubenswrapper[4751]: E1203 15:32:47.314827 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:33:00 crc kubenswrapper[4751]: I1203 15:33:00.314107 4751 scope.go:117] "RemoveContainer" containerID="1a7d3374c595269f4a9d24e338e5e9eaf99cda12b8a6b0ee51fd39d900f10642" Dec 03 15:33:00 crc kubenswrapper[4751]: E1203 15:33:00.315072 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:33:14 crc kubenswrapper[4751]: I1203 15:33:14.314102 4751 scope.go:117] "RemoveContainer" containerID="1a7d3374c595269f4a9d24e338e5e9eaf99cda12b8a6b0ee51fd39d900f10642" Dec 03 15:33:14 crc kubenswrapper[4751]: E1203 15:33:14.314966 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:33:28 crc kubenswrapper[4751]: I1203 15:33:28.314247 4751 scope.go:117] 
"RemoveContainer" containerID="1a7d3374c595269f4a9d24e338e5e9eaf99cda12b8a6b0ee51fd39d900f10642" Dec 03 15:33:28 crc kubenswrapper[4751]: E1203 15:33:28.315143 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:33:43 crc kubenswrapper[4751]: I1203 15:33:43.322110 4751 scope.go:117] "RemoveContainer" containerID="1a7d3374c595269f4a9d24e338e5e9eaf99cda12b8a6b0ee51fd39d900f10642" Dec 03 15:33:43 crc kubenswrapper[4751]: E1203 15:33:43.322964 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:33:57 crc kubenswrapper[4751]: I1203 15:33:57.314064 4751 scope.go:117] "RemoveContainer" containerID="1a7d3374c595269f4a9d24e338e5e9eaf99cda12b8a6b0ee51fd39d900f10642" Dec 03 15:33:57 crc kubenswrapper[4751]: E1203 15:33:57.314922 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:34:11 crc kubenswrapper[4751]: I1203 15:34:11.314063 
4751 scope.go:117] "RemoveContainer" containerID="1a7d3374c595269f4a9d24e338e5e9eaf99cda12b8a6b0ee51fd39d900f10642" Dec 03 15:34:11 crc kubenswrapper[4751]: E1203 15:34:11.314909 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:34:21 crc kubenswrapper[4751]: I1203 15:34:21.407824 4751 generic.go:334] "Generic (PLEG): container finished" podID="70603fe9-149b-4400-b573-52e0a3ecc142" containerID="881fe3a7497afbccc42395af2e03f4a616c0edb0e6699e38daebbc7ff55471e6" exitCode=0 Dec 03 15:34:21 crc kubenswrapper[4751]: I1203 15:34:21.407997 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jhrqk/must-gather-twrb7" event={"ID":"70603fe9-149b-4400-b573-52e0a3ecc142","Type":"ContainerDied","Data":"881fe3a7497afbccc42395af2e03f4a616c0edb0e6699e38daebbc7ff55471e6"} Dec 03 15:34:21 crc kubenswrapper[4751]: I1203 15:34:21.408948 4751 scope.go:117] "RemoveContainer" containerID="881fe3a7497afbccc42395af2e03f4a616c0edb0e6699e38daebbc7ff55471e6" Dec 03 15:34:22 crc kubenswrapper[4751]: I1203 15:34:22.476011 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jhrqk_must-gather-twrb7_70603fe9-149b-4400-b573-52e0a3ecc142/gather/0.log" Dec 03 15:34:25 crc kubenswrapper[4751]: I1203 15:34:25.313906 4751 scope.go:117] "RemoveContainer" containerID="1a7d3374c595269f4a9d24e338e5e9eaf99cda12b8a6b0ee51fd39d900f10642" Dec 03 15:34:25 crc kubenswrapper[4751]: E1203 15:34:25.314440 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-djf67_openshift-machine-config-operator(385620eb-744d-423e-b02b-1274f3075689)\"" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" Dec 03 15:34:30 crc kubenswrapper[4751]: I1203 15:34:30.153311 4751 scope.go:117] "RemoveContainer" containerID="54b6194ac866a8d6a0f7810ce58fd1e1d781ee6bca81fe855009dae8acf16820" Dec 03 15:34:33 crc kubenswrapper[4751]: I1203 15:34:33.838983 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jhrqk/must-gather-twrb7"] Dec 03 15:34:33 crc kubenswrapper[4751]: I1203 15:34:33.839810 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-jhrqk/must-gather-twrb7" podUID="70603fe9-149b-4400-b573-52e0a3ecc142" containerName="copy" containerID="cri-o://12fb22ec0f386f7bf04cc4b943b3c082780e8a4acf223eb61261c9a61c1081c0" gracePeriod=2 Dec 03 15:34:33 crc kubenswrapper[4751]: I1203 15:34:33.850753 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jhrqk/must-gather-twrb7"] Dec 03 15:34:34 crc kubenswrapper[4751]: I1203 15:34:34.450068 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jhrqk_must-gather-twrb7_70603fe9-149b-4400-b573-52e0a3ecc142/copy/0.log" Dec 03 15:34:34 crc kubenswrapper[4751]: I1203 15:34:34.451824 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jhrqk/must-gather-twrb7" Dec 03 15:34:34 crc kubenswrapper[4751]: I1203 15:34:34.557312 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jhrqk_must-gather-twrb7_70603fe9-149b-4400-b573-52e0a3ecc142/copy/0.log" Dec 03 15:34:34 crc kubenswrapper[4751]: I1203 15:34:34.558077 4751 generic.go:334] "Generic (PLEG): container finished" podID="70603fe9-149b-4400-b573-52e0a3ecc142" containerID="12fb22ec0f386f7bf04cc4b943b3c082780e8a4acf223eb61261c9a61c1081c0" exitCode=143 Dec 03 15:34:34 crc kubenswrapper[4751]: I1203 15:34:34.558124 4751 scope.go:117] "RemoveContainer" containerID="12fb22ec0f386f7bf04cc4b943b3c082780e8a4acf223eb61261c9a61c1081c0" Dec 03 15:34:34 crc kubenswrapper[4751]: I1203 15:34:34.558147 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jhrqk/must-gather-twrb7" Dec 03 15:34:34 crc kubenswrapper[4751]: I1203 15:34:34.580025 4751 scope.go:117] "RemoveContainer" containerID="881fe3a7497afbccc42395af2e03f4a616c0edb0e6699e38daebbc7ff55471e6" Dec 03 15:34:34 crc kubenswrapper[4751]: I1203 15:34:34.584396 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnx6j\" (UniqueName: \"kubernetes.io/projected/70603fe9-149b-4400-b573-52e0a3ecc142-kube-api-access-lnx6j\") pod \"70603fe9-149b-4400-b573-52e0a3ecc142\" (UID: \"70603fe9-149b-4400-b573-52e0a3ecc142\") " Dec 03 15:34:34 crc kubenswrapper[4751]: I1203 15:34:34.584831 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/70603fe9-149b-4400-b573-52e0a3ecc142-must-gather-output\") pod \"70603fe9-149b-4400-b573-52e0a3ecc142\" (UID: \"70603fe9-149b-4400-b573-52e0a3ecc142\") " Dec 03 15:34:34 crc kubenswrapper[4751]: I1203 15:34:34.604363 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/70603fe9-149b-4400-b573-52e0a3ecc142-kube-api-access-lnx6j" (OuterVolumeSpecName: "kube-api-access-lnx6j") pod "70603fe9-149b-4400-b573-52e0a3ecc142" (UID: "70603fe9-149b-4400-b573-52e0a3ecc142"). InnerVolumeSpecName "kube-api-access-lnx6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:34:34 crc kubenswrapper[4751]: I1203 15:34:34.629053 4751 scope.go:117] "RemoveContainer" containerID="12fb22ec0f386f7bf04cc4b943b3c082780e8a4acf223eb61261c9a61c1081c0" Dec 03 15:34:34 crc kubenswrapper[4751]: E1203 15:34:34.629711 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12fb22ec0f386f7bf04cc4b943b3c082780e8a4acf223eb61261c9a61c1081c0\": container with ID starting with 12fb22ec0f386f7bf04cc4b943b3c082780e8a4acf223eb61261c9a61c1081c0 not found: ID does not exist" containerID="12fb22ec0f386f7bf04cc4b943b3c082780e8a4acf223eb61261c9a61c1081c0" Dec 03 15:34:34 crc kubenswrapper[4751]: I1203 15:34:34.629752 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12fb22ec0f386f7bf04cc4b943b3c082780e8a4acf223eb61261c9a61c1081c0"} err="failed to get container status \"12fb22ec0f386f7bf04cc4b943b3c082780e8a4acf223eb61261c9a61c1081c0\": rpc error: code = NotFound desc = could not find container \"12fb22ec0f386f7bf04cc4b943b3c082780e8a4acf223eb61261c9a61c1081c0\": container with ID starting with 12fb22ec0f386f7bf04cc4b943b3c082780e8a4acf223eb61261c9a61c1081c0 not found: ID does not exist" Dec 03 15:34:34 crc kubenswrapper[4751]: I1203 15:34:34.629779 4751 scope.go:117] "RemoveContainer" containerID="881fe3a7497afbccc42395af2e03f4a616c0edb0e6699e38daebbc7ff55471e6" Dec 03 15:34:34 crc kubenswrapper[4751]: E1203 15:34:34.630174 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"881fe3a7497afbccc42395af2e03f4a616c0edb0e6699e38daebbc7ff55471e6\": 
container with ID starting with 881fe3a7497afbccc42395af2e03f4a616c0edb0e6699e38daebbc7ff55471e6 not found: ID does not exist" containerID="881fe3a7497afbccc42395af2e03f4a616c0edb0e6699e38daebbc7ff55471e6" Dec 03 15:34:34 crc kubenswrapper[4751]: I1203 15:34:34.630196 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"881fe3a7497afbccc42395af2e03f4a616c0edb0e6699e38daebbc7ff55471e6"} err="failed to get container status \"881fe3a7497afbccc42395af2e03f4a616c0edb0e6699e38daebbc7ff55471e6\": rpc error: code = NotFound desc = could not find container \"881fe3a7497afbccc42395af2e03f4a616c0edb0e6699e38daebbc7ff55471e6\": container with ID starting with 881fe3a7497afbccc42395af2e03f4a616c0edb0e6699e38daebbc7ff55471e6 not found: ID does not exist" Dec 03 15:34:34 crc kubenswrapper[4751]: I1203 15:34:34.690773 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnx6j\" (UniqueName: \"kubernetes.io/projected/70603fe9-149b-4400-b573-52e0a3ecc142-kube-api-access-lnx6j\") on node \"crc\" DevicePath \"\"" Dec 03 15:34:34 crc kubenswrapper[4751]: I1203 15:34:34.833107 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70603fe9-149b-4400-b573-52e0a3ecc142-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "70603fe9-149b-4400-b573-52e0a3ecc142" (UID: "70603fe9-149b-4400-b573-52e0a3ecc142"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:34:34 crc kubenswrapper[4751]: I1203 15:34:34.904544 4751 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/70603fe9-149b-4400-b573-52e0a3ecc142-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 03 15:34:35 crc kubenswrapper[4751]: I1203 15:34:35.326130 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70603fe9-149b-4400-b573-52e0a3ecc142" path="/var/lib/kubelet/pods/70603fe9-149b-4400-b573-52e0a3ecc142/volumes" Dec 03 15:34:40 crc kubenswrapper[4751]: I1203 15:34:40.315080 4751 scope.go:117] "RemoveContainer" containerID="1a7d3374c595269f4a9d24e338e5e9eaf99cda12b8a6b0ee51fd39d900f10642" Dec 03 15:34:41 crc kubenswrapper[4751]: I1203 15:34:41.633852 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerStarted","Data":"6ef512e660f4bd10cf8d26ee763e1f9a04f227b42d11a27077b7d84646ae0766"} Dec 03 15:35:30 crc kubenswrapper[4751]: I1203 15:35:30.311288 4751 scope.go:117] "RemoveContainer" containerID="09dce35f2e2b5c453fe157d4107c5b2e2715c79bb3837fe974e833288a83f6ee" Dec 03 15:37:00 crc kubenswrapper[4751]: I1203 15:37:00.960923 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-g8lp9"] Dec 03 15:37:00 crc kubenswrapper[4751]: E1203 15:37:00.961910 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70603fe9-149b-4400-b573-52e0a3ecc142" containerName="gather" Dec 03 15:37:00 crc kubenswrapper[4751]: I1203 15:37:00.961926 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="70603fe9-149b-4400-b573-52e0a3ecc142" containerName="gather" Dec 03 15:37:00 crc kubenswrapper[4751]: E1203 15:37:00.961948 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f550ad55-18d0-438c-8519-27ae87d3bf4c" 
containerName="extract-utilities" Dec 03 15:37:00 crc kubenswrapper[4751]: I1203 15:37:00.961957 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f550ad55-18d0-438c-8519-27ae87d3bf4c" containerName="extract-utilities" Dec 03 15:37:00 crc kubenswrapper[4751]: E1203 15:37:00.961970 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba94b979-b398-4fad-92ec-0d073d32f036" containerName="registry-server" Dec 03 15:37:00 crc kubenswrapper[4751]: I1203 15:37:00.961977 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba94b979-b398-4fad-92ec-0d073d32f036" containerName="registry-server" Dec 03 15:37:00 crc kubenswrapper[4751]: E1203 15:37:00.961991 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70603fe9-149b-4400-b573-52e0a3ecc142" containerName="copy" Dec 03 15:37:00 crc kubenswrapper[4751]: I1203 15:37:00.962008 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="70603fe9-149b-4400-b573-52e0a3ecc142" containerName="copy" Dec 03 15:37:00 crc kubenswrapper[4751]: E1203 15:37:00.962021 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba94b979-b398-4fad-92ec-0d073d32f036" containerName="extract-content" Dec 03 15:37:00 crc kubenswrapper[4751]: I1203 15:37:00.962027 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba94b979-b398-4fad-92ec-0d073d32f036" containerName="extract-content" Dec 03 15:37:00 crc kubenswrapper[4751]: E1203 15:37:00.962052 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f550ad55-18d0-438c-8519-27ae87d3bf4c" containerName="extract-content" Dec 03 15:37:00 crc kubenswrapper[4751]: I1203 15:37:00.962060 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f550ad55-18d0-438c-8519-27ae87d3bf4c" containerName="extract-content" Dec 03 15:37:00 crc kubenswrapper[4751]: E1203 15:37:00.962079 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba94b979-b398-4fad-92ec-0d073d32f036" 
containerName="extract-utilities" Dec 03 15:37:00 crc kubenswrapper[4751]: I1203 15:37:00.962088 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba94b979-b398-4fad-92ec-0d073d32f036" containerName="extract-utilities" Dec 03 15:37:00 crc kubenswrapper[4751]: E1203 15:37:00.962104 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f550ad55-18d0-438c-8519-27ae87d3bf4c" containerName="registry-server" Dec 03 15:37:00 crc kubenswrapper[4751]: I1203 15:37:00.962111 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f550ad55-18d0-438c-8519-27ae87d3bf4c" containerName="registry-server" Dec 03 15:37:00 crc kubenswrapper[4751]: I1203 15:37:00.962358 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="70603fe9-149b-4400-b573-52e0a3ecc142" containerName="gather" Dec 03 15:37:00 crc kubenswrapper[4751]: I1203 15:37:00.962386 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba94b979-b398-4fad-92ec-0d073d32f036" containerName="registry-server" Dec 03 15:37:00 crc kubenswrapper[4751]: I1203 15:37:00.962406 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="70603fe9-149b-4400-b573-52e0a3ecc142" containerName="copy" Dec 03 15:37:00 crc kubenswrapper[4751]: I1203 15:37:00.962424 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="f550ad55-18d0-438c-8519-27ae87d3bf4c" containerName="registry-server" Dec 03 15:37:00 crc kubenswrapper[4751]: I1203 15:37:00.964139 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g8lp9" Dec 03 15:37:00 crc kubenswrapper[4751]: I1203 15:37:00.977268 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g8lp9"] Dec 03 15:37:01 crc kubenswrapper[4751]: I1203 15:37:01.100146 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27d2238a-a2c1-45f6-aa17-c72b7a46ee89-utilities\") pod \"redhat-marketplace-g8lp9\" (UID: \"27d2238a-a2c1-45f6-aa17-c72b7a46ee89\") " pod="openshift-marketplace/redhat-marketplace-g8lp9" Dec 03 15:37:01 crc kubenswrapper[4751]: I1203 15:37:01.100230 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-698rz\" (UniqueName: \"kubernetes.io/projected/27d2238a-a2c1-45f6-aa17-c72b7a46ee89-kube-api-access-698rz\") pod \"redhat-marketplace-g8lp9\" (UID: \"27d2238a-a2c1-45f6-aa17-c72b7a46ee89\") " pod="openshift-marketplace/redhat-marketplace-g8lp9" Dec 03 15:37:01 crc kubenswrapper[4751]: I1203 15:37:01.100300 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27d2238a-a2c1-45f6-aa17-c72b7a46ee89-catalog-content\") pod \"redhat-marketplace-g8lp9\" (UID: \"27d2238a-a2c1-45f6-aa17-c72b7a46ee89\") " pod="openshift-marketplace/redhat-marketplace-g8lp9" Dec 03 15:37:01 crc kubenswrapper[4751]: I1203 15:37:01.202607 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-698rz\" (UniqueName: \"kubernetes.io/projected/27d2238a-a2c1-45f6-aa17-c72b7a46ee89-kube-api-access-698rz\") pod \"redhat-marketplace-g8lp9\" (UID: \"27d2238a-a2c1-45f6-aa17-c72b7a46ee89\") " pod="openshift-marketplace/redhat-marketplace-g8lp9" Dec 03 15:37:01 crc kubenswrapper[4751]: I1203 15:37:01.202991 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27d2238a-a2c1-45f6-aa17-c72b7a46ee89-catalog-content\") pod \"redhat-marketplace-g8lp9\" (UID: \"27d2238a-a2c1-45f6-aa17-c72b7a46ee89\") " pod="openshift-marketplace/redhat-marketplace-g8lp9" Dec 03 15:37:01 crc kubenswrapper[4751]: I1203 15:37:01.203122 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27d2238a-a2c1-45f6-aa17-c72b7a46ee89-utilities\") pod \"redhat-marketplace-g8lp9\" (UID: \"27d2238a-a2c1-45f6-aa17-c72b7a46ee89\") " pod="openshift-marketplace/redhat-marketplace-g8lp9" Dec 03 15:37:01 crc kubenswrapper[4751]: I1203 15:37:01.203590 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27d2238a-a2c1-45f6-aa17-c72b7a46ee89-catalog-content\") pod \"redhat-marketplace-g8lp9\" (UID: \"27d2238a-a2c1-45f6-aa17-c72b7a46ee89\") " pod="openshift-marketplace/redhat-marketplace-g8lp9" Dec 03 15:37:01 crc kubenswrapper[4751]: I1203 15:37:01.203637 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27d2238a-a2c1-45f6-aa17-c72b7a46ee89-utilities\") pod \"redhat-marketplace-g8lp9\" (UID: \"27d2238a-a2c1-45f6-aa17-c72b7a46ee89\") " pod="openshift-marketplace/redhat-marketplace-g8lp9" Dec 03 15:37:01 crc kubenswrapper[4751]: I1203 15:37:01.224921 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-698rz\" (UniqueName: \"kubernetes.io/projected/27d2238a-a2c1-45f6-aa17-c72b7a46ee89-kube-api-access-698rz\") pod \"redhat-marketplace-g8lp9\" (UID: \"27d2238a-a2c1-45f6-aa17-c72b7a46ee89\") " pod="openshift-marketplace/redhat-marketplace-g8lp9" Dec 03 15:37:01 crc kubenswrapper[4751]: I1203 15:37:01.285758 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g8lp9" Dec 03 15:37:01 crc kubenswrapper[4751]: I1203 15:37:01.828780 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g8lp9"] Dec 03 15:37:01 crc kubenswrapper[4751]: W1203 15:37:01.830676 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27d2238a_a2c1_45f6_aa17_c72b7a46ee89.slice/crio-4e63b0b9b7a4bdfb6ddfe30257d48edaf3128893c55a9093913eb6f3bd5d661c WatchSource:0}: Error finding container 4e63b0b9b7a4bdfb6ddfe30257d48edaf3128893c55a9093913eb6f3bd5d661c: Status 404 returned error can't find the container with id 4e63b0b9b7a4bdfb6ddfe30257d48edaf3128893c55a9093913eb6f3bd5d661c Dec 03 15:37:02 crc kubenswrapper[4751]: I1203 15:37:02.090530 4751 generic.go:334] "Generic (PLEG): container finished" podID="27d2238a-a2c1-45f6-aa17-c72b7a46ee89" containerID="730d40ba3c730804968a850b758d0cf090d85bca421bb93fa3eae2a3203f6d32" exitCode=0 Dec 03 15:37:02 crc kubenswrapper[4751]: I1203 15:37:02.090649 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g8lp9" event={"ID":"27d2238a-a2c1-45f6-aa17-c72b7a46ee89","Type":"ContainerDied","Data":"730d40ba3c730804968a850b758d0cf090d85bca421bb93fa3eae2a3203f6d32"} Dec 03 15:37:02 crc kubenswrapper[4751]: I1203 15:37:02.091095 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g8lp9" event={"ID":"27d2238a-a2c1-45f6-aa17-c72b7a46ee89","Type":"ContainerStarted","Data":"4e63b0b9b7a4bdfb6ddfe30257d48edaf3128893c55a9093913eb6f3bd5d661c"} Dec 03 15:37:02 crc kubenswrapper[4751]: I1203 15:37:02.093048 4751 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 15:37:03 crc kubenswrapper[4751]: I1203 15:37:03.101306 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-g8lp9" event={"ID":"27d2238a-a2c1-45f6-aa17-c72b7a46ee89","Type":"ContainerStarted","Data":"9167753fc08ce17c30a65cf5ea101a17f466bca7dcb4a740b8381f86e5c87803"} Dec 03 15:37:04 crc kubenswrapper[4751]: I1203 15:37:04.113391 4751 generic.go:334] "Generic (PLEG): container finished" podID="27d2238a-a2c1-45f6-aa17-c72b7a46ee89" containerID="9167753fc08ce17c30a65cf5ea101a17f466bca7dcb4a740b8381f86e5c87803" exitCode=0 Dec 03 15:37:04 crc kubenswrapper[4751]: I1203 15:37:04.113702 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g8lp9" event={"ID":"27d2238a-a2c1-45f6-aa17-c72b7a46ee89","Type":"ContainerDied","Data":"9167753fc08ce17c30a65cf5ea101a17f466bca7dcb4a740b8381f86e5c87803"} Dec 03 15:37:05 crc kubenswrapper[4751]: I1203 15:37:05.129579 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g8lp9" event={"ID":"27d2238a-a2c1-45f6-aa17-c72b7a46ee89","Type":"ContainerStarted","Data":"4a6f3437b21b2ea966efa489675e8472d23993f9ecfe2549e9e016a2f4ae3123"} Dec 03 15:37:05 crc kubenswrapper[4751]: I1203 15:37:05.158472 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-g8lp9" podStartSLOduration=2.680635144 podStartE2EDuration="5.158436766s" podCreationTimestamp="2025-12-03 15:37:00 +0000 UTC" firstStartedPulling="2025-12-03 15:37:02.09276068 +0000 UTC m=+5029.081115897" lastFinishedPulling="2025-12-03 15:37:04.570562302 +0000 UTC m=+5031.558917519" observedRunningTime="2025-12-03 15:37:05.145209961 +0000 UTC m=+5032.133565188" watchObservedRunningTime="2025-12-03 15:37:05.158436766 +0000 UTC m=+5032.146792003" Dec 03 15:37:05 crc kubenswrapper[4751]: I1203 15:37:05.820025 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 15:37:05 crc kubenswrapper[4751]: I1203 15:37:05.820383 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 15:37:11 crc kubenswrapper[4751]: I1203 15:37:11.285888 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-g8lp9" Dec 03 15:37:11 crc kubenswrapper[4751]: I1203 15:37:11.286215 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-g8lp9" Dec 03 15:37:11 crc kubenswrapper[4751]: I1203 15:37:11.334157 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-g8lp9" Dec 03 15:37:12 crc kubenswrapper[4751]: I1203 15:37:12.247952 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-g8lp9" Dec 03 15:37:12 crc kubenswrapper[4751]: I1203 15:37:12.305739 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g8lp9"] Dec 03 15:37:14 crc kubenswrapper[4751]: I1203 15:37:14.214847 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-g8lp9" podUID="27d2238a-a2c1-45f6-aa17-c72b7a46ee89" containerName="registry-server" containerID="cri-o://4a6f3437b21b2ea966efa489675e8472d23993f9ecfe2549e9e016a2f4ae3123" gracePeriod=2 Dec 03 15:37:14 crc kubenswrapper[4751]: I1203 15:37:14.775608 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g8lp9" Dec 03 15:37:14 crc kubenswrapper[4751]: I1203 15:37:14.894595 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27d2238a-a2c1-45f6-aa17-c72b7a46ee89-catalog-content\") pod \"27d2238a-a2c1-45f6-aa17-c72b7a46ee89\" (UID: \"27d2238a-a2c1-45f6-aa17-c72b7a46ee89\") " Dec 03 15:37:14 crc kubenswrapper[4751]: I1203 15:37:14.895269 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27d2238a-a2c1-45f6-aa17-c72b7a46ee89-utilities\") pod \"27d2238a-a2c1-45f6-aa17-c72b7a46ee89\" (UID: \"27d2238a-a2c1-45f6-aa17-c72b7a46ee89\") " Dec 03 15:37:14 crc kubenswrapper[4751]: I1203 15:37:14.895654 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-698rz\" (UniqueName: \"kubernetes.io/projected/27d2238a-a2c1-45f6-aa17-c72b7a46ee89-kube-api-access-698rz\") pod \"27d2238a-a2c1-45f6-aa17-c72b7a46ee89\" (UID: \"27d2238a-a2c1-45f6-aa17-c72b7a46ee89\") " Dec 03 15:37:14 crc kubenswrapper[4751]: I1203 15:37:14.896174 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27d2238a-a2c1-45f6-aa17-c72b7a46ee89-utilities" (OuterVolumeSpecName: "utilities") pod "27d2238a-a2c1-45f6-aa17-c72b7a46ee89" (UID: "27d2238a-a2c1-45f6-aa17-c72b7a46ee89"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:37:14 crc kubenswrapper[4751]: I1203 15:37:14.897102 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27d2238a-a2c1-45f6-aa17-c72b7a46ee89-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 15:37:14 crc kubenswrapper[4751]: I1203 15:37:14.901891 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27d2238a-a2c1-45f6-aa17-c72b7a46ee89-kube-api-access-698rz" (OuterVolumeSpecName: "kube-api-access-698rz") pod "27d2238a-a2c1-45f6-aa17-c72b7a46ee89" (UID: "27d2238a-a2c1-45f6-aa17-c72b7a46ee89"). InnerVolumeSpecName "kube-api-access-698rz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:37:14 crc kubenswrapper[4751]: I1203 15:37:14.920015 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27d2238a-a2c1-45f6-aa17-c72b7a46ee89-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "27d2238a-a2c1-45f6-aa17-c72b7a46ee89" (UID: "27d2238a-a2c1-45f6-aa17-c72b7a46ee89"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 15:37:14 crc kubenswrapper[4751]: I1203 15:37:14.998969 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27d2238a-a2c1-45f6-aa17-c72b7a46ee89-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 15:37:14 crc kubenswrapper[4751]: I1203 15:37:14.999003 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-698rz\" (UniqueName: \"kubernetes.io/projected/27d2238a-a2c1-45f6-aa17-c72b7a46ee89-kube-api-access-698rz\") on node \"crc\" DevicePath \"\""
Dec 03 15:37:15 crc kubenswrapper[4751]: I1203 15:37:15.225749 4751 generic.go:334] "Generic (PLEG): container finished" podID="27d2238a-a2c1-45f6-aa17-c72b7a46ee89" containerID="4a6f3437b21b2ea966efa489675e8472d23993f9ecfe2549e9e016a2f4ae3123" exitCode=0
Dec 03 15:37:15 crc kubenswrapper[4751]: I1203 15:37:15.225798 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g8lp9" event={"ID":"27d2238a-a2c1-45f6-aa17-c72b7a46ee89","Type":"ContainerDied","Data":"4a6f3437b21b2ea966efa489675e8472d23993f9ecfe2549e9e016a2f4ae3123"}
Dec 03 15:37:15 crc kubenswrapper[4751]: I1203 15:37:15.225828 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g8lp9" event={"ID":"27d2238a-a2c1-45f6-aa17-c72b7a46ee89","Type":"ContainerDied","Data":"4e63b0b9b7a4bdfb6ddfe30257d48edaf3128893c55a9093913eb6f3bd5d661c"}
Dec 03 15:37:15 crc kubenswrapper[4751]: I1203 15:37:15.225848 4751 scope.go:117] "RemoveContainer" containerID="4a6f3437b21b2ea966efa489675e8472d23993f9ecfe2549e9e016a2f4ae3123"
Dec 03 15:37:15 crc kubenswrapper[4751]: I1203 15:37:15.226000 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g8lp9"
Dec 03 15:37:15 crc kubenswrapper[4751]: I1203 15:37:15.257393 4751 scope.go:117] "RemoveContainer" containerID="9167753fc08ce17c30a65cf5ea101a17f466bca7dcb4a740b8381f86e5c87803"
Dec 03 15:37:15 crc kubenswrapper[4751]: I1203 15:37:15.262341 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g8lp9"]
Dec 03 15:37:15 crc kubenswrapper[4751]: I1203 15:37:15.271544 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-g8lp9"]
Dec 03 15:37:15 crc kubenswrapper[4751]: I1203 15:37:15.277457 4751 scope.go:117] "RemoveContainer" containerID="730d40ba3c730804968a850b758d0cf090d85bca421bb93fa3eae2a3203f6d32"
Dec 03 15:37:15 crc kubenswrapper[4751]: I1203 15:37:15.326412 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27d2238a-a2c1-45f6-aa17-c72b7a46ee89" path="/var/lib/kubelet/pods/27d2238a-a2c1-45f6-aa17-c72b7a46ee89/volumes"
Dec 03 15:37:15 crc kubenswrapper[4751]: I1203 15:37:15.330054 4751 scope.go:117] "RemoveContainer" containerID="4a6f3437b21b2ea966efa489675e8472d23993f9ecfe2549e9e016a2f4ae3123"
Dec 03 15:37:15 crc kubenswrapper[4751]: E1203 15:37:15.330819 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a6f3437b21b2ea966efa489675e8472d23993f9ecfe2549e9e016a2f4ae3123\": container with ID starting with 4a6f3437b21b2ea966efa489675e8472d23993f9ecfe2549e9e016a2f4ae3123 not found: ID does not exist" containerID="4a6f3437b21b2ea966efa489675e8472d23993f9ecfe2549e9e016a2f4ae3123"
Dec 03 15:37:15 crc kubenswrapper[4751]: I1203 15:37:15.330864 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a6f3437b21b2ea966efa489675e8472d23993f9ecfe2549e9e016a2f4ae3123"} err="failed to get container status \"4a6f3437b21b2ea966efa489675e8472d23993f9ecfe2549e9e016a2f4ae3123\": rpc error: code = NotFound desc = could not find container \"4a6f3437b21b2ea966efa489675e8472d23993f9ecfe2549e9e016a2f4ae3123\": container with ID starting with 4a6f3437b21b2ea966efa489675e8472d23993f9ecfe2549e9e016a2f4ae3123 not found: ID does not exist"
Dec 03 15:37:15 crc kubenswrapper[4751]: I1203 15:37:15.330897 4751 scope.go:117] "RemoveContainer" containerID="9167753fc08ce17c30a65cf5ea101a17f466bca7dcb4a740b8381f86e5c87803"
Dec 03 15:37:15 crc kubenswrapper[4751]: E1203 15:37:15.331441 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9167753fc08ce17c30a65cf5ea101a17f466bca7dcb4a740b8381f86e5c87803\": container with ID starting with 9167753fc08ce17c30a65cf5ea101a17f466bca7dcb4a740b8381f86e5c87803 not found: ID does not exist" containerID="9167753fc08ce17c30a65cf5ea101a17f466bca7dcb4a740b8381f86e5c87803"
Dec 03 15:37:15 crc kubenswrapper[4751]: I1203 15:37:15.331491 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9167753fc08ce17c30a65cf5ea101a17f466bca7dcb4a740b8381f86e5c87803"} err="failed to get container status \"9167753fc08ce17c30a65cf5ea101a17f466bca7dcb4a740b8381f86e5c87803\": rpc error: code = NotFound desc = could not find container \"9167753fc08ce17c30a65cf5ea101a17f466bca7dcb4a740b8381f86e5c87803\": container with ID starting with 9167753fc08ce17c30a65cf5ea101a17f466bca7dcb4a740b8381f86e5c87803 not found: ID does not exist"
Dec 03 15:37:15 crc kubenswrapper[4751]: I1203 15:37:15.331525 4751 scope.go:117] "RemoveContainer" containerID="730d40ba3c730804968a850b758d0cf090d85bca421bb93fa3eae2a3203f6d32"
Dec 03 15:37:15 crc kubenswrapper[4751]: E1203 15:37:15.331875 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"730d40ba3c730804968a850b758d0cf090d85bca421bb93fa3eae2a3203f6d32\": container with ID starting with 730d40ba3c730804968a850b758d0cf090d85bca421bb93fa3eae2a3203f6d32 not found: ID does not exist" containerID="730d40ba3c730804968a850b758d0cf090d85bca421bb93fa3eae2a3203f6d32"
Dec 03 15:37:15 crc kubenswrapper[4751]: I1203 15:37:15.331899 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"730d40ba3c730804968a850b758d0cf090d85bca421bb93fa3eae2a3203f6d32"} err="failed to get container status \"730d40ba3c730804968a850b758d0cf090d85bca421bb93fa3eae2a3203f6d32\": rpc error: code = NotFound desc = could not find container \"730d40ba3c730804968a850b758d0cf090d85bca421bb93fa3eae2a3203f6d32\": container with ID starting with 730d40ba3c730804968a850b758d0cf090d85bca421bb93fa3eae2a3203f6d32 not found: ID does not exist"
Dec 03 15:37:35 crc kubenswrapper[4751]: I1203 15:37:35.820662 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 15:37:35 crc kubenswrapper[4751]: I1203 15:37:35.822169 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 15:38:05 crc kubenswrapper[4751]: I1203 15:38:05.820136 4751 patch_prober.go:28] interesting pod/machine-config-daemon-djf67 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 15:38:05 crc kubenswrapper[4751]: I1203 15:38:05.820674 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 15:38:05 crc kubenswrapper[4751]: I1203 15:38:05.820719 4751 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-djf67"
Dec 03 15:38:05 crc kubenswrapper[4751]: I1203 15:38:05.821415 4751 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6ef512e660f4bd10cf8d26ee763e1f9a04f227b42d11a27077b7d84646ae0766"} pod="openshift-machine-config-operator/machine-config-daemon-djf67" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 03 15:38:05 crc kubenswrapper[4751]: I1203 15:38:05.821457 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-djf67" podUID="385620eb-744d-423e-b02b-1274f3075689" containerName="machine-config-daemon" containerID="cri-o://6ef512e660f4bd10cf8d26ee763e1f9a04f227b42d11a27077b7d84646ae0766" gracePeriod=600
Dec 03 15:38:06 crc kubenswrapper[4751]: I1203 15:38:06.778084 4751 generic.go:334] "Generic (PLEG): container finished" podID="385620eb-744d-423e-b02b-1274f3075689" containerID="6ef512e660f4bd10cf8d26ee763e1f9a04f227b42d11a27077b7d84646ae0766" exitCode=0
Dec 03 15:38:06 crc kubenswrapper[4751]: I1203 15:38:06.778252 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerDied","Data":"6ef512e660f4bd10cf8d26ee763e1f9a04f227b42d11a27077b7d84646ae0766"}
Dec 03 15:38:06 crc kubenswrapper[4751]: I1203 15:38:06.778830 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djf67" event={"ID":"385620eb-744d-423e-b02b-1274f3075689","Type":"ContainerStarted","Data":"0ae872d6cd53f793136d9aaa39e7583cefe3ec9530b679289c44982fc3ba5af7"}
Dec 03 15:38:06 crc kubenswrapper[4751]: I1203 15:38:06.778868 4751 scope.go:117] "RemoveContainer" containerID="1a7d3374c595269f4a9d24e338e5e9eaf99cda12b8a6b0ee51fd39d900f10642"